Dec 06 06:46:21 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Dec 06 06:46:21 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 06 06:46:21 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 06 06:46:21 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 06 06:46:21 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 06 06:46:21 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 06 06:46:21 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 06 06:46:21 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Dec 06 06:46:21 localhost kernel: signal: max sigframe size: 1776
Dec 06 06:46:21 localhost kernel: BIOS-provided physical RAM map:
Dec 06 06:46:21 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 06 06:46:21 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 06 06:46:21 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 06 06:46:21 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 06 06:46:21 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 06 06:46:21 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 06 06:46:21 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 06 06:46:21 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Dec 06 06:46:21 localhost kernel: NX (Execute Disable) protection: active
Dec 06 06:46:21 localhost kernel: SMBIOS 2.8 present.
Dec 06 06:46:21 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 06 06:46:21 localhost kernel: Hypervisor detected: KVM
Dec 06 06:46:21 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 06 06:46:21 localhost kernel: kvm-clock: using sched offset of 1955091986 cycles
Dec 06 06:46:21 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 06 06:46:21 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 06 06:46:21 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 06 06:46:21 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 06 06:46:21 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Dec 06 06:46:21 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 06 06:46:21 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 06 06:46:21 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 06 06:46:21 localhost kernel: Using GB pages for direct mapping
Dec 06 06:46:21 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Dec 06 06:46:21 localhost kernel: ACPI: Early table checksum verification disabled
Dec 06 06:46:21 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 06 06:46:21 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:21 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:21 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:21 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 06 06:46:21 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:21 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:21 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 06 06:46:21 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 06 06:46:21 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 06 06:46:21 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 06 06:46:21 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 06 06:46:21 localhost kernel: No NUMA configuration found
Dec 06 06:46:21 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Dec 06 06:46:21 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd3000-0x43fffdfff]
Dec 06 06:46:21 localhost kernel: Reserving 256MB of memory at 2608MB for crashkernel (System RAM: 16383MB)
Dec 06 06:46:21 localhost kernel: Zone ranges:
Dec 06 06:46:21 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 06 06:46:21 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 06 06:46:21 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Dec 06 06:46:21 localhost kernel:   Device   empty
Dec 06 06:46:21 localhost kernel: Movable zone start for each node
Dec 06 06:46:21 localhost kernel: Early memory node ranges
Dec 06 06:46:21 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 06 06:46:21 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 06 06:46:21 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Dec 06 06:46:21 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Dec 06 06:46:21 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 06 06:46:21 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 06 06:46:21 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 06 06:46:21 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 06 06:46:21 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 06 06:46:21 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 06 06:46:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 06 06:46:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 06 06:46:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 06 06:46:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 06 06:46:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 06 06:46:21 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 06 06:46:21 localhost kernel: TSC deadline timer available
Dec 06 06:46:21 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 06 06:46:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 06 06:46:21 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 06 06:46:21 localhost kernel: Booting paravirtualized kernel on KVM
Dec 06 06:46:21 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 06 06:46:21 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 06 06:46:21 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Dec 06 06:46:21 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Dec 06 06:46:21 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 06 06:46:21 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 06 06:46:21 localhost kernel: Fallback order for Node 0: 0 
Dec 06 06:46:21 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Dec 06 06:46:21 localhost kernel: Policy zone: Normal
Dec 06 06:46:21 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 06 06:46:21 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Dec 06 06:46:21 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 06 06:46:21 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 06 06:46:21 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 06 06:46:21 localhost kernel: software IO TLB: area num 8.
Dec 06 06:46:21 localhost kernel: Memory: 2826284K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741268K reserved, 0K cma-reserved)
Dec 06 06:46:21 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Dec 06 06:46:21 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 06 06:46:21 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Dec 06 06:46:21 localhost kernel: ftrace: allocated 176 pages with 3 groups
Dec 06 06:46:21 localhost kernel: Dynamic Preempt: voluntary
Dec 06 06:46:21 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 06 06:46:21 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 06 06:46:21 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 06 06:46:21 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 06 06:46:21 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 06 06:46:21 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 06 06:46:21 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 06 06:46:21 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 06 06:46:21 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 06 06:46:21 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 06 06:46:21 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Dec 06 06:46:21 localhost kernel: Console: colour VGA+ 80x25
Dec 06 06:46:21 localhost kernel: printk: console [tty0] enabled
Dec 06 06:46:21 localhost kernel: printk: console [ttyS0] enabled
Dec 06 06:46:21 localhost kernel: ACPI: Core revision 20211217
Dec 06 06:46:21 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 06 06:46:21 localhost kernel: x2apic enabled
Dec 06 06:46:21 localhost kernel: Switched APIC routing to physical x2apic.
Dec 06 06:46:21 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 06 06:46:21 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 06 06:46:21 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 06 06:46:21 localhost kernel: LSM: Security Framework initializing
Dec 06 06:46:21 localhost kernel: Yama: becoming mindful.
Dec 06 06:46:21 localhost kernel: SELinux:  Initializing.
Dec 06 06:46:21 localhost kernel: LSM support for eBPF active
Dec 06 06:46:21 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 06 06:46:21 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 06 06:46:21 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 06 06:46:21 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 06 06:46:21 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 06 06:46:21 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 06 06:46:21 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 06 06:46:21 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Dec 06 06:46:21 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Dec 06 06:46:21 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 06 06:46:21 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 06 06:46:21 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 06 06:46:21 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 06 06:46:21 localhost kernel: Freeing SMP alternatives memory: 36K
Dec 06 06:46:21 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 06 06:46:21 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Dec 06 06:46:21 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 06 06:46:21 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 06 06:46:21 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 06 06:46:21 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 06 06:46:21 localhost kernel: ... version:                0
Dec 06 06:46:21 localhost kernel: ... bit width:              48
Dec 06 06:46:21 localhost kernel: ... generic registers:      6
Dec 06 06:46:21 localhost kernel: ... value mask:             0000ffffffffffff
Dec 06 06:46:21 localhost kernel: ... max period:             00007fffffffffff
Dec 06 06:46:21 localhost kernel: ... fixed-purpose events:   0
Dec 06 06:46:21 localhost kernel: ... event mask:             000000000000003f
Dec 06 06:46:21 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 06 06:46:21 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 06 06:46:21 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 06 06:46:21 localhost kernel: x86: Booting SMP configuration:
Dec 06 06:46:21 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 06 06:46:21 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 06 06:46:21 localhost kernel: smpboot: Max logical packages: 8
Dec 06 06:46:21 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 06 06:46:21 localhost kernel: node 0 deferred pages initialised in 24ms
Dec 06 06:46:21 localhost kernel: devtmpfs: initialized
Dec 06 06:46:21 localhost kernel: x86/mm: Memory block size: 128MB
Dec 06 06:46:21 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 06 06:46:21 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 06 06:46:21 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 06 06:46:21 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 06 06:46:21 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 06 06:46:21 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 06 06:46:21 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 06 06:46:21 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 06 06:46:21 localhost kernel: audit: type=2000 audit(1765003580.150:1): state=initialized audit_enabled=0 res=1
Dec 06 06:46:21 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 06 06:46:21 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 06 06:46:21 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 06 06:46:21 localhost kernel: cpuidle: using governor menu
Dec 06 06:46:21 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Dec 06 06:46:21 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 06 06:46:21 localhost kernel: PCI: Using configuration type 1 for base access
Dec 06 06:46:21 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 06 06:46:21 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 06 06:46:21 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Dec 06 06:46:21 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Dec 06 06:46:21 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Dec 06 06:46:21 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 06 06:46:21 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 06 06:46:21 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 06 06:46:21 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 06 06:46:21 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 06 06:46:21 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Dec 06 06:46:21 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Dec 06 06:46:21 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Dec 06 06:46:21 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 06 06:46:21 localhost kernel: ACPI: Interpreter enabled
Dec 06 06:46:21 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 06 06:46:21 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 06 06:46:21 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 06 06:46:21 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 06 06:46:21 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 06 06:46:21 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 06 06:46:21 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [3] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [4] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [5] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [6] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [7] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [8] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [9] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [10] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [11] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [12] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [13] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [14] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [15] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [16] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [17] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [18] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [19] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [20] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [21] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [22] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [23] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [24] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [25] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [26] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [27] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [28] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [29] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [30] registered
Dec 06 06:46:21 localhost kernel: acpiphp: Slot [31] registered
Dec 06 06:46:21 localhost kernel: PCI host bridge to bus 0000:00
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 06 06:46:21 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 06 06:46:21 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Dec 06 06:46:21 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Dec 06 06:46:21 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Dec 06 06:46:21 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 06 06:46:21 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Dec 06 06:46:21 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Dec 06 06:46:21 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Dec 06 06:46:21 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Dec 06 06:46:21 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 06 06:46:21 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Dec 06 06:46:21 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Dec 06 06:46:21 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 06 06:46:21 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Dec 06 06:46:21 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Dec 06 06:46:21 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 06 06:46:21 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 06 06:46:21 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 06 06:46:21 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 06 06:46:21 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 06 06:46:21 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 06 06:46:21 localhost kernel: iommu: Default domain type: Translated 
Dec 06 06:46:21 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Dec 06 06:46:21 localhost kernel: SCSI subsystem initialized
Dec 06 06:46:21 localhost kernel: ACPI: bus type USB registered
Dec 06 06:46:21 localhost kernel: usbcore: registered new interface driver usbfs
Dec 06 06:46:21 localhost kernel: usbcore: registered new interface driver hub
Dec 06 06:46:21 localhost kernel: usbcore: registered new device driver usb
Dec 06 06:46:21 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 06 06:46:21 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 06 06:46:21 localhost kernel: PTP clock support registered
Dec 06 06:46:21 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 06 06:46:21 localhost kernel: NetLabel: Initializing
Dec 06 06:46:21 localhost kernel: NetLabel:  domain hash size = 128
Dec 06 06:46:21 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 06 06:46:21 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 06 06:46:21 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 06 06:46:21 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 06 06:46:21 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 06 06:46:21 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 06 06:46:21 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 06 06:46:21 localhost kernel: vgaarb: loaded
Dec 06 06:46:21 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 06 06:46:21 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 06 06:46:21 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 06 06:46:21 localhost kernel: pnp: PnP ACPI init
Dec 06 06:46:21 localhost kernel: pnp 00:03: [dma 2]
Dec 06 06:46:21 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 06 06:46:21 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 06 06:46:21 localhost kernel: NET: Registered PF_INET protocol family
Dec 06 06:46:21 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 06 06:46:21 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Dec 06 06:46:21 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 06 06:46:21 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 06 06:46:21 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 06 06:46:21 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Dec 06 06:46:21 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Dec 06 06:46:21 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 06 06:46:21 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 06 06:46:21 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 06 06:46:21 localhost kernel: NET: Registered PF_XDP protocol family
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 06 06:46:21 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 06 06:46:21 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 06 06:46:21 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 06 06:46:21 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 27110 usecs
Dec 06 06:46:21 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 06 06:46:21 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 06 06:46:21 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 06 06:46:21 localhost kernel: software IO TLB: mapped [mem 0x00000000bbfdb000-0x00000000bffdb000] (64MB)
Dec 06 06:46:21 localhost kernel: ACPI: bus type thunderbolt registered
Dec 06 06:46:21 localhost kernel: Initialise system trusted keyrings
Dec 06 06:46:21 localhost kernel: Key type blacklist registered
Dec 06 06:46:21 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Dec 06 06:46:21 localhost kernel: zbud: loaded
Dec 06 06:46:21 localhost kernel: integrity: Platform Keyring initialized
Dec 06 06:46:21 localhost kernel: NET: Registered PF_ALG protocol family
Dec 06 06:46:21 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 06 06:46:21 localhost kernel: Key type asymmetric registered
Dec 06 06:46:21 localhost kernel: Asymmetric key parser 'x509' registered
Dec 06 06:46:21 localhost kernel: Running certificate verification selftests
Dec 06 06:46:21 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 06 06:46:21 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 06 06:46:21 localhost kernel: io scheduler mq-deadline registered
Dec 06 06:46:21 localhost kernel: io scheduler kyber registered
Dec 06 06:46:21 localhost kernel: io scheduler bfq registered
Dec 06 06:46:21 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 06 06:46:21 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 06 06:46:21 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 06 06:46:21 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 06 06:46:21 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 06 06:46:21 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 06 06:46:21 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 06 06:46:21 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 06 06:46:21 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 06 06:46:21 localhost kernel: Non-volatile memory driver v1.3
Dec 06 06:46:21 localhost kernel: rdac: device handler registered
Dec 06 06:46:21 localhost kernel: hp_sw: device handler registered
Dec 06 06:46:21 localhost kernel: emc: device handler registered
Dec 06 06:46:21 localhost kernel: alua: device handler registered
Dec 06 06:46:21 localhost kernel: libphy: Fixed MDIO Bus: probed
Dec 06 06:46:21 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Dec 06 06:46:21 localhost kernel: ehci-pci: EHCI PCI platform driver
Dec 06 06:46:21 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Dec 06 06:46:21 localhost kernel: ohci-pci: OHCI PCI platform driver
Dec 06 06:46:21 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Dec 06 06:46:21 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 06 06:46:21 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 06 06:46:21 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 06 06:46:21 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 06 06:46:21 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 06 06:46:21 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 06 06:46:21 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 06 06:46:21 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Dec 06 06:46:21 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 06 06:46:21 localhost kernel: hub 1-0:1.0: USB hub found
Dec 06 06:46:21 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 06 06:46:21 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 06 06:46:21 localhost kernel: usbserial: USB Serial support registered for generic
Dec 06 06:46:21 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 06 06:46:21 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 06 06:46:21 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 06 06:46:21 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 06 06:46:21 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 06 06:46:21 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 06 06:46:21 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 06 06:46:21 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-06T06:46:20 UTC (1765003580)
Dec 06 06:46:21 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 06 06:46:21 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 06 06:46:21 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 06 06:46:21 localhost kernel: usbcore: registered new interface driver usbhid
Dec 06 06:46:21 localhost kernel: usbhid: USB HID core driver
Dec 06 06:46:21 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 06 06:46:21 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 06 06:46:21 localhost kernel: Initializing XFRM netlink socket
Dec 06 06:46:21 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 06 06:46:21 localhost kernel: Segment Routing with IPv6
Dec 06 06:46:21 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 06 06:46:21 localhost kernel: mpls_gso: MPLS GSO support
Dec 06 06:46:21 localhost kernel: IPI shorthand broadcast: enabled
Dec 06 06:46:21 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 06 06:46:21 localhost kernel: AES CTR mode by8 optimization enabled
Dec 06 06:46:21 localhost kernel: sched_clock: Marking stable (718068304, 174151152)->(1020533805, -128314349)
Dec 06 06:46:21 localhost kernel: registered taskstats version 1
Dec 06 06:46:21 localhost kernel: Loading compiled-in X.509 certificates
Dec 06 06:46:21 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 06 06:46:21 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 06 06:46:21 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 06 06:46:21 localhost kernel: zswap: loaded using pool lzo/zbud
Dec 06 06:46:21 localhost kernel: page_owner is disabled
Dec 06 06:46:21 localhost kernel: Key type big_key registered
Dec 06 06:46:21 localhost kernel: Freeing initrd memory: 74232K
Dec 06 06:46:21 localhost kernel: Key type encrypted registered
Dec 06 06:46:21 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 06 06:46:21 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 06 06:46:21 localhost kernel: Loading compiled-in module X.509 certificates
Dec 06 06:46:21 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 06 06:46:21 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 06 06:46:21 localhost kernel: ima: No architecture policies found
Dec 06 06:46:21 localhost kernel: evm: Initialising EVM extended attributes:
Dec 06 06:46:21 localhost kernel: evm: security.selinux
Dec 06 06:46:21 localhost kernel: evm: security.SMACK64 (disabled)
Dec 06 06:46:21 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 06 06:46:21 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 06 06:46:21 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 06 06:46:21 localhost kernel: evm: security.apparmor (disabled)
Dec 06 06:46:21 localhost kernel: evm: security.ima
Dec 06 06:46:21 localhost kernel: evm: security.capability
Dec 06 06:46:21 localhost kernel: evm: HMAC attrs: 0x1
Dec 06 06:46:21 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 06 06:46:21 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 06 06:46:21 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 06 06:46:21 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 06 06:46:21 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 06 06:46:21 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 06 06:46:21 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 06 06:46:21 localhost kernel: Freeing unused decrypted memory: 2036K
Dec 06 06:46:21 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Dec 06 06:46:21 localhost kernel: Write protecting the kernel read-only data: 26624k
Dec 06 06:46:21 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Dec 06 06:46:21 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Dec 06 06:46:21 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 06 06:46:21 localhost kernel: Run /init as init process
Dec 06 06:46:21 localhost kernel:   with arguments:
Dec 06 06:46:21 localhost kernel:     /init
Dec 06 06:46:21 localhost kernel:   with environment:
Dec 06 06:46:21 localhost kernel:     HOME=/
Dec 06 06:46:21 localhost kernel:     TERM=linux
Dec 06 06:46:21 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Dec 06 06:46:21 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 06 06:46:21 localhost systemd[1]: Detected virtualization kvm.
Dec 06 06:46:21 localhost systemd[1]: Detected architecture x86-64.
Dec 06 06:46:21 localhost systemd[1]: Running in initrd.
Dec 06 06:46:21 localhost systemd[1]: No hostname configured, using default hostname.
Dec 06 06:46:21 localhost systemd[1]: Hostname set to <localhost>.
Dec 06 06:46:21 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 06 06:46:21 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 06 06:46:21 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 06 06:46:21 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 06 06:46:21 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 06 06:46:21 localhost systemd[1]: Reached target Local File Systems.
Dec 06 06:46:21 localhost systemd[1]: Reached target Path Units.
Dec 06 06:46:21 localhost systemd[1]: Reached target Slice Units.
Dec 06 06:46:21 localhost systemd[1]: Reached target Swaps.
Dec 06 06:46:21 localhost systemd[1]: Reached target Timer Units.
Dec 06 06:46:21 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 06 06:46:21 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 06 06:46:21 localhost systemd[1]: Listening on Journal Socket.
Dec 06 06:46:21 localhost systemd[1]: Listening on udev Control Socket.
Dec 06 06:46:21 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 06 06:46:21 localhost systemd[1]: Reached target Socket Units.
Dec 06 06:46:21 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 06 06:46:21 localhost systemd[1]: Starting Journal Service...
Dec 06 06:46:21 localhost systemd[1]: Starting Load Kernel Modules...
Dec 06 06:46:21 localhost systemd[1]: Starting Create System Users...
Dec 06 06:46:21 localhost systemd[1]: Starting Setup Virtual Console...
Dec 06 06:46:21 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 06 06:46:21 localhost systemd[1]: Finished Load Kernel Modules.
Dec 06 06:46:21 localhost systemd-journald[284]: Journal started
Dec 06 06:46:21 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/74aa0f2ebd78406da4f02263c03ef4c3) is 8.0M, max 314.7M, 306.7M free.
Dec 06 06:46:21 localhost systemd-modules-load[285]: Module 'msr' is built in
Dec 06 06:46:21 localhost systemd[1]: Started Journal Service.
Dec 06 06:46:21 localhost systemd[1]: Finished Setup Virtual Console.
Dec 06 06:46:21 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 06 06:46:21 localhost systemd[1]: Starting dracut cmdline hook...
Dec 06 06:46:21 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 06 06:46:21 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Dec 06 06:46:21 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Dec 06 06:46:21 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Dec 06 06:46:21 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 06 06:46:21 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 06 06:46:21 localhost systemd[1]: Finished Create System Users.
Dec 06 06:46:21 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 06 06:46:21 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 06 06:46:21 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Dec 06 06:46:21 localhost dracut-cmdline[289]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 06 06:46:21 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 06 06:46:21 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 06 06:46:21 localhost systemd[1]: Finished dracut cmdline hook.
Dec 06 06:46:21 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 06 06:46:21 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 06 06:46:21 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 06 06:46:21 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Dec 06 06:46:21 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 06 06:46:21 localhost kernel: RPC: Registered udp transport module.
Dec 06 06:46:21 localhost kernel: RPC: Registered tcp transport module.
Dec 06 06:46:21 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 06 06:46:21 localhost rpc.statd[407]: Version 2.5.4 starting
Dec 06 06:46:21 localhost rpc.statd[407]: Initializing NSM state
Dec 06 06:46:21 localhost rpc.idmapd[412]: Setting log level to 0
Dec 06 06:46:21 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 06 06:46:21 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 06 06:46:21 localhost systemd-udevd[425]: Using default interface naming scheme 'rhel-9.0'.
Dec 06 06:46:21 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 06 06:46:21 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 06 06:46:21 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 06 06:46:21 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 06 06:46:21 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 06 06:46:21 localhost systemd[1]: Reached target System Initialization.
Dec 06 06:46:21 localhost systemd[1]: Reached target Basic System.
Dec 06 06:46:21 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 06 06:46:21 localhost systemd[1]: Reached target Network.
Dec 06 06:46:21 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 06 06:46:21 localhost systemd[1]: Starting dracut initqueue hook...
Dec 06 06:46:21 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Dec 06 06:46:21 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 06 06:46:21 localhost kernel: GPT:20971519 != 838860799
Dec 06 06:46:21 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 06 06:46:21 localhost kernel: GPT:20971519 != 838860799
Dec 06 06:46:21 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 06 06:46:21 localhost kernel:  vda: vda1 vda2 vda3 vda4
Dec 06 06:46:21 localhost kernel: libata version 3.00 loaded.
Dec 06 06:46:21 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 06 06:46:21 localhost systemd-udevd[452]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 06:46:21 localhost kernel: scsi host0: ata_piix
Dec 06 06:46:21 localhost kernel: scsi host1: ata_piix
Dec 06 06:46:21 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Dec 06 06:46:21 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Dec 06 06:46:21 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 06 06:46:21 localhost systemd[1]: Reached target Initrd Root Device.
Dec 06 06:46:22 localhost kernel: ata1: found unknown device (class 0)
Dec 06 06:46:22 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 06 06:46:22 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 06 06:46:22 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 06 06:46:22 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 06 06:46:22 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 06 06:46:22 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 06 06:46:22 localhost systemd[1]: Finished dracut initqueue hook.
Dec 06 06:46:22 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 06 06:46:22 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 06 06:46:22 localhost systemd[1]: Reached target Remote File Systems.
Dec 06 06:46:22 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 06 06:46:22 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 06 06:46:22 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Dec 06 06:46:22 localhost systemd-fsck[511]: /usr/sbin/fsck.xfs: XFS file system.
Dec 06 06:46:22 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 06 06:46:22 localhost systemd[1]: Mounting /sysroot...
Dec 06 06:46:22 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 06 06:46:22 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Dec 06 06:46:22 localhost kernel: XFS (vda4): Ending clean mount
Dec 06 06:46:22 localhost systemd[1]: Mounted /sysroot.
Dec 06 06:46:22 localhost systemd[1]: Reached target Initrd Root File System.
Dec 06 06:46:22 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 06 06:46:22 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 06 06:46:22 localhost systemd[1]: Reached target Initrd File Systems.
Dec 06 06:46:22 localhost systemd[1]: Reached target Initrd Default Target.
Dec 06 06:46:22 localhost systemd[1]: Starting dracut mount hook...
Dec 06 06:46:22 localhost systemd[1]: Finished dracut mount hook.
Dec 06 06:46:22 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 06 06:46:22 localhost rpc.idmapd[412]: exiting on signal 15
Dec 06 06:46:22 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 06 06:46:22 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 06 06:46:22 localhost systemd[1]: Stopped target Network.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Timer Units.
Dec 06 06:46:22 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 06 06:46:22 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Basic System.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Path Units.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Remote File Systems.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Slice Units.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Socket Units.
Dec 06 06:46:22 localhost systemd[1]: Stopped target System Initialization.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Local File Systems.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Swaps.
Dec 06 06:46:22 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped dracut mount hook.
Dec 06 06:46:22 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 06 06:46:22 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 06 06:46:22 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 06 06:46:22 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 06 06:46:22 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 06 06:46:22 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 06 06:46:22 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 06 06:46:22 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 06 06:46:22 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 06 06:46:22 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 06 06:46:22 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 06 06:46:22 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 06 06:46:22 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Closed udev Control Socket.
Dec 06 06:46:22 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Closed udev Kernel Socket.
Dec 06 06:46:22 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 06 06:46:22 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 06 06:46:22 localhost systemd[1]: Starting Cleanup udev Database...
Dec 06 06:46:22 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 06 06:46:22 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 06 06:46:22 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Stopped Create System Users.
Dec 06 06:46:22 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 06 06:46:22 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 06 06:46:22 localhost systemd[1]: Finished Cleanup udev Database.
Dec 06 06:46:22 localhost systemd[1]: Reached target Switch Root.
Dec 06 06:46:22 localhost systemd[1]: Starting Switch Root...
Dec 06 06:46:22 localhost systemd[1]: Switching root.
Dec 06 06:46:22 localhost systemd-journald[284]: Journal stopped
Dec 06 06:46:23 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Dec 06 06:46:23 localhost kernel: audit: type=1404 audit(1765003582.957:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 06 06:46:23 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 06:46:23 localhost kernel: SELinux:  policy capability open_perms=1
Dec 06 06:46:23 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 06:46:23 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 06 06:46:23 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 06:46:23 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 06:46:23 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 06:46:23 localhost kernel: audit: type=1403 audit(1765003583.059:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 06 06:46:23 localhost systemd[1]: Successfully loaded SELinux policy in 104.437ms.
Dec 06 06:46:23 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.617ms.
Dec 06 06:46:23 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 06 06:46:23 localhost systemd[1]: Detected virtualization kvm.
Dec 06 06:46:23 localhost systemd[1]: Detected architecture x86-64.
Dec 06 06:46:23 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 06:46:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 06:46:23 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 06 06:46:23 localhost systemd[1]: Stopped Switch Root.
Dec 06 06:46:23 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 06 06:46:23 localhost systemd[1]: Created slice Slice /system/getty.
Dec 06 06:46:23 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 06 06:46:23 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 06 06:46:23 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 06 06:46:23 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Dec 06 06:46:23 localhost systemd[1]: Created slice User and Session Slice.
Dec 06 06:46:23 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 06 06:46:23 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 06 06:46:23 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 06 06:46:23 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 06 06:46:23 localhost systemd[1]: Stopped target Switch Root.
Dec 06 06:46:23 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 06 06:46:23 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 06 06:46:23 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 06 06:46:23 localhost systemd[1]: Reached target Path Units.
Dec 06 06:46:23 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 06 06:46:23 localhost systemd[1]: Reached target Slice Units.
Dec 06 06:46:23 localhost systemd[1]: Reached target Swaps.
Dec 06 06:46:23 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 06 06:46:23 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 06 06:46:23 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 06 06:46:23 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 06 06:46:23 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 06 06:46:23 localhost systemd[1]: Listening on udev Control Socket.
Dec 06 06:46:23 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 06 06:46:23 localhost systemd[1]: Mounting Huge Pages File System...
Dec 06 06:46:23 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 06 06:46:23 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 06 06:46:23 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 06 06:46:23 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 06 06:46:23 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 06 06:46:23 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 06 06:46:23 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 06 06:46:23 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 06 06:46:23 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 06 06:46:23 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 06 06:46:23 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 06 06:46:23 localhost systemd[1]: Stopped Journal Service.
Dec 06 06:46:23 localhost systemd[1]: Starting Journal Service...
Dec 06 06:46:23 localhost systemd[1]: Starting Load Kernel Modules...
Dec 06 06:46:23 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 06 06:46:23 localhost kernel: fuse: init (API version 7.36)
Dec 06 06:46:23 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 06 06:46:23 localhost systemd-journald[618]: Journal started
Dec 06 06:46:23 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 8.0M, max 314.7M, 306.7M free.
Dec 06 06:46:23 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 06 06:46:23 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 06 06:46:23 localhost systemd-modules-load[619]: Module 'msr' is built in
Dec 06 06:46:23 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 06 06:46:23 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 06 06:46:23 localhost systemd[1]: Started Journal Service.
Dec 06 06:46:23 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 06 06:46:23 localhost systemd[1]: Mounted Huge Pages File System.
Dec 06 06:46:23 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 06 06:46:23 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 06 06:46:23 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 06 06:46:23 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 06 06:46:23 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 06 06:46:23 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 06 06:46:23 localhost kernel: ACPI: bus type drm_connector registered
Dec 06 06:46:23 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 06 06:46:23 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 06 06:46:23 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 06 06:46:23 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 06 06:46:23 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 06 06:46:23 localhost systemd[1]: Finished Load Kernel Modules.
Dec 06 06:46:23 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 06 06:46:23 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 06 06:46:23 localhost systemd[1]: Mounting FUSE Control File System...
Dec 06 06:46:23 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 06 06:46:23 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 06 06:46:23 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 06 06:46:23 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 06 06:46:23 localhost systemd[1]: Starting Load/Save Random Seed...
Dec 06 06:46:23 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 06 06:46:23 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 8.0M, max 314.7M, 306.7M free.
Dec 06 06:46:23 localhost systemd-journald[618]: Received client request to flush runtime journal.
Dec 06 06:46:23 localhost systemd[1]: Starting Create System Users...
Dec 06 06:46:23 localhost systemd[1]: Mounted FUSE Control File System.
Dec 06 06:46:23 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 06 06:46:23 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 06 06:46:23 localhost systemd[1]: Finished Load/Save Random Seed.
Dec 06 06:46:23 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 06 06:46:23 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 06 06:46:23 localhost systemd-sysusers[631]: Creating group 'sgx' with GID 989.
Dec 06 06:46:23 localhost systemd-sysusers[631]: Creating group 'systemd-oom' with GID 988.
Dec 06 06:46:23 localhost systemd-sysusers[631]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Dec 06 06:46:23 localhost systemd[1]: Finished Create System Users.
Dec 06 06:46:23 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 06 06:46:23 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 06 06:46:23 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 06 06:46:23 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 06 06:46:23 localhost systemd[1]: Set up automount EFI System Partition Automount.
Dec 06 06:46:24 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 06 06:46:24 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 06 06:46:24 localhost systemd-udevd[636]: Using default interface naming scheme 'rhel-9.0'.
Dec 06 06:46:24 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 06 06:46:24 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 06 06:46:24 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 06 06:46:24 localhost systemd-udevd[638]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 06:46:24 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 06 06:46:24 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Dec 06 06:46:24 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Dec 06 06:46:24 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Dec 06 06:46:24 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 06 06:46:24 localhost systemd-fsck[680]: fsck.fat 4.2 (2021-01-31)
Dec 06 06:46:24 localhost systemd-fsck[680]: /dev/vda2: 12 files, 1782/51145 clusters
Dec 06 06:46:24 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Dec 06 06:46:24 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 06 06:46:24 localhost kernel: SVM: TSC scaling supported
Dec 06 06:46:24 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 06 06:46:24 localhost kernel: kvm: Nested Virtualization enabled
Dec 06 06:46:24 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 06 06:46:24 localhost kernel: SVM: kvm: Nested Paging enabled
Dec 06 06:46:24 localhost kernel: SVM: LBR virtualization supported
Dec 06 06:46:24 localhost kernel: Console: switching to colour dummy device 80x25
Dec 06 06:46:24 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 06 06:46:24 localhost kernel: [drm] features: -context_init
Dec 06 06:46:24 localhost kernel: [drm] number of scanouts: 1
Dec 06 06:46:24 localhost kernel: [drm] number of cap sets: 0
Dec 06 06:46:24 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Dec 06 06:46:24 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Dec 06 06:46:24 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 06 06:46:24 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 06 06:46:24 localhost systemd[1]: Mounting /boot...
Dec 06 06:46:24 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Dec 06 06:46:24 localhost kernel: XFS (vda3): Ending clean mount
Dec 06 06:46:24 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Dec 06 06:46:24 localhost systemd[1]: Mounted /boot.
Dec 06 06:46:24 localhost systemd[1]: Mounting /boot/efi...
Dec 06 06:46:24 localhost systemd[1]: Mounted /boot/efi.
Dec 06 06:46:24 localhost systemd[1]: Reached target Local File Systems.
Dec 06 06:46:24 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 06 06:46:24 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 06 06:46:24 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 06 06:46:24 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 06:46:24 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 06 06:46:24 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 06 06:46:24 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 06 06:46:24 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 718 (bootctl)
Dec 06 06:46:24 localhost systemd[1]: Starting File System Check on /dev/vda2...
Dec 06 06:46:24 localhost systemd[1]: Finished File System Check on /dev/vda2.
Dec 06 06:46:24 localhost systemd[1]: Mounting EFI System Partition Automount...
Dec 06 06:46:24 localhost systemd[1]: Mounted EFI System Partition Automount.
Dec 06 06:46:24 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 06 06:46:24 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 06 06:46:24 localhost systemd[1]: Starting Security Auditing Service...
Dec 06 06:46:24 localhost systemd[1]: Starting RPC Bind...
Dec 06 06:46:24 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 06 06:46:24 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 06 06:46:24 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 06 06:46:24 localhost systemd[1]: Starting Update is Completed...
Dec 06 06:46:24 localhost auditd[728]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Dec 06 06:46:24 localhost auditd[728]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Dec 06 06:46:24 localhost systemd[1]: Started RPC Bind.
Dec 06 06:46:24 localhost systemd[1]: Finished Update is Completed.
Dec 06 06:46:24 localhost augenrules[733]: /sbin/augenrules: No change
Dec 06 06:46:24 localhost augenrules[743]: No rules
Dec 06 06:46:24 localhost augenrules[743]: enabled 1
Dec 06 06:46:24 localhost augenrules[743]: failure 1
Dec 06 06:46:24 localhost augenrules[743]: pid 728
Dec 06 06:46:24 localhost augenrules[743]: rate_limit 0
Dec 06 06:46:24 localhost augenrules[743]: backlog_limit 8192
Dec 06 06:46:24 localhost augenrules[743]: lost 0
Dec 06 06:46:24 localhost augenrules[743]: backlog 2
Dec 06 06:46:24 localhost augenrules[743]: backlog_wait_time 60000
Dec 06 06:46:24 localhost augenrules[743]: backlog_wait_time_actual 0
Dec 06 06:46:24 localhost augenrules[743]: enabled 1
Dec 06 06:46:24 localhost augenrules[743]: failure 1
Dec 06 06:46:24 localhost augenrules[743]: pid 728
Dec 06 06:46:24 localhost augenrules[743]: rate_limit 0
Dec 06 06:46:24 localhost augenrules[743]: backlog_limit 8192
Dec 06 06:46:24 localhost augenrules[743]: lost 0
Dec 06 06:46:24 localhost augenrules[743]: backlog 0
Dec 06 06:46:24 localhost augenrules[743]: backlog_wait_time 60000
Dec 06 06:46:24 localhost augenrules[743]: backlog_wait_time_actual 0
Dec 06 06:46:24 localhost augenrules[743]: enabled 1
Dec 06 06:46:24 localhost augenrules[743]: failure 1
Dec 06 06:46:24 localhost augenrules[743]: pid 728
Dec 06 06:46:24 localhost augenrules[743]: rate_limit 0
Dec 06 06:46:24 localhost augenrules[743]: backlog_limit 8192
Dec 06 06:46:24 localhost augenrules[743]: lost 0
Dec 06 06:46:24 localhost augenrules[743]: backlog 2
Dec 06 06:46:24 localhost augenrules[743]: backlog_wait_time 60000
Dec 06 06:46:24 localhost augenrules[743]: backlog_wait_time_actual 0
Dec 06 06:46:24 localhost systemd[1]: Started Security Auditing Service.
Dec 06 06:46:24 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 06 06:46:24 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 06 06:46:24 localhost systemd[1]: Reached target System Initialization.
Dec 06 06:46:24 localhost systemd[1]: Started dnf makecache --timer.
Dec 06 06:46:24 localhost systemd[1]: Started Daily rotation of log files.
Dec 06 06:46:24 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 06 06:46:24 localhost systemd[1]: Reached target Timer Units.
Dec 06 06:46:24 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 06 06:46:24 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 06 06:46:24 localhost systemd[1]: Reached target Socket Units.
Dec 06 06:46:24 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Dec 06 06:46:24 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 06 06:46:24 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 06:46:24 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 06 06:46:24 localhost systemd[1]: Reached target Basic System.
Dec 06 06:46:24 localhost systemd[1]: Starting NTP client/server...
Dec 06 06:46:24 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 06 06:46:24 localhost systemd[1]: Started irqbalance daemon.
Dec 06 06:46:24 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 06 06:46:24 localhost dbus-broker-lau[752]: Ready
Dec 06 06:46:24 localhost systemd[1]: Starting System Logging Service...
Dec 06 06:46:24 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 06:46:24 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 06:46:24 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 06:46:24 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 06 06:46:24 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 06 06:46:24 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 06 06:46:24 localhost rsyslogd[760]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="760" x-info="https://www.rsyslog.com"] start
Dec 06 06:46:24 localhost rsyslogd[760]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Dec 06 06:46:24 localhost systemd[1]: Starting User Login Management...
Dec 06 06:46:24 localhost systemd[1]: Started System Logging Service.
Dec 06 06:46:24 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 06 06:46:25 localhost chronyd[767]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 06:46:25 localhost chronyd[767]: Using right/UTC timezone to obtain leap second data
Dec 06 06:46:25 localhost chronyd[767]: Loaded seccomp filter (level 2)
Dec 06 06:46:25 localhost systemd[1]: Started NTP client/server.
Dec 06 06:46:25 localhost systemd-logind[765]: New seat seat0.
Dec 06 06:46:25 localhost systemd-logind[765]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 06 06:46:25 localhost systemd-logind[765]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 06 06:46:25 localhost systemd[1]: Started User Login Management.
Dec 06 06:46:25 localhost rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 06:46:25 localhost cloud-init[771]: Cloud-init v. 22.1-9.el9 running 'init-local' at Sat, 06 Dec 2025 06:46:25 +0000. Up 5.48 seconds.
Dec 06 06:46:25 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 06 06:46:25 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 06 06:46:25 localhost systemd[1]: Starting Hostname Service...
Dec 06 06:46:25 localhost systemd[1]: Started Hostname Service.
Dec 06 06:46:25 np0005548788.novalocal systemd-hostnamed[785]: Hostname set to <np0005548788.novalocal> (static)
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Reached target Preparation for Network.
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Starting Network Manager...
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8070] NetworkManager (version 1.42.2-1.el9) is starting... (boot:a7a8faba-4bf4-4554-a65c-09a9226535fb)
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8077] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Started Network Manager.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8106] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Reached target Network.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8162] manager[0x55b6d1c83020]: monitoring kernel firmware directory '/lib/firmware'.
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8214] hostname: hostname: using hostnamed
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8215] hostname: static hostname changed from (none) to "np0005548788.novalocal"
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8222] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8374] manager[0x55b6d1c83020]: rfkill: Wi-Fi hardware radio set enabled
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8375] manager[0x55b6d1c83020]: rfkill: WWAN hardware radio set enabled
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8442] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8443] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8447] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8448] manager: Networking is enabled by state file
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8466] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8466] settings: Loaded settings plugin: keyfile (internal)
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8503] dhcp: init: Using DHCP client 'internal'
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8508] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8529] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8538] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8551] device (lo): Activation: starting connection 'lo' (ba27b317-3333-4514-8ab8-e5b47b3963f3)
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8565] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8571] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Reached target NFS client services.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8619] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8626] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8629] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8632] device (eth0): carrier: link connected
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8634] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8640] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8649] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8657] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8660] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8664] manager: NetworkManager state is now CONNECTING
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8666] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Reached target Remote File Systems.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8680] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8686] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8800] dhcp4 (eth0): state changed new lease, address=38.102.83.97
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8807] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8832] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8841] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8850] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8859] device (lo): Activation: successful, device activated.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8868] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8872] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8877] manager: NetworkManager state is now CONNECTED_SITE
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8882] device (eth0): Activation: successful, device activated.
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8890] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 06 06:46:25 np0005548788.novalocal NetworkManager[790]: <info>  [1765003585.8896] manager: startup complete
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 06 06:46:25 np0005548788.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: Cloud-init v. 22.1-9.el9 running 'init' at Sat, 06 Dec 2025 06:46:26 +0000. Up 6.29 seconds.
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |  eth0  | True |         38.102.83.97         | 255.255.255.0 | global | fa:16:3e:86:7b:58 |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |  eth0  | True | fe80::f816:3eff:fe86:7b58/64 |       .       |  link  | fa:16:3e:86:7b:58 |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 06 06:46:26 np0005548788.novalocal cloud-init[972]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 06 06:46:26 np0005548788.novalocal systemd[1]: Starting Authorization Manager...
Dec 06 06:46:26 np0005548788.novalocal polkitd[1037]: Started polkitd version 0.117
Dec 06 06:46:26 np0005548788.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 06:46:26 np0005548788.novalocal polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 06:46:26 np0005548788.novalocal polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 06:46:26 np0005548788.novalocal polkitd[1037]: Finished loading, compiling and executing 4 rules
Dec 06 06:46:26 np0005548788.novalocal systemd[1]: Started Authorization Manager.
Dec 06 06:46:26 np0005548788.novalocal polkitd[1037]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 06 06:46:27 np0005548788.novalocal useradd[1116]: new group: name=cloud-user, GID=1001
Dec 06 06:46:27 np0005548788.novalocal useradd[1116]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 06 06:46:27 np0005548788.novalocal useradd[1116]: add 'cloud-user' to group 'adm'
Dec 06 06:46:27 np0005548788.novalocal useradd[1116]: add 'cloud-user' to group 'systemd-journal'
Dec 06 06:46:27 np0005548788.novalocal useradd[1116]: add 'cloud-user' to shadow group 'adm'
Dec 06 06:46:27 np0005548788.novalocal useradd[1116]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Generating public/private rsa key pair.
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: The key fingerprint is:
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: SHA256:yAOwaWQF6sQs1hN2i25uqu0A2HD33ED7NOqTQuxvEaw root@np0005548788.novalocal
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: The key's randomart image is:
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: +---[RSA 3072]----+
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |  ==...          |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |o+o++...         |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |+==+ooo o        |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |*=.o.=oB .       |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |o..oo.B.S        |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |. ooE..o         |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |.  oo +.         |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: | oo  o..         |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |ooo  ..          |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: +----[SHA256]-----+
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Generating public/private ecdsa key pair.
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: The key fingerprint is:
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: SHA256:vewxbv/YFN5vLyagOr63XSyVjBySEiflfKQqYlh9ZiA root@np0005548788.novalocal
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: The key's randomart image is:
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: +---[ECDSA 256]---+
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |  E . o.o .      |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |   o . * +       |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |  . . = * o      |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: | o   + o = + .   |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |. o . . S + + .  |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: | . . .   ..+ . o |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |         .*.o o .|
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |      . o+.=.+o o|
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |     .+=.o+.o+oo+|
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: +----[SHA256]-----+
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Generating public/private ed25519 key pair.
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: The key fingerprint is:
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: SHA256:eiQetORtgwEQIF0y2sruKNTNeLE/6hEjeWOVDH/Tkc4 root@np0005548788.novalocal
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: The key's randomart image is:
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: +--[ED25519 256]--+
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |oo=+o.     ..    |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |.o.o .+ . ...    |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |. .   += oo.     |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |..  .=.=. .E     |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |...o+*O S        |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |.. o+*+* .       |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |..  ..+ .        |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |+     .+         |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: |o.  .o. .        |
Dec 06 06:46:28 np0005548788.novalocal cloud-init[972]: +----[SHA256]-----+
Dec 06 06:46:28 np0005548788.novalocal sm-notify[1129]: Version 2.5.4 starting
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Dec 06 06:46:28 np0005548788.novalocal sshd[1130]: Server listening on 0.0.0.0 port 22.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 06 06:46:28 np0005548788.novalocal sshd[1130]: Server listening on :: port 22.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Reached target Network is Online.
Dec 06 06:46:28 np0005548788.novalocal crond[1132]: (CRON) STARTUP (1.5.7)
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Dec 06 06:46:28 np0005548788.novalocal crond[1132]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 06 06:46:28 np0005548788.novalocal sshd[1130]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Dec 06 06:46:28 np0005548788.novalocal crond[1132]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 62% if used.)
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 06 06:46:28 np0005548788.novalocal crond[1132]: (CRON) INFO (running with inotify support)
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Starting Permit User Sessions...
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Finished Permit User Sessions.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Started Command Scheduler.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Started Getty on tty1.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Reached target Login Prompts.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Reached target Multi-User System.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 06 06:46:28 np0005548788.novalocal sshd[1133]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 06 06:46:28 np0005548788.novalocal sshd[1157]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal sshd[1157]: Unable to negotiate with 38.102.83.114 port 49790: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 06 06:46:28 np0005548788.novalocal sshd[1168]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal sshd[1168]: Connection reset by 38.102.83.114 port 49804 [preauth]
Dec 06 06:46:28 np0005548788.novalocal kdumpctl[1136]: kdump: No kdump initial ramdisk found.
Dec 06 06:46:28 np0005548788.novalocal kdumpctl[1136]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Dec 06 06:46:28 np0005548788.novalocal sshd[1187]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal sshd[1187]: Unable to negotiate with 38.102.83.114 port 49816: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 06 06:46:28 np0005548788.novalocal sshd[1196]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal sshd[1133]: Connection closed by 38.102.83.114 port 49776 [preauth]
Dec 06 06:46:28 np0005548788.novalocal sshd[1196]: Unable to negotiate with 38.102.83.114 port 49826: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 06 06:46:28 np0005548788.novalocal sshd[1214]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal sshd[1228]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal sshd[1249]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1255]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Sat, 06 Dec 2025 06:46:28 +0000. Up 8.60 seconds.
Dec 06 06:46:28 np0005548788.novalocal sshd[1249]: fatal: mm_answer_sign: sign: error in libcrypto
Dec 06 06:46:28 np0005548788.novalocal sshd[1266]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:28 np0005548788.novalocal sshd[1266]: Unable to negotiate with 38.102.83.114 port 49868: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 06 06:46:28 np0005548788.novalocal sshd[1214]: Connection closed by 38.102.83.114 port 49834 [preauth]
Dec 06 06:46:28 np0005548788.novalocal sshd[1228]: Connection closed by 38.102.83.114 port 49844 [preauth]
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Dec 06 06:46:28 np0005548788.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1430]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Sat, 06 Dec 2025 06:46:28 +0000. Up 8.96 seconds.
Dec 06 06:46:28 np0005548788.novalocal dracut[1435]: dracut-057-21.git20230214.el9
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1452]: #############################################################
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1453]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1455]: 256 SHA256:vewxbv/YFN5vLyagOr63XSyVjBySEiflfKQqYlh9ZiA root@np0005548788.novalocal (ECDSA)
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1457]: 256 SHA256:eiQetORtgwEQIF0y2sruKNTNeLE/6hEjeWOVDH/Tkc4 root@np0005548788.novalocal (ED25519)
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1460]: 3072 SHA256:yAOwaWQF6sQs1hN2i25uqu0A2HD33ED7NOqTQuxvEaw root@np0005548788.novalocal (RSA)
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1462]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 06 06:46:28 np0005548788.novalocal cloud-init[1464]: #############################################################
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Dec 06 06:46:29 np0005548788.novalocal cloud-init[1430]: Cloud-init v. 22.1-9.el9 finished at Sat, 06 Dec 2025 06:46:29 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.22 seconds
Dec 06 06:46:29 np0005548788.novalocal systemd[1]: Reloading Network Manager...
Dec 06 06:46:29 np0005548788.novalocal NetworkManager[790]: <info>  [1765003589.1836] audit: op="reload" arg="0" pid=1552 uid=0 result="success"
Dec 06 06:46:29 np0005548788.novalocal NetworkManager[790]: <info>  [1765003589.1843] config: signal: SIGHUP (no changes from disk)
Dec 06 06:46:29 np0005548788.novalocal systemd[1]: Reloaded Network Manager.
Dec 06 06:46:29 np0005548788.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Dec 06 06:46:29 np0005548788.novalocal systemd[1]: Reached target Cloud-init target.
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: memstrack is not available
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:29 np0005548788.novalocal dracut[1437]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: memstrack is not available
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: *** Including module: systemd ***
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: *** Including module: systemd-initrd ***
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: *** Including module: i18n ***
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: No KEYMAP configured.
Dec 06 06:46:30 np0005548788.novalocal dracut[1437]: *** Including module: drm ***
Dec 06 06:46:30 np0005548788.novalocal chronyd[767]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Dec 06 06:46:30 np0005548788.novalocal chronyd[767]: System clock TAI offset set to 37 seconds
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]: *** Including module: prefixdevname ***
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]: *** Including module: kernel-modules ***
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]: *** Including module: kernel-modules-extra ***
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]: *** Including module: qemu ***
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]: *** Including module: fstab-sys ***
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]: *** Including module: rootfs-block ***
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]: *** Including module: terminfo ***
Dec 06 06:46:31 np0005548788.novalocal dracut[1437]: *** Including module: udev-rules ***
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: Skipping udev rule: 91-permissions.rules
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: *** Including module: virtiofs ***
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: *** Including module: dracut-systemd ***
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: *** Including module: usrmount ***
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: *** Including module: base ***
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: *** Including module: fs-lib ***
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: *** Including module: kdumpbase ***
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]:   microcode_ctl module: mangling fw_dir
Dec 06 06:46:32 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]: *** Including module: shutdown ***
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]: *** Including module: squash ***
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]: *** Including modules done ***
Dec 06 06:46:33 np0005548788.novalocal dracut[1437]: *** Installing kernel module dependencies ***
Dec 06 06:46:34 np0005548788.novalocal dracut[1437]: *** Installing kernel module dependencies done ***
Dec 06 06:46:34 np0005548788.novalocal dracut[1437]: *** Resolving executable dependencies ***
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: *** Resolving executable dependencies done ***
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: *** Hardlinking files ***
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Mode:           real
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Files:          1099
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Linked:         3 files
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Compared:       0 xattrs
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Compared:       373 files
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Saved:          61.04 KiB
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Duration:       0.026310 seconds
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: *** Hardlinking files done ***
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Could not find 'strip'. Not stripping the initramfs.
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: *** Generating early-microcode cpio image ***
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: *** Constructing AuthenticAMD.bin ***
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: *** Store current command line parameters ***
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: Stored kernel commandline:
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: No dracut internal kernel commandline stored in the initramfs
Dec 06 06:46:35 np0005548788.novalocal dracut[1437]: *** Install squash loader ***
Dec 06 06:46:36 np0005548788.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 06:46:36 np0005548788.novalocal dracut[1437]: *** Squashing the files inside the initramfs ***
Dec 06 06:46:37 np0005548788.novalocal dracut[1437]: *** Squashing the files inside the initramfs done ***
Dec 06 06:46:37 np0005548788.novalocal dracut[1437]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Dec 06 06:46:37 np0005548788.novalocal dracut[1437]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Dec 06 06:46:37 np0005548788.novalocal kdumpctl[1136]: kdump: kexec: loaded kdump kernel
Dec 06 06:46:37 np0005548788.novalocal kdumpctl[1136]: kdump: Starting kdump: [OK]
Dec 06 06:46:37 np0005548788.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 06 06:46:37 np0005548788.novalocal systemd[1]: Startup finished in 1.157s (kernel) + 1.978s (initrd) + 15.015s (userspace) = 18.152s.
Dec 06 06:46:55 np0005548788.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 06:48:24 np0005548788.novalocal sshd[4175]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:48:24 np0005548788.novalocal sshd[4175]: Accepted publickey for zuul from 38.102.83.114 port 54718 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 06 06:48:24 np0005548788.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 06 06:48:24 np0005548788.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 06 06:48:24 np0005548788.novalocal systemd-logind[765]: New session 1 of user zuul.
Dec 06 06:48:24 np0005548788.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 06 06:48:24 np0005548788.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Queued start job for default target Main User Target.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Created slice User Application Slice.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Reached target Paths.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Reached target Timers.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Starting D-Bus User Message Bus Socket...
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Starting Create User's Volatile Files and Directories...
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Finished Create User's Volatile Files and Directories.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Listening on D-Bus User Message Bus Socket.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Reached target Sockets.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Reached target Basic System.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Reached target Main User Target.
Dec 06 06:48:24 np0005548788.novalocal systemd[4179]: Startup finished in 114ms.
Dec 06 06:48:24 np0005548788.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 06 06:48:24 np0005548788.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 06 06:48:24 np0005548788.novalocal sshd[4175]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 06:48:25 np0005548788.novalocal python3[4231]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 06:48:34 np0005548788.novalocal python3[4250]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 06:48:42 np0005548788.novalocal python3[4303]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 06:48:43 np0005548788.novalocal python3[4333]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 06 06:48:46 np0005548788.novalocal python3[4349]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:48:47 np0005548788.novalocal python3[4363]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:48:48 np0005548788.novalocal python3[4422]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:48:48 np0005548788.novalocal python3[4463]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765003728.1249325-390-10508191377235/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=00c07cb1874e45b180d5c151333e96b1_id_rsa follow=False checksum=59556e0a2f4b936183817041ae1f59f0f3c92dd9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:48:51 np0005548788.novalocal python3[4536]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:48:51 np0005548788.novalocal python3[4577]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765003730.8454409-485-129026405216558/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=00c07cb1874e45b180d5c151333e96b1_id_rsa.pub follow=False checksum=2b77fe3fb3441abe077d8d93b68745bd8f418f92 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:48:53 np0005548788.novalocal python3[4605]: ansible-ping Invoked with data=pong
Dec 06 06:48:55 np0005548788.novalocal python3[4619]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 06:48:58 np0005548788.novalocal python3[4672]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 06 06:49:01 np0005548788.novalocal python3[4694]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:01 np0005548788.novalocal python3[4708]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:02 np0005548788.novalocal python3[4722]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:03 np0005548788.novalocal python3[4736]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:04 np0005548788.novalocal python3[4750]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:04 np0005548788.novalocal python3[4764]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:06 np0005548788.novalocal sudo[4779]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umrsbitlsxvqiscqyfvmbkflklhownku ; /usr/bin/python3
Dec 06 06:49:06 np0005548788.novalocal sudo[4779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:07 np0005548788.novalocal python3[4781]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:07 np0005548788.novalocal sudo[4779]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:08 np0005548788.novalocal sudo[4827]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwhuozvjjroduyklcndyqqmjimbrbqev ; /usr/bin/python3
Dec 06 06:49:08 np0005548788.novalocal sudo[4827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:08 np0005548788.novalocal python3[4829]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:08 np0005548788.novalocal sudo[4827]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:08 np0005548788.novalocal sudo[4870]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvtbflqjajuwxnrtqlfkkfhyftfmtefv ; /usr/bin/python3
Dec 06 06:49:08 np0005548788.novalocal sudo[4870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:08 np0005548788.novalocal python3[4872]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765003748.2656374-98-173038063422752/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:08 np0005548788.novalocal sudo[4870]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:16 np0005548788.novalocal python3[4900]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:16 np0005548788.novalocal python3[4914]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:16 np0005548788.novalocal python3[4928]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:17 np0005548788.novalocal python3[4942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:17 np0005548788.novalocal python3[4956]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:17 np0005548788.novalocal python3[4970]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:17 np0005548788.novalocal python3[4984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:18 np0005548788.novalocal python3[4998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:18 np0005548788.novalocal python3[5012]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:18 np0005548788.novalocal python3[5026]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:18 np0005548788.novalocal python3[5040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:19 np0005548788.novalocal python3[5054]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:19 np0005548788.novalocal python3[5068]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:19 np0005548788.novalocal python3[5082]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:20 np0005548788.novalocal python3[5096]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:20 np0005548788.novalocal python3[5110]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:20 np0005548788.novalocal python3[5124]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:20 np0005548788.novalocal python3[5138]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:21 np0005548788.novalocal python3[5152]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:21 np0005548788.novalocal python3[5166]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:21 np0005548788.novalocal python3[5180]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:21 np0005548788.novalocal python3[5194]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:22 np0005548788.novalocal python3[5208]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:22 np0005548788.novalocal python3[5222]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:22 np0005548788.novalocal python3[5236]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:22 np0005548788.novalocal python3[5250]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:24 np0005548788.novalocal sudo[5264]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wubppefxpmdvrtoizrtralerkppgfwwq ; /usr/bin/python3
Dec 06 06:49:24 np0005548788.novalocal sudo[5264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:25 np0005548788.novalocal python3[5266]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 06 06:49:25 np0005548788.novalocal systemd[1]: Starting Time & Date Service...
Dec 06 06:49:25 np0005548788.novalocal systemd[1]: Started Time & Date Service.
Dec 06 06:49:26 np0005548788.novalocal systemd-timedated[5268]: Changed time zone to 'UTC' (UTC).
Dec 06 06:49:26 np0005548788.novalocal sudo[5264]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:26 np0005548788.novalocal sudo[5285]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-liwrrakxfpnbokinnuhsjrdfkcpvfpey ; /usr/bin/python3
Dec 06 06:49:26 np0005548788.novalocal sudo[5285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:26 np0005548788.novalocal python3[5287]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:26 np0005548788.novalocal sudo[5285]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:28 np0005548788.novalocal python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:28 np0005548788.novalocal python3[5374]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765003767.8318548-490-170644905658474/source _original_basename=tmp5i2kca7u follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:29 np0005548788.novalocal python3[5434]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:30 np0005548788.novalocal python3[5475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765003769.4530132-582-213367783804023/source _original_basename=tmp9l9lgzp5 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:31 np0005548788.novalocal sudo[5535]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpxhvdaakcuktzxbuawfptucksshmmmt ; /usr/bin/python3
Dec 06 06:49:31 np0005548788.novalocal sudo[5535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:31 np0005548788.novalocal python3[5537]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:31 np0005548788.novalocal sudo[5535]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:31 np0005548788.novalocal sudo[5578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zybyphiaxfiknvlmzrwknumncusntnov ; /usr/bin/python3
Dec 06 06:49:31 np0005548788.novalocal sudo[5578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:32 np0005548788.novalocal python3[5580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765003771.491039-724-212617820439926/source _original_basename=tmpy7mfd0y_ follow=False checksum=8c2ca5bc92adf57e5f110fdd685e6d08e9897451 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:32 np0005548788.novalocal sudo[5578]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:33 np0005548788.novalocal python3[5608]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:49:33 np0005548788.novalocal python3[5624]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:49:34 np0005548788.novalocal sudo[5672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqctejphhtkpxhatlbehoazdjjzysdlb ; /usr/bin/python3
Dec 06 06:49:34 np0005548788.novalocal sudo[5672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:34 np0005548788.novalocal python3[5674]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:34 np0005548788.novalocal sudo[5672]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:34 np0005548788.novalocal sudo[5715]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apaqxbpgguzhguqwwzkzqfkkespqdebe ; /usr/bin/python3
Dec 06 06:49:34 np0005548788.novalocal sudo[5715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:35 np0005548788.novalocal python3[5717]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765003774.4121156-847-257341831791375/source _original_basename=tmpp4s_9iix follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:35 np0005548788.novalocal sudo[5715]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:36 np0005548788.novalocal sudo[5746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfvpwtzkcfsagldlhylqrahnnuwcdyvp ; /usr/bin/python3
Dec 06 06:49:36 np0005548788.novalocal sudo[5746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:36 np0005548788.novalocal python3[5748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-8d81-2216-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:49:36 np0005548788.novalocal sudo[5746]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:37 np0005548788.novalocal python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-8d81-2216-000000000024-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 06 06:49:39 np0005548788.novalocal python3[5784]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:56 np0005548788.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 06:50:37 np0005548788.novalocal sudo[5802]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnsczphquqgzrwzyzflwrgdbqwlhcdzl ; /usr/bin/python3
Dec 06 06:50:37 np0005548788.novalocal sudo[5802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:50:37 np0005548788.novalocal python3[5804]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:50:37 np0005548788.novalocal sudo[5802]: pam_unix(sudo:session): session closed for user root
Dec 06 06:51:21 np0005548788.novalocal systemd[4179]: Starting Mark boot as successful...
Dec 06 06:51:21 np0005548788.novalocal systemd[4179]: Finished Mark boot as successful.
Dec 06 06:51:37 np0005548788.novalocal sshd[4188]: Received disconnect from 38.102.83.114 port 54718:11: disconnected by user
Dec 06 06:51:37 np0005548788.novalocal sshd[4188]: Disconnected from user zuul 38.102.83.114 port 54718
Dec 06 06:51:37 np0005548788.novalocal sshd[4175]: pam_unix(sshd:session): session closed for user zuul
Dec 06 06:51:37 np0005548788.novalocal systemd-logind[765]: Session 1 logged out. Waiting for processes to exit.
Dec 06 06:51:46 np0005548788.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Dec 06 06:51:46 np0005548788.novalocal systemd[1]: efi.mount: Deactivated successfully.
Dec 06 06:51:46 np0005548788.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Dec 06 06:53:54 np0005548788.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Dec 06 06:53:54 np0005548788.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3303] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 06 06:53:54 np0005548788.novalocal systemd-udevd[5812]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3416] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3442] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3445] device (eth1): carrier: link connected
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3448] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 06 06:53:54 np0005548788.novalocal systemd[4179]: Created slice User Background Tasks Slice.
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3452] policy: auto-activating connection 'Wired connection 1' (9fd7296e-cdb3-34f8-8a92-f80dbc8f7a1b)
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3456] device (eth1): Activation: starting connection 'Wired connection 1' (9fd7296e-cdb3-34f8-8a92-f80dbc8f7a1b)
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3457] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3460] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3465] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 06 06:53:54 np0005548788.novalocal NetworkManager[790]: <info>  [1765004034.3467] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:53:54 np0005548788.novalocal systemd[4179]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 06:53:54 np0005548788.novalocal systemd[4179]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 06:53:54 np0005548788.novalocal sshd[5815]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:53:55 np0005548788.novalocal sshd[5815]: Accepted publickey for zuul from 38.102.83.114 port 46688 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 06:53:55 np0005548788.novalocal systemd-logind[765]: New session 3 of user zuul.
Dec 06 06:53:55 np0005548788.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 06 06:53:55 np0005548788.novalocal sshd[5815]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 06:53:55 np0005548788.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Dec 06 06:53:55 np0005548788.novalocal python3[5832]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-1ece-0164-00000000039b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:54:08 np0005548788.novalocal sudo[5881]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhkrqdiglioiitofhginwpctqfyztrec ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:54:08 np0005548788.novalocal sudo[5881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:54:08 np0005548788.novalocal python3[5883]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:54:08 np0005548788.novalocal sudo[5881]: pam_unix(sudo:session): session closed for user root
Dec 06 06:54:08 np0005548788.novalocal sudo[5924]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltjkqrcpoftnxqkyljmzekorryowqgwk ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:54:08 np0005548788.novalocal sudo[5924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:54:08 np0005548788.novalocal python3[5926]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765004048.33794-435-36771268720574/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=ede8644efde08afc76068132d653f04a45a21266 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:54:09 np0005548788.novalocal sudo[5924]: pam_unix(sudo:session): session closed for user root
Dec 06 06:54:09 np0005548788.novalocal sudo[5954]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfhfpzhkdtuuupxqxetxidhdyuewmqfo ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:54:09 np0005548788.novalocal sudo[5954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:54:09 np0005548788.novalocal python3[5956]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Stopping Network Manager...
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[790]: <info>  [1765004049.5696] caught SIGTERM, shutting down normally.
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[790]: <info>  [1765004049.5833] dhcp4 (eth0): canceled DHCP transaction
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[790]: <info>  [1765004049.5834] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[790]: <info>  [1765004049.5834] dhcp4 (eth0): state changed no lease
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[790]: <info>  [1765004049.5837] manager: NetworkManager state is now CONNECTING
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[790]: <info>  [1765004049.5934] dhcp4 (eth1): canceled DHCP transaction
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[790]: <info>  [1765004049.5934] dhcp4 (eth1): state changed no lease
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[790]: <info>  [1765004049.6012] exiting (success)
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Stopped Network Manager.
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: NetworkManager.service: Consumed 2.314s CPU time.
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Starting Network Manager...
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.6601] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:a7a8faba-4bf4-4554-a65c-09a9226535fb)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.6602] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Started Network Manager.
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.6623] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.6697] manager[0x55cf9f8ef090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Starting Hostname Service...
Dec 06 06:54:09 np0005548788.novalocal sudo[5954]: pam_unix(sudo:session): session closed for user root
Dec 06 06:54:09 np0005548788.novalocal systemd[1]: Started Hostname Service.
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7530] hostname: hostname: using hostnamed
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7531] hostname: static hostname changed from (none) to "np0005548788.novalocal"
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7537] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7546] manager[0x55cf9f8ef090]: rfkill: Wi-Fi hardware radio set enabled
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7546] manager[0x55cf9f8ef090]: rfkill: WWAN hardware radio set enabled
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7600] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7601] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7604] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7608] manager: Networking is enabled by state file
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7619] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7620] settings: Loaded settings plugin: keyfile (internal)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7693] dhcp: init: Using DHCP client 'internal'
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7697] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7708] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7718] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7734] device (lo): Activation: starting connection 'lo' (ba27b317-3333-4514-8ab8-e5b47b3963f3)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7746] device (eth0): carrier: link connected
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7752] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7762] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7763] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7777] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7790] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7801] device (eth1): carrier: link connected
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7807] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7818] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (9fd7296e-cdb3-34f8-8a92-f80dbc8f7a1b) (indicated)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7818] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7830] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7842] device (eth1): Activation: starting connection 'Wired connection 1' (9fd7296e-cdb3-34f8-8a92-f80dbc8f7a1b)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7872] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7880] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7884] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7889] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7897] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7901] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7908] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7942] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7953] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7959] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7971] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.7976] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8000] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8009] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8019] device (lo): Activation: successful, device activated.
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8030] dhcp4 (eth0): state changed new lease, address=38.102.83.97
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8037] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8148] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8186] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8187] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8191] manager: NetworkManager state is now CONNECTED_SITE
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8195] device (eth0): Activation: successful, device activated.
Dec 06 06:54:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004049.8198] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 06 06:54:09 np0005548788.novalocal python3[6028]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-1ece-0164-000000000120-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:54:19 np0005548788.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 06:54:39 np0005548788.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 06:54:54 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004094.8198] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:54 np0005548788.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 06:54:54 np0005548788.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 06:54:54 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004094.8438] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:54 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004094.8441] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:54 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004094.8450] device (eth1): Activation: successful, device activated.
Dec 06 06:54:54 np0005548788.novalocal NetworkManager[5968]: <info>  [1765004094.8459] manager: startup complete
Dec 06 06:54:54 np0005548788.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 06 06:55:04 np0005548788.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 06:55:10 np0005548788.novalocal sshd[5818]: Received disconnect from 38.102.83.114 port 46688:11: disconnected by user
Dec 06 06:55:10 np0005548788.novalocal sshd[5818]: Disconnected from user zuul 38.102.83.114 port 46688
Dec 06 06:55:10 np0005548788.novalocal sshd[5815]: pam_unix(sshd:session): session closed for user zuul
Dec 06 06:55:10 np0005548788.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 06 06:55:10 np0005548788.novalocal systemd[1]: session-3.scope: Consumed 1.448s CPU time.
Dec 06 06:55:10 np0005548788.novalocal systemd-logind[765]: Session 3 logged out. Waiting for processes to exit.
Dec 06 06:55:10 np0005548788.novalocal systemd-logind[765]: Removed session 3.
Dec 06 06:56:17 np0005548788.novalocal sshd[6058]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:56:17 np0005548788.novalocal sshd[6058]: Accepted publickey for zuul from 38.102.83.114 port 52244 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 06:56:17 np0005548788.novalocal systemd-logind[765]: New session 4 of user zuul.
Dec 06 06:56:17 np0005548788.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 06 06:56:17 np0005548788.novalocal sshd[6058]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 06:56:18 np0005548788.novalocal sudo[6107]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phialrxlmrutpfrxvtufxnpilqweilwb ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:56:18 np0005548788.novalocal sudo[6107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:56:18 np0005548788.novalocal python3[6109]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:56:18 np0005548788.novalocal sudo[6107]: pam_unix(sudo:session): session closed for user root
Dec 06 06:56:18 np0005548788.novalocal sudo[6150]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqtsqdhemhbdziruuhkgyoskebynzmka ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:56:18 np0005548788.novalocal sudo[6150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:56:18 np0005548788.novalocal python3[6152]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765004177.9057024-628-76747397548864/source _original_basename=tmpudg7kout follow=False checksum=301833a7e04d955921816dd6c79e775f1a8a19aa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:56:18 np0005548788.novalocal sudo[6150]: pam_unix(sudo:session): session closed for user root
Dec 06 06:56:22 np0005548788.novalocal sshd[6058]: pam_unix(sshd:session): session closed for user zuul
Dec 06 06:56:22 np0005548788.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 06 06:56:22 np0005548788.novalocal systemd-logind[765]: Session 4 logged out. Waiting for processes to exit.
Dec 06 06:56:22 np0005548788.novalocal systemd-logind[765]: Removed session 4.
Dec 06 07:01:01 np0005548788.novalocal CROND[6170]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 07:01:01 np0005548788.novalocal run-parts[6173]: (/etc/cron.hourly) starting 0anacron
Dec 06 07:01:01 np0005548788.novalocal anacron[6181]: Anacron started on 2025-12-06
Dec 06 07:01:01 np0005548788.novalocal anacron[6181]: Will run job `cron.daily' in 13 min.
Dec 06 07:01:01 np0005548788.novalocal anacron[6181]: Will run job `cron.weekly' in 33 min.
Dec 06 07:01:01 np0005548788.novalocal anacron[6181]: Will run job `cron.monthly' in 53 min.
Dec 06 07:01:01 np0005548788.novalocal anacron[6181]: Jobs will be executed sequentially
Dec 06 07:01:01 np0005548788.novalocal run-parts[6183]: (/etc/cron.hourly) finished 0anacron
Dec 06 07:01:01 np0005548788.novalocal CROND[6169]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 07:01:21 np0005548788.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Dec 06 07:01:21 np0005548788.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 06 07:01:21 np0005548788.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Dec 06 07:01:21 np0005548788.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 06 07:04:29 np0005548788.novalocal sshd[6189]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:04:29 np0005548788.novalocal sshd[6189]: Accepted publickey for zuul from 38.102.83.114 port 40870 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:04:29 np0005548788.novalocal systemd-logind[765]: New session 5 of user zuul.
Dec 06 07:04:29 np0005548788.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 06 07:04:29 np0005548788.novalocal sshd[6189]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:04:29 np0005548788.novalocal sudo[6206]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlvtxwgrjyvyjtxgavbnjdxruxwunjic ; /usr/bin/python3
Dec 06 07:04:29 np0005548788.novalocal sudo[6206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:29 np0005548788.novalocal python3[6208]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-e5b2-9de0-000000001d10-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:30 np0005548788.novalocal sudo[6206]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:41 np0005548788.novalocal sudo[6227]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbvcdhrqdgopphkwnbrvijxxglsblpuv ; /usr/bin/python3
Dec 06 07:04:41 np0005548788.novalocal sudo[6227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:41 np0005548788.novalocal python3[6229]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:41 np0005548788.novalocal sudo[6227]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:41 np0005548788.novalocal sudo[6243]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clwhppirwnspmojtorbukestaecorhaw ; /usr/bin/python3
Dec 06 07:04:41 np0005548788.novalocal sudo[6243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:41 np0005548788.novalocal python3[6245]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:41 np0005548788.novalocal sudo[6243]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:41 np0005548788.novalocal sudo[6259]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xigrgdxszssttcipzmrwonduvgkjibfr ; /usr/bin/python3
Dec 06 07:04:41 np0005548788.novalocal sudo[6259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:41 np0005548788.novalocal python3[6261]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:41 np0005548788.novalocal sudo[6259]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:42 np0005548788.novalocal sudo[6275]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xveixljrtgtrpwdjeoolhgwgryqnhtmd ; /usr/bin/python3
Dec 06 07:04:42 np0005548788.novalocal sudo[6275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:42 np0005548788.novalocal python3[6277]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:42 np0005548788.novalocal sudo[6275]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:42 np0005548788.novalocal sudo[6291]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmdatjildcswovbqrgdmcbuctpzhmkiq ; /usr/bin/python3
Dec 06 07:04:42 np0005548788.novalocal sudo[6291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:43 np0005548788.novalocal python3[6293]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:43 np0005548788.novalocal sudo[6291]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:44 np0005548788.novalocal sudo[6339]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltgrvnbhljsedjiiiubdxfhfsoxpgeyr ; /usr/bin/python3
Dec 06 07:04:44 np0005548788.novalocal sudo[6339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:44 np0005548788.novalocal python3[6341]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:04:44 np0005548788.novalocal sudo[6339]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:44 np0005548788.novalocal sudo[6382]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbetnrjegbbqmvovkgbfxaxmyeeihfqa ; /usr/bin/python3
Dec 06 07:04:44 np0005548788.novalocal sudo[6382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:44 np0005548788.novalocal python3[6384]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765004684.062176-645-64177874453257/source _original_basename=tmp6beir_n3 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:44 np0005548788.novalocal sudo[6382]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:46 np0005548788.novalocal sudo[6412]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnbnopldsawvtnlophjguaexbwprycjs ; /usr/bin/python3
Dec 06 07:04:46 np0005548788.novalocal sudo[6412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:46 np0005548788.novalocal python3[6414]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 07:04:46 np0005548788.novalocal systemd[1]: Reloading.
Dec 06 07:04:46 np0005548788.novalocal systemd-rc-local-generator[6432]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:04:46 np0005548788.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:04:46 np0005548788.novalocal sudo[6412]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:47 np0005548788.novalocal sudo[6458]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwsbbhcqcyfabqmegxqjcflsikhazjrs ; /usr/bin/python3
Dec 06 07:04:47 np0005548788.novalocal sudo[6458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:48 np0005548788.novalocal python3[6460]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 06 07:04:48 np0005548788.novalocal sudo[6458]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:49 np0005548788.novalocal sudo[6474]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifnqvuchhwxugkkymnmbypenrxfmgddd ; /usr/bin/python3
Dec 06 07:04:49 np0005548788.novalocal sudo[6474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:49 np0005548788.novalocal python3[6476]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:49 np0005548788.novalocal sudo[6474]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:49 np0005548788.novalocal sudo[6492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofepnrmejzcfwwvkxtpjzpuxodqhxnqq ; /usr/bin/python3
Dec 06 07:04:49 np0005548788.novalocal sudo[6492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:49 np0005548788.novalocal python3[6494]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:49 np0005548788.novalocal sudo[6492]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:49 np0005548788.novalocal sudo[6510]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycnzfulndgiwslocdslkktckcvrbhsfk ; /usr/bin/python3
Dec 06 07:04:49 np0005548788.novalocal sudo[6510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:50 np0005548788.novalocal python3[6512]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:50 np0005548788.novalocal sudo[6510]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:50 np0005548788.novalocal sudo[6528]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xggmaurzvjcwdqymcbbbrxmhzrrwdbug ; /usr/bin/python3
Dec 06 07:04:50 np0005548788.novalocal sudo[6528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:50 np0005548788.novalocal python3[6530]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:50 np0005548788.novalocal sudo[6528]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:51 np0005548788.novalocal python3[6547]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-e5b2-9de0-000000001d17-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:05:02 np0005548788.novalocal python3[6567]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 07:05:05 np0005548788.novalocal sshd[6189]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:05:05 np0005548788.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 06 07:05:05 np0005548788.novalocal systemd[1]: session-5.scope: Consumed 3.945s CPU time.
Dec 06 07:05:05 np0005548788.novalocal systemd-logind[765]: Session 5 logged out. Waiting for processes to exit.
Dec 06 07:05:05 np0005548788.novalocal systemd-logind[765]: Removed session 5.
Dec 06 07:06:59 np0005548788.novalocal sshd[6573]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:06:59 np0005548788.novalocal sshd[6573]: Accepted publickey for zuul from 38.102.83.114 port 55688 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:06:59 np0005548788.novalocal systemd-logind[765]: New session 6 of user zuul.
Dec 06 07:06:59 np0005548788.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 06 07:06:59 np0005548788.novalocal sshd[6573]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:06:59 np0005548788.novalocal sudo[6590]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rifkbaqklglpynsguupwwaychbxehfnp ; /usr/bin/python3
Dec 06 07:06:59 np0005548788.novalocal sudo[6590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:06:59 np0005548788.novalocal systemd[1]: Starting RHSM dbus service...
Dec 06 07:07:00 np0005548788.novalocal systemd[1]: Started RHSM dbus service.
Dec 06 07:07:00 np0005548788.novalocal rhsm-service[6597]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:00 np0005548788.novalocal rhsm-service[6597]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:00 np0005548788.novalocal rhsm-service[6597]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:00 np0005548788.novalocal rhsm-service[6597]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005548788.novalocal (5e361fc3-fe07-43ea-9bfe-d0f84e52c70c)
Dec 06 07:07:01 np0005548788.novalocal subscription-manager[6597]: Registered system with identity: 5e361fc3-fe07-43ea-9bfe-d0f84e52c70c
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]:  INFO [subscription_manager.entcertlib:131] certs updated:
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]: Total updates: 1
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]: Found (local) serial# []
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]: Expected (UEP) serial# [1318196540583554871]
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]: Added (new)
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]:   [sn:1318196540583554871 ( Content Access,) @ /etc/pki/entitlement/1318196540583554871.pem]
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]: Deleted (rogue):
Dec 06 07:07:01 np0005548788.novalocal rhsm-service[6597]:   <NONE>
Dec 06 07:07:01 np0005548788.novalocal subscription-manager[6597]: Added subscription for 'Content Access' contract 'None'
Dec 06 07:07:01 np0005548788.novalocal subscription-manager[6597]: Added subscription for product ' Content Access'
Dec 06 07:07:03 np0005548788.novalocal rhsm-service[6597]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:03 np0005548788.novalocal rhsm-service[6597]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:03 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:03 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:03 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:03 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:03 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:04 np0005548788.novalocal sudo[6590]: pam_unix(sudo:session): session closed for user root
Dec 06 07:07:11 np0005548788.novalocal python3[6688]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-ea42-bf82-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:07:13 np0005548788.novalocal sudo[6705]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nelvdtswrjewamkrbcrowgirbmpfdmpv ; /usr/bin/python3
Dec 06 07:07:13 np0005548788.novalocal sudo[6705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:07:13 np0005548788.novalocal python3[6707]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:07:42 np0005548788.novalocal setsebool[6782]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 06 07:07:42 np0005548788.novalocal setsebool[6782]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 06 07:07:51 np0005548788.novalocal kernel: SELinux:  Converting 407 SID table entries...
Dec 06 07:07:51 np0005548788.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 07:07:51 np0005548788.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 06 07:07:51 np0005548788.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 07:07:51 np0005548788.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 06 07:07:51 np0005548788.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 07:07:51 np0005548788.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 07:07:51 np0005548788.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 07:08:03 np0005548788.novalocal dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Dec 06 07:08:03 np0005548788.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 07:08:03 np0005548788.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 06 07:08:03 np0005548788.novalocal systemd[1]: Reloading.
Dec 06 07:08:03 np0005548788.novalocal systemd-rc-local-generator[7643]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:08:03 np0005548788.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:08:04 np0005548788.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 07:08:05 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:08:05 np0005548788.novalocal sudo[6705]: pam_unix(sudo:session): session closed for user root
Dec 06 07:08:12 np0005548788.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 07:08:12 np0005548788.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 06 07:08:12 np0005548788.novalocal systemd[1]: man-db-cache-update.service: Consumed 11.142s CPU time.
Dec 06 07:08:12 np0005548788.novalocal systemd[1]: run-r2c84733f3388471bbed56ce34fc55757.service: Deactivated successfully.
Dec 06 07:08:58 np0005548788.novalocal sudo[18372]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yaibhqrhrcuelnavlfqfeljtgmncfeqh ; /usr/bin/python3
Dec 06 07:08:58 np0005548788.novalocal sudo[18372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:08:59 np0005548788.novalocal podman[18375]: 2025-12-06 07:08:59.27457015 +0000 UTC m=+0.142715966 system refresh
Dec 06 07:08:59 np0005548788.novalocal sudo[18372]: pam_unix(sudo:session): session closed for user root
Dec 06 07:09:00 np0005548788.novalocal systemd[4179]: Starting D-Bus User Message Bus...
Dec 06 07:09:00 np0005548788.novalocal dbus-broker-launch[18432]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 06 07:09:00 np0005548788.novalocal dbus-broker-launch[18432]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 06 07:09:00 np0005548788.novalocal systemd[4179]: Started D-Bus User Message Bus.
Dec 06 07:09:00 np0005548788.novalocal dbus-broker-lau[18432]: Ready
Dec 06 07:09:00 np0005548788.novalocal systemd[4179]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Dec 06 07:09:00 np0005548788.novalocal systemd[4179]: Created slice Slice /user.
Dec 06 07:09:00 np0005548788.novalocal systemd[4179]: podman-18416.scope: unit configures an IP firewall, but not running as root.
Dec 06 07:09:00 np0005548788.novalocal systemd[4179]: (This warning is only shown for the first unit using IP firewalling.)
Dec 06 07:09:00 np0005548788.novalocal systemd[4179]: Started podman-18416.scope.
Dec 06 07:09:00 np0005548788.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:09:00 np0005548788.novalocal systemd[4179]: Started podman-pause-bdaa8868.scope.
Dec 06 07:09:02 np0005548788.novalocal sshd[6573]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:09:02 np0005548788.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Dec 06 07:09:02 np0005548788.novalocal systemd[1]: session-6.scope: Consumed 50.164s CPU time.
Dec 06 07:09:02 np0005548788.novalocal systemd-logind[765]: Session 6 logged out. Waiting for processes to exit.
Dec 06 07:09:02 np0005548788.novalocal systemd-logind[765]: Removed session 6.
Dec 06 07:09:17 np0005548788.novalocal sshd[18436]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548788.novalocal sshd[18437]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548788.novalocal sshd[18438]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548788.novalocal sshd[18439]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548788.novalocal sshd[18440]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548788.novalocal sshd[18436]: Unable to negotiate with 38.102.83.83 port 40266: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 06 07:09:17 np0005548788.novalocal sshd[18438]: Connection closed by 38.102.83.83 port 40228 [preauth]
Dec 06 07:09:17 np0005548788.novalocal sshd[18439]: Connection closed by 38.102.83.83 port 40244 [preauth]
Dec 06 07:09:17 np0005548788.novalocal sshd[18437]: Unable to negotiate with 38.102.83.83 port 40250: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 06 07:09:17 np0005548788.novalocal sshd[18440]: Unable to negotiate with 38.102.83.83 port 40252: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 06 07:09:22 np0005548788.novalocal sshd[18446]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:22 np0005548788.novalocal sshd[18446]: Accepted publickey for zuul from 38.102.83.114 port 39530 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:09:22 np0005548788.novalocal systemd-logind[765]: New session 7 of user zuul.
Dec 06 07:09:22 np0005548788.novalocal systemd[1]: Started Session 7 of User zuul.
Dec 06 07:09:22 np0005548788.novalocal sshd[18446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:09:22 np0005548788.novalocal python3[18463]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEYVtM235X0xWH2FKli0CUGpvCLQnDDtCI4yCYqNdWcGuxt1LThsgCBuwYYpkvH+K5VLRKMEyM949Yu6yQU/mgI= zuul@np0005548782.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:09:23 np0005548788.novalocal sudo[18477]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuyjnghcpfpgjabywkepdedszorddlsd ; /usr/bin/python3
Dec 06 07:09:23 np0005548788.novalocal sudo[18477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:09:23 np0005548788.novalocal python3[18479]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEYVtM235X0xWH2FKli0CUGpvCLQnDDtCI4yCYqNdWcGuxt1LThsgCBuwYYpkvH+K5VLRKMEyM949Yu6yQU/mgI= zuul@np0005548782.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:09:23 np0005548788.novalocal sudo[18477]: pam_unix(sudo:session): session closed for user root
Dec 06 07:09:25 np0005548788.novalocal sshd[18446]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:09:25 np0005548788.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Dec 06 07:09:25 np0005548788.novalocal systemd-logind[765]: Session 7 logged out. Waiting for processes to exit.
Dec 06 07:09:25 np0005548788.novalocal systemd-logind[765]: Removed session 7.
Dec 06 07:10:49 np0005548788.novalocal sshd[18481]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:10:49 np0005548788.novalocal sshd[18481]: Received disconnect from 45.55.249.98 port 47512:11: Bye Bye [preauth]
Dec 06 07:10:49 np0005548788.novalocal sshd[18481]: Disconnected from authenticating user root 45.55.249.98 port 47512 [preauth]
Dec 06 07:10:55 np0005548788.novalocal sshd[18485]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:10:55 np0005548788.novalocal sshd[18486]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:10:55 np0005548788.novalocal sshd[18486]: error: kex_exchange_identification: read: Connection reset by peer
Dec 06 07:10:55 np0005548788.novalocal sshd[18486]: Connection reset by 45.140.17.97 port 30552
Dec 06 07:11:02 np0005548788.novalocal sshd[18487]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:11:02 np0005548788.novalocal sshd[18487]: Accepted publickey for zuul from 38.102.83.114 port 52920 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:11:02 np0005548788.novalocal systemd-logind[765]: New session 8 of user zuul.
Dec 06 07:11:02 np0005548788.novalocal systemd[1]: Started Session 8 of User zuul.
Dec 06 07:11:02 np0005548788.novalocal sshd[18487]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:11:02 np0005548788.novalocal sudo[18504]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtmgkmqgdsnnnmghgtikhgtsjnkdwgbm ; /usr/bin/python3
Dec 06 07:11:02 np0005548788.novalocal sudo[18504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:02 np0005548788.novalocal python3[18506]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:11:03 np0005548788.novalocal sudo[18504]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:03 np0005548788.novalocal sudo[18521]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooxscaqvussdkhosrviiqlvaztanamdr ; /usr/bin/python3
Dec 06 07:11:03 np0005548788.novalocal sudo[18521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:03 np0005548788.novalocal python3[18523]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548788.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 06 07:11:03 np0005548788.novalocal sudo[18521]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:05 np0005548788.novalocal sudo[18571]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cewaiiqkzhyygoyrkplplssduwohceop ; /usr/bin/python3
Dec 06 07:11:05 np0005548788.novalocal sudo[18571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:05 np0005548788.novalocal python3[18573]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:05 np0005548788.novalocal sudo[18571]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:05 np0005548788.novalocal sudo[18614]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnudmznyhbibxapswfrsggoagexxxzoi ; /usr/bin/python3
Dec 06 07:11:05 np0005548788.novalocal sudo[18614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:05 np0005548788.novalocal python3[18616]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765005065.2837627-135-185662226307510/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=00c07cb1874e45b180d5c151333e96b1_id_rsa follow=False checksum=59556e0a2f4b936183817041ae1f59f0f3c92dd9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:05 np0005548788.novalocal sudo[18614]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:07 np0005548788.novalocal sudo[18676]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqjiuqvyrklygnjhtaawqlqrztsapuhq ; /usr/bin/python3
Dec 06 07:11:07 np0005548788.novalocal sudo[18676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:07 np0005548788.novalocal python3[18678]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:07 np0005548788.novalocal sudo[18676]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:07 np0005548788.novalocal sudo[18719]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klgysppeiooqlhvcxorddiviobricrub ; /usr/bin/python3
Dec 06 07:11:07 np0005548788.novalocal sudo[18719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:07 np0005548788.novalocal python3[18721]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765005066.922495-218-206841473449908/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=00c07cb1874e45b180d5c151333e96b1_id_rsa.pub follow=False checksum=2b77fe3fb3441abe077d8d93b68745bd8f418f92 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:07 np0005548788.novalocal sudo[18719]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:09 np0005548788.novalocal sudo[18749]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocsfjaldahkmejalyubtfmjopfvkcfbn ; /usr/bin/python3
Dec 06 07:11:09 np0005548788.novalocal sudo[18749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:09 np0005548788.novalocal python3[18751]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:09 np0005548788.novalocal sudo[18749]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:10 np0005548788.novalocal python3[18797]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:11 np0005548788.novalocal python3[18813]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpkn4lpnm_ recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:12 np0005548788.novalocal python3[18873]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:12 np0005548788.novalocal python3[18889]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpw1gm6s6c recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:13 np0005548788.novalocal python3[18949]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:14 np0005548788.novalocal python3[18965]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpka9ddu68 recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:14 np0005548788.novalocal sshd[18487]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:11:14 np0005548788.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Dec 06 07:11:14 np0005548788.novalocal systemd[1]: session-8.scope: Consumed 3.711s CPU time.
Dec 06 07:11:14 np0005548788.novalocal systemd-logind[765]: Session 8 logged out. Waiting for processes to exit.
Dec 06 07:11:14 np0005548788.novalocal systemd-logind[765]: Removed session 8.
Dec 06 07:11:23 np0005548788.novalocal sshd[18980]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:11:26 np0005548788.novalocal sshd[18980]: Received disconnect from 45.78.222.109 port 53282:11: Bye Bye [preauth]
Dec 06 07:11:26 np0005548788.novalocal sshd[18980]: Disconnected from authenticating user root 45.78.222.109 port 53282 [preauth]
Dec 06 07:12:14 np0005548788.novalocal sshd[18982]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:12:16 np0005548788.novalocal sshd[18982]: Received disconnect from 103.52.114.250 port 32790:11: Bye Bye [preauth]
Dec 06 07:12:16 np0005548788.novalocal sshd[18982]: Disconnected from authenticating user root 103.52.114.250 port 32790 [preauth]
Dec 06 07:12:31 np0005548788.novalocal sshd[18984]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:12:33 np0005548788.novalocal sshd[18986]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:12:34 np0005548788.novalocal sshd[18986]: Received disconnect from 102.140.97.134 port 43592:11: Bye Bye [preauth]
Dec 06 07:12:34 np0005548788.novalocal sshd[18986]: Disconnected from authenticating user root 102.140.97.134 port 43592 [preauth]
Dec 06 07:12:56 np0005548788.novalocal sshd[18988]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:13:01 np0005548788.novalocal sshd[18990]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:13:02 np0005548788.novalocal sshd[18990]: Received disconnect from 151.36.70.253 port 31026:11: Bye Bye [preauth]
Dec 06 07:13:02 np0005548788.novalocal sshd[18990]: Disconnected from authenticating user root 151.36.70.253 port 31026 [preauth]
Dec 06 07:13:16 np0005548788.novalocal sshd[18992]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:13:27 np0005548788.novalocal sshd[18993]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:13:28 np0005548788.novalocal sshd[18993]: Accepted publickey for zuul from 38.102.83.83 port 47330 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:13:28 np0005548788.novalocal systemd-logind[765]: New session 9 of user zuul.
Dec 06 07:13:28 np0005548788.novalocal systemd[1]: Started Session 9 of User zuul.
Dec 06 07:13:28 np0005548788.novalocal sshd[18993]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:13:28 np0005548788.novalocal python3[19039]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:14:01 np0005548788.novalocal anacron[6181]: Job `cron.daily' started
Dec 06 07:14:01 np0005548788.novalocal anacron[6181]: Job `cron.daily' terminated
Dec 06 07:14:31 np0005548788.novalocal sshd[18984]: fatal: Timeout before authentication for 121.204.171.142 port 39512
Dec 06 07:14:56 np0005548788.novalocal sshd[18988]: fatal: Timeout before authentication for 36.104.144.114 port 54032
Dec 06 07:15:11 np0005548788.novalocal sshd[19044]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:15:11 np0005548788.novalocal sshd[19044]: Received disconnect from 45.55.249.98 port 50902:11: Bye Bye [preauth]
Dec 06 07:15:11 np0005548788.novalocal sshd[19044]: Disconnected from authenticating user root 45.55.249.98 port 50902 [preauth]
Dec 06 07:15:16 np0005548788.novalocal sshd[18992]: fatal: Timeout before authentication for 117.50.119.17 port 39180
Dec 06 07:15:42 np0005548788.novalocal sshd[19046]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:15:44 np0005548788.novalocal sshd[19046]: Received disconnect from 103.52.114.250 port 56098:11: Bye Bye [preauth]
Dec 06 07:15:44 np0005548788.novalocal sshd[19046]: Disconnected from authenticating user root 103.52.114.250 port 56098 [preauth]
Dec 06 07:15:45 np0005548788.novalocal sshd[19048]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:15:48 np0005548788.novalocal sshd[19048]: Received disconnect from 45.78.222.109 port 45048:11: Bye Bye [preauth]
Dec 06 07:15:48 np0005548788.novalocal sshd[19048]: Disconnected from authenticating user root 45.78.222.109 port 45048 [preauth]
Dec 06 07:15:54 np0005548788.novalocal sshd[19050]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:15:56 np0005548788.novalocal sshd[19050]: Received disconnect from 102.140.97.134 port 51674:11: Bye Bye [preauth]
Dec 06 07:15:56 np0005548788.novalocal sshd[19050]: Disconnected from authenticating user root 102.140.97.134 port 51674 [preauth]
Dec 06 07:16:14 np0005548788.novalocal sshd[19052]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:16:14 np0005548788.novalocal sshd[19052]: Received disconnect from 45.55.249.98 port 48224:11: Bye Bye [preauth]
Dec 06 07:16:14 np0005548788.novalocal sshd[19052]: Disconnected from authenticating user root 45.55.249.98 port 48224 [preauth]
Dec 06 07:16:25 np0005548788.novalocal sshd[19054]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:16:44 np0005548788.novalocal sshd[19056]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:16:45 np0005548788.novalocal sshd[19056]: Received disconnect from 151.36.70.253 port 31618:11: Bye Bye [preauth]
Dec 06 07:16:45 np0005548788.novalocal sshd[19056]: Disconnected from authenticating user root 151.36.70.253 port 31618 [preauth]
Dec 06 07:17:13 np0005548788.novalocal sshd[19058]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:17:15 np0005548788.novalocal sshd[19058]: Received disconnect from 103.52.114.250 port 37292:11: Bye Bye [preauth]
Dec 06 07:17:15 np0005548788.novalocal sshd[19058]: Disconnected from authenticating user root 103.52.114.250 port 37292 [preauth]
Dec 06 07:17:15 np0005548788.novalocal sshd[19060]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:17:15 np0005548788.novalocal sshd[19060]: Received disconnect from 45.55.249.98 port 59912:11: Bye Bye [preauth]
Dec 06 07:17:15 np0005548788.novalocal sshd[19060]: Disconnected from authenticating user root 45.55.249.98 port 59912 [preauth]
Dec 06 07:17:33 np0005548788.novalocal sshd[19062]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:17:34 np0005548788.novalocal sshd[19062]: Received disconnect from 151.36.70.253 port 31291:11: Bye Bye [preauth]
Dec 06 07:17:34 np0005548788.novalocal sshd[19062]: Disconnected from authenticating user root 151.36.70.253 port 31291 [preauth]
Dec 06 07:18:16 np0005548788.novalocal sshd[19064]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:18:16 np0005548788.novalocal sshd[19064]: Received disconnect from 45.55.249.98 port 44178:11: Bye Bye [preauth]
Dec 06 07:18:16 np0005548788.novalocal sshd[19064]: Disconnected from authenticating user root 45.55.249.98 port 44178 [preauth]
Dec 06 07:18:20 np0005548788.novalocal sshd[19066]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:18:23 np0005548788.novalocal sshd[19066]: Received disconnect from 102.140.97.134 port 37196:11: Bye Bye [preauth]
Dec 06 07:18:23 np0005548788.novalocal sshd[19066]: Disconnected from authenticating user root 102.140.97.134 port 37196 [preauth]
Dec 06 07:18:24 np0005548788.novalocal sshd[19068]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:18:25 np0005548788.novalocal sshd[19070]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:18:25 np0005548788.novalocal sshd[19054]: fatal: Timeout before authentication for 180.76.134.56 port 51382
Dec 06 07:18:25 np0005548788.novalocal sshd[19068]: Received disconnect from 151.36.70.253 port 31647:11: Bye Bye [preauth]
Dec 06 07:18:25 np0005548788.novalocal sshd[19068]: Disconnected from authenticating user root 151.36.70.253 port 31647 [preauth]
Dec 06 07:18:28 np0005548788.novalocal sshd[18996]: Received disconnect from 38.102.83.83 port 47330:11: disconnected by user
Dec 06 07:18:28 np0005548788.novalocal sshd[18996]: Disconnected from user zuul 38.102.83.83 port 47330
Dec 06 07:18:28 np0005548788.novalocal sshd[18993]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:18:28 np0005548788.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Dec 06 07:18:28 np0005548788.novalocal systemd-logind[765]: Session 9 logged out. Waiting for processes to exit.
Dec 06 07:18:28 np0005548788.novalocal systemd-logind[765]: Removed session 9.
Dec 06 07:18:35 np0005548788.novalocal sshd[19070]: Received disconnect from 45.78.222.109 port 47326:11: Bye Bye [preauth]
Dec 06 07:18:35 np0005548788.novalocal sshd[19070]: Disconnected from authenticating user root 45.78.222.109 port 47326 [preauth]
Dec 06 07:18:55 np0005548788.novalocal sshd[19074]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:18:58 np0005548788.novalocal sshd[19074]: Received disconnect from 103.52.114.250 port 36946:11: Bye Bye [preauth]
Dec 06 07:18:58 np0005548788.novalocal sshd[19074]: Disconnected from authenticating user root 103.52.114.250 port 36946 [preauth]
Dec 06 07:19:22 np0005548788.novalocal sshd[19077]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:19:22 np0005548788.novalocal sshd[19077]: Received disconnect from 45.55.249.98 port 35470:11: Bye Bye [preauth]
Dec 06 07:19:22 np0005548788.novalocal sshd[19077]: Disconnected from authenticating user root 45.55.249.98 port 35470 [preauth]
Dec 06 07:20:17 np0005548788.novalocal sshd[19079]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:20:18 np0005548788.novalocal sshd[19079]: Received disconnect from 151.36.70.253 port 31817:11: Bye Bye [preauth]
Dec 06 07:20:18 np0005548788.novalocal sshd[19079]: Disconnected from authenticating user root 151.36.70.253 port 31817 [preauth]
Dec 06 07:20:26 np0005548788.novalocal sshd[19081]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:20:26 np0005548788.novalocal sshd[19081]: Received disconnect from 45.55.249.98 port 53602:11: Bye Bye [preauth]
Dec 06 07:20:26 np0005548788.novalocal sshd[19081]: Disconnected from authenticating user root 45.55.249.98 port 53602 [preauth]
Dec 06 07:20:30 np0005548788.novalocal sshd[19083]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:20:32 np0005548788.novalocal sshd[19083]: Received disconnect from 103.52.114.250 port 56224:11: Bye Bye [preauth]
Dec 06 07:20:32 np0005548788.novalocal sshd[19083]: Disconnected from authenticating user root 103.52.114.250 port 56224 [preauth]
Dec 06 07:20:43 np0005548788.novalocal sshd[19086]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:20:46 np0005548788.novalocal sshd[19086]: Received disconnect from 102.140.97.134 port 39136:11: Bye Bye [preauth]
Dec 06 07:20:46 np0005548788.novalocal sshd[19086]: Disconnected from authenticating user root 102.140.97.134 port 39136 [preauth]
Dec 06 07:20:58 np0005548788.novalocal sshd[19088]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:21:01 np0005548788.novalocal sshd[19088]: error: kex_exchange_identification: banner line contains invalid characters
Dec 06 07:21:01 np0005548788.novalocal sshd[19088]: banner exchange: Connection from 37.187.50.141 port 47023: invalid format
Dec 06 07:21:04 np0005548788.novalocal sshd[19089]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:21:06 np0005548788.novalocal sshd[19089]: Received disconnect from 45.78.222.109 port 53542:11: Bye Bye [preauth]
Dec 06 07:21:06 np0005548788.novalocal sshd[19089]: Disconnected from authenticating user root 45.78.222.109 port 53542 [preauth]
Dec 06 07:21:25 np0005548788.novalocal sshd[19091]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:21:25 np0005548788.novalocal sshd[19091]: Received disconnect from 45.55.249.98 port 39632:11: Bye Bye [preauth]
Dec 06 07:21:25 np0005548788.novalocal sshd[19091]: Disconnected from authenticating user root 45.55.249.98 port 39632 [preauth]
Dec 06 07:22:05 np0005548788.novalocal sshd[19093]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:22:07 np0005548788.novalocal sshd[19093]: Received disconnect from 103.52.114.250 port 40326:11: Bye Bye [preauth]
Dec 06 07:22:07 np0005548788.novalocal sshd[19093]: Disconnected from authenticating user root 103.52.114.250 port 40326 [preauth]
Dec 06 07:22:13 np0005548788.novalocal sshd[19096]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:22:14 np0005548788.novalocal sshd[19096]: Received disconnect from 151.36.70.253 port 31753:11: Bye Bye [preauth]
Dec 06 07:22:14 np0005548788.novalocal sshd[19096]: Disconnected from authenticating user root 151.36.70.253 port 31753 [preauth]
Dec 06 07:22:22 np0005548788.novalocal sshd[19098]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:22:22 np0005548788.novalocal sshd[19098]: Received disconnect from 45.55.249.98 port 47128:11: Bye Bye [preauth]
Dec 06 07:22:22 np0005548788.novalocal sshd[19098]: Disconnected from authenticating user root 45.55.249.98 port 47128 [preauth]
Dec 06 07:23:06 np0005548788.novalocal sshd[19100]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:23:07 np0005548788.novalocal sshd[19100]: Received disconnect from 102.140.97.134 port 55652:11: Bye Bye [preauth]
Dec 06 07:23:07 np0005548788.novalocal sshd[19100]: Disconnected from authenticating user root 102.140.97.134 port 55652 [preauth]
Dec 06 07:23:17 np0005548788.novalocal sshd[19102]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:23:17 np0005548788.novalocal sshd[19102]: Received disconnect from 45.55.249.98 port 60416:11: Bye Bye [preauth]
Dec 06 07:23:17 np0005548788.novalocal sshd[19102]: Disconnected from authenticating user root 45.55.249.98 port 60416 [preauth]
Dec 06 07:23:31 np0005548788.novalocal sshd[19104]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:23:33 np0005548788.novalocal sshd[19104]: Received disconnect from 103.52.114.250 port 59878:11: Bye Bye [preauth]
Dec 06 07:23:33 np0005548788.novalocal sshd[19104]: Disconnected from authenticating user root 103.52.114.250 port 59878 [preauth]
Dec 06 07:23:45 np0005548788.novalocal sshd[19106]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:23:55 np0005548788.novalocal sshd[19106]: Received disconnect from 45.78.222.109 port 60196:11: Bye Bye [preauth]
Dec 06 07:23:55 np0005548788.novalocal sshd[19106]: Disconnected from authenticating user root 45.78.222.109 port 60196 [preauth]
Dec 06 07:24:14 np0005548788.novalocal sshd[19108]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:24:14 np0005548788.novalocal sshd[19108]: Received disconnect from 45.55.249.98 port 52020:11: Bye Bye [preauth]
Dec 06 07:24:14 np0005548788.novalocal sshd[19108]: Disconnected from authenticating user root 45.55.249.98 port 52020 [preauth]
Dec 06 07:24:58 np0005548788.novalocal sshd[19111]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:24:59 np0005548788.novalocal sshd[19111]: Received disconnect from 103.52.114.250 port 35776:11: Bye Bye [preauth]
Dec 06 07:24:59 np0005548788.novalocal sshd[19111]: Disconnected from authenticating user root 103.52.114.250 port 35776 [preauth]
Dec 06 07:25:11 np0005548788.novalocal sshd[19113]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:25:11 np0005548788.novalocal sshd[19113]: Received disconnect from 45.55.249.98 port 38986:11: Bye Bye [preauth]
Dec 06 07:25:11 np0005548788.novalocal sshd[19113]: Disconnected from authenticating user root 45.55.249.98 port 38986 [preauth]
Dec 06 07:25:26 np0005548788.novalocal sshd[19115]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:25:30 np0005548788.novalocal sshd[19115]: Received disconnect from 102.140.97.134 port 40316:11: Bye Bye [preauth]
Dec 06 07:25:30 np0005548788.novalocal sshd[19115]: Disconnected from authenticating user root 102.140.97.134 port 40316 [preauth]
Dec 06 07:26:10 np0005548788.novalocal sshd[19117]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:26:10 np0005548788.novalocal sshd[19117]: Received disconnect from 45.55.249.98 port 40576:11: Bye Bye [preauth]
Dec 06 07:26:10 np0005548788.novalocal sshd[19117]: Disconnected from authenticating user root 45.55.249.98 port 40576 [preauth]
Dec 06 07:26:17 np0005548788.novalocal sshd[19120]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:26:19 np0005548788.novalocal sshd[19122]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:26:19 np0005548788.novalocal sshd[19120]: Invalid user user1 from 150.95.85.24 port 51668
Dec 06 07:26:19 np0005548788.novalocal sshd[19120]: Received disconnect from 150.95.85.24 port 51668:11:  [preauth]
Dec 06 07:26:19 np0005548788.novalocal sshd[19120]: Disconnected from invalid user user1 150.95.85.24 port 51668 [preauth]
Dec 06 07:26:20 np0005548788.novalocal sshd[19122]: Received disconnect from 151.36.70.253 port 31544:11: Bye Bye [preauth]
Dec 06 07:26:20 np0005548788.novalocal sshd[19122]: Disconnected from authenticating user root 151.36.70.253 port 31544 [preauth]
Dec 06 07:26:25 np0005548788.novalocal sshd[19124]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:26:29 np0005548788.novalocal sshd[19124]: Received disconnect from 45.78.222.109 port 38174:11: Bye Bye [preauth]
Dec 06 07:26:29 np0005548788.novalocal sshd[19124]: Disconnected from authenticating user root 45.78.222.109 port 38174 [preauth]
Dec 06 07:26:39 np0005548788.novalocal sshd[19126]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:26:41 np0005548788.novalocal sshd[19126]: Received disconnect from 103.52.114.250 port 56400:11: Bye Bye [preauth]
Dec 06 07:26:41 np0005548788.novalocal sshd[19126]: Disconnected from authenticating user root 103.52.114.250 port 56400 [preauth]
Dec 06 07:27:11 np0005548788.novalocal sshd[19128]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:27:11 np0005548788.novalocal sshd[19128]: Received disconnect from 45.55.249.98 port 57370:11: Bye Bye [preauth]
Dec 06 07:27:11 np0005548788.novalocal sshd[19128]: Disconnected from authenticating user root 45.55.249.98 port 57370 [preauth]
Dec 06 07:27:19 np0005548788.novalocal sshd[19130]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:27:20 np0005548788.novalocal sshd[19130]: Received disconnect from 151.36.70.253 port 31399:11: Bye Bye [preauth]
Dec 06 07:27:20 np0005548788.novalocal sshd[19130]: Disconnected from authenticating user root 151.36.70.253 port 31399 [preauth]
Dec 06 07:27:42 np0005548788.novalocal sshd[19132]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:27:45 np0005548788.novalocal sshd[19132]: Received disconnect from 102.140.97.134 port 58652:11: Bye Bye [preauth]
Dec 06 07:27:45 np0005548788.novalocal sshd[19132]: Disconnected from authenticating user root 102.140.97.134 port 58652 [preauth]
Dec 06 07:28:12 np0005548788.novalocal sshd[19134]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:28:12 np0005548788.novalocal sshd[19134]: Received disconnect from 45.55.249.98 port 58030:11: Bye Bye [preauth]
Dec 06 07:28:12 np0005548788.novalocal sshd[19134]: Disconnected from authenticating user root 45.55.249.98 port 58030 [preauth]
Dec 06 07:28:21 np0005548788.novalocal sshd[19136]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:28:23 np0005548788.novalocal sshd[19136]: Received disconnect from 103.52.114.250 port 39660:11: Bye Bye [preauth]
Dec 06 07:28:23 np0005548788.novalocal sshd[19136]: Disconnected from authenticating user root 103.52.114.250 port 39660 [preauth]
Dec 06 07:28:58 np0005548788.novalocal sshd[19138]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:29:10 np0005548788.novalocal sshd[19140]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:29:10 np0005548788.novalocal sshd[19138]: Connection closed by 45.78.222.109 port 42454 [preauth]
Dec 06 07:29:10 np0005548788.novalocal sshd[19140]: Received disconnect from 45.55.249.98 port 59426:11: Bye Bye [preauth]
Dec 06 07:29:10 np0005548788.novalocal sshd[19140]: Disconnected from authenticating user root 45.55.249.98 port 59426 [preauth]
Dec 06 07:29:23 np0005548788.novalocal sshd[19143]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:29:24 np0005548788.novalocal sshd[19143]: Received disconnect from 151.36.70.253 port 31522:11: Bye Bye [preauth]
Dec 06 07:29:24 np0005548788.novalocal sshd[19143]: Disconnected from authenticating user root 151.36.70.253 port 31522 [preauth]
Dec 06 07:30:00 np0005548788.novalocal sshd[19145]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:30:06 np0005548788.novalocal sshd[19145]: Connection closed by 102.140.97.134 port 50970 [preauth]
Dec 06 07:30:08 np0005548788.novalocal sshd[19148]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:30:09 np0005548788.novalocal sshd[19148]: Received disconnect from 103.52.114.250 port 43572:11: Bye Bye [preauth]
Dec 06 07:30:09 np0005548788.novalocal sshd[19148]: Disconnected from authenticating user root 103.52.114.250 port 43572 [preauth]
Dec 06 07:30:10 np0005548788.novalocal sshd[19150]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:30:11 np0005548788.novalocal sshd[19150]: Received disconnect from 45.55.249.98 port 43642:11: Bye Bye [preauth]
Dec 06 07:30:11 np0005548788.novalocal sshd[19150]: Disconnected from authenticating user root 45.55.249.98 port 43642 [preauth]
Dec 06 07:30:24 np0005548788.novalocal sshd[19153]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:30:25 np0005548788.novalocal sshd[19153]: Received disconnect from 151.36.70.253 port 31827:11: Bye Bye [preauth]
Dec 06 07:30:25 np0005548788.novalocal sshd[19153]: Disconnected from authenticating user root 151.36.70.253 port 31827 [preauth]
Dec 06 07:30:48 np0005548788.novalocal sshd[19155]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:30:57 np0005548788.novalocal sshd[19155]: Invalid user NL5xUDpV2xRa from 37.187.50.141 port 36531
Dec 06 07:30:57 np0005548788.novalocal sshd[19155]: fatal: userauth_pubkey: parse packet: incomplete message [preauth]
Dec 06 07:31:13 np0005548788.novalocal sshd[19157]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:13 np0005548788.novalocal sshd[19157]: Received disconnect from 45.55.249.98 port 60464:11: Bye Bye [preauth]
Dec 06 07:31:13 np0005548788.novalocal sshd[19157]: Disconnected from authenticating user root 45.55.249.98 port 60464 [preauth]
Dec 06 07:31:27 np0005548788.novalocal sshd[19159]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:28 np0005548788.novalocal sshd[19159]: Received disconnect from 151.36.70.253 port 31325:11: Bye Bye [preauth]
Dec 06 07:31:28 np0005548788.novalocal sshd[19159]: Disconnected from authenticating user root 151.36.70.253 port 31325 [preauth]
Dec 06 07:31:31 np0005548788.novalocal sshd[19162]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:31 np0005548788.novalocal sshd[19162]: Accepted publickey for zuul from 38.102.83.114 port 49568 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:31:31 np0005548788.novalocal systemd-logind[765]: New session 10 of user zuul.
Dec 06 07:31:31 np0005548788.novalocal systemd[1]: Started Session 10 of User zuul.
Dec 06 07:31:31 np0005548788.novalocal sshd[19162]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:31:32 np0005548788.novalocal python3[19179]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-1bf3-6840-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:31:33 np0005548788.novalocal sudo[19197]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgcyuwnvffgzmjrohkfokviztwgzycis ; /usr/bin/python3
Dec 06 07:31:33 np0005548788.novalocal sudo[19197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:31:33 np0005548788.novalocal python3[19199]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-1bf3-6840-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:31:36 np0005548788.novalocal sudo[19197]: pam_unix(sudo:session): session closed for user root
Dec 06 07:31:36 np0005548788.novalocal sshd[19203]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:38 np0005548788.novalocal sudo[19218]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nytqqguxoebuweuqvjblfefrxiygflql ; /usr/bin/python3
Dec 06 07:31:38 np0005548788.novalocal sudo[19218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:31:38 np0005548788.novalocal python3[19220]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Dec 06 07:31:40 np0005548788.novalocal sshd[19203]: Received disconnect from 45.78.222.109 port 44146:11: Bye Bye [preauth]
Dec 06 07:31:40 np0005548788.novalocal sshd[19203]: Disconnected from authenticating user root 45.78.222.109 port 44146 [preauth]
Dec 06 07:31:42 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:31:42 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:00 np0005548788.novalocal sshd[19358]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:32:02 np0005548788.novalocal sshd[19358]: Received disconnect from 103.52.114.250 port 56940:11: Bye Bye [preauth]
Dec 06 07:32:02 np0005548788.novalocal sshd[19358]: Disconnected from authenticating user root 103.52.114.250 port 56940 [preauth]
Dec 06 07:32:08 np0005548788.novalocal sudo[19218]: pam_unix(sudo:session): session closed for user root
Dec 06 07:32:16 np0005548788.novalocal sshd[19364]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:32:16 np0005548788.novalocal sshd[19364]: Received disconnect from 45.55.249.98 port 42688:11: Bye Bye [preauth]
Dec 06 07:32:16 np0005548788.novalocal sshd[19364]: Disconnected from authenticating user root 45.55.249.98 port 42688 [preauth]
Dec 06 07:32:21 np0005548788.novalocal sshd[19366]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:32:24 np0005548788.novalocal sshd[19366]: Received disconnect from 102.140.97.134 port 43658:11: Bye Bye [preauth]
Dec 06 07:32:24 np0005548788.novalocal sshd[19366]: Disconnected from authenticating user root 102.140.97.134 port 43658 [preauth]
Dec 06 07:32:36 np0005548788.novalocal sudo[19381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwcaqrznexbnxxpcsvgdcnsppstgijml ; /usr/bin/python3
Dec 06 07:32:36 np0005548788.novalocal sudo[19381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:32:36 np0005548788.novalocal python3[19383]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Dec 06 07:32:39 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:39 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:42 np0005548788.novalocal sudo[19381]: pam_unix(sudo:session): session closed for user root
Dec 06 07:32:47 np0005548788.novalocal sudo[19581]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndkvdcaladjsxmzjryvfkavxfkkbebwo ; /usr/bin/python3
Dec 06 07:32:47 np0005548788.novalocal sudo[19581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:32:48 np0005548788.novalocal python3[19583]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Dec 06 07:32:51 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:51 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:56 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:56 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:03 np0005548788.novalocal sudo[19581]: pam_unix(sudo:session): session closed for user root
Dec 06 07:33:21 np0005548788.novalocal sudo[19916]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyfufnzvpmgkrehrhxiotynwaxvthkre ; /usr/bin/python3
Dec 06 07:33:21 np0005548788.novalocal sudo[19916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:33:21 np0005548788.novalocal python3[19918]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 06 07:33:23 np0005548788.novalocal sshd[19921]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:33:23 np0005548788.novalocal sshd[19921]: Received disconnect from 45.55.249.98 port 39142:11: Bye Bye [preauth]
Dec 06 07:33:23 np0005548788.novalocal sshd[19921]: Disconnected from authenticating user root 45.55.249.98 port 39142 [preauth]
Dec 06 07:33:24 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:24 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:29 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:36 np0005548788.novalocal sudo[19916]: pam_unix(sudo:session): session closed for user root
Dec 06 07:33:44 np0005548788.novalocal sshd[20240]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:33:46 np0005548788.novalocal sshd[20240]: Received disconnect from 103.52.114.250 port 48098:11: Bye Bye [preauth]
Dec 06 07:33:46 np0005548788.novalocal sshd[20240]: Disconnected from authenticating user root 103.52.114.250 port 48098 [preauth]
Dec 06 07:33:51 np0005548788.novalocal sudo[20255]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erigucrkahgcmccvfyvzsfvztgrwheom ; /usr/bin/python3
Dec 06 07:33:51 np0005548788.novalocal sudo[20255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:33:52 np0005548788.novalocal python3[20257]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 06 07:33:55 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:55 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:00 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:01 np0005548788.novalocal anacron[6181]: Job `cron.weekly' started
Dec 06 07:34:01 np0005548788.novalocal anacron[6181]: Job `cron.weekly' terminated
Dec 06 07:34:07 np0005548788.novalocal sudo[20255]: pam_unix(sudo:session): session closed for user root
Dec 06 07:34:24 np0005548788.novalocal sudo[20535]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckukoykznwwuhhscixtsxarkrjotmoxt ; /usr/bin/python3
Dec 06 07:34:24 np0005548788.novalocal sudo[20535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:34:24 np0005548788.novalocal python3[20537]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-000000000013-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:34:26 np0005548788.novalocal sudo[20535]: pam_unix(sudo:session): session closed for user root
Dec 06 07:34:26 np0005548788.novalocal sshd[20541]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:34:26 np0005548788.novalocal sshd[20541]: Received disconnect from 45.55.249.98 port 41754:11: Bye Bye [preauth]
Dec 06 07:34:26 np0005548788.novalocal sshd[20541]: Disconnected from authenticating user root 45.55.249.98 port 41754 [preauth]
Dec 06 07:34:27 np0005548788.novalocal sshd[20543]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:34:28 np0005548788.novalocal sshd[20543]: Received disconnect from 151.36.70.253 port 31851:11: Bye Bye [preauth]
Dec 06 07:34:28 np0005548788.novalocal sshd[20543]: Disconnected from authenticating user root 151.36.70.253 port 31851 [preauth]
Dec 06 07:34:29 np0005548788.novalocal sudo[20558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjazpicqjohsgkjopckhjjuqdpvipnsf ; /usr/bin/python3
Dec 06 07:34:29 np0005548788.novalocal sudo[20558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:34:29 np0005548788.novalocal python3[20560]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:34:40 np0005548788.novalocal sshd[20641]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:34:42 np0005548788.novalocal groupadd[20648]: group added to /etc/group: name=unbound, GID=987
Dec 06 07:34:42 np0005548788.novalocal groupadd[20648]: group added to /etc/gshadow: name=unbound
Dec 06 07:34:42 np0005548788.novalocal groupadd[20648]: new group: name=unbound, GID=987
Dec 06 07:34:42 np0005548788.novalocal useradd[20655]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Dec 06 07:34:42 np0005548788.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 06 07:34:48 np0005548788.novalocal sshd[20641]: Received disconnect from 102.140.97.134 port 48724:11: Bye Bye [preauth]
Dec 06 07:34:48 np0005548788.novalocal sshd[20641]: Disconnected from 102.140.97.134 port 48724 [preauth]
Dec 06 07:34:52 np0005548788.novalocal kernel: SELinux:  Converting 501 SID table entries...
Dec 06 07:34:52 np0005548788.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 07:34:52 np0005548788.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 06 07:34:52 np0005548788.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 07:34:52 np0005548788.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 06 07:34:52 np0005548788.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 07:34:52 np0005548788.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 07:34:52 np0005548788.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 07:34:52 np0005548788.novalocal groupadd[20691]: group added to /etc/group: name=openvswitch, GID=986
Dec 06 07:34:52 np0005548788.novalocal groupadd[20691]: group added to /etc/gshadow: name=openvswitch
Dec 06 07:34:52 np0005548788.novalocal groupadd[20691]: new group: name=openvswitch, GID=986
Dec 06 07:34:52 np0005548788.novalocal useradd[20698]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Dec 06 07:34:52 np0005548788.novalocal groupadd[20706]: group added to /etc/group: name=hugetlbfs, GID=985
Dec 06 07:34:52 np0005548788.novalocal groupadd[20706]: group added to /etc/gshadow: name=hugetlbfs
Dec 06 07:34:52 np0005548788.novalocal groupadd[20706]: new group: name=hugetlbfs, GID=985
Dec 06 07:34:52 np0005548788.novalocal usermod[20714]: add 'openvswitch' to group 'hugetlbfs'
Dec 06 07:34:52 np0005548788.novalocal usermod[20714]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 06 07:34:54 np0005548788.novalocal dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Dec 06 07:34:54 np0005548788.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 07:34:54 np0005548788.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 06 07:34:54 np0005548788.novalocal systemd[1]: Reloading.
Dec 06 07:34:54 np0005548788.novalocal systemd-sysv-generator[21212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:34:54 np0005548788.novalocal systemd-rc-local-generator[21207]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:34:55 np0005548788.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:34:55 np0005548788.novalocal systemd[1]: Starting dnf makecache...
Dec 06 07:34:55 np0005548788.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 07:34:55 np0005548788.novalocal dnf[21404]: Updating Subscription Management repositories.
Dec 06 07:34:55 np0005548788.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 07:34:55 np0005548788.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 06 07:34:55 np0005548788.novalocal systemd[1]: run-r184d99a9e0da410096994957f92aee69.service: Deactivated successfully.
Dec 06 07:34:56 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:56 np0005548788.novalocal rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:56 np0005548788.novalocal sudo[20558]: pam_unix(sudo:session): session closed for user root
Dec 06 07:34:57 np0005548788.novalocal dnf[21404]: Failed determining last makecache time.
Dec 06 07:34:57 np0005548788.novalocal dnf[21404]: Fast Datapath for RHEL 9 x86_64 (RPMs)           30 kB/s | 4.0 kB     00:00
Dec 06 07:34:57 np0005548788.novalocal dnf[21404]: Red Hat Enterprise Linux 9 for x86_64 - High Av  28 kB/s | 4.0 kB     00:00
Dec 06 07:34:57 np0005548788.novalocal dnf[21404]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   30 kB/s | 4.1 kB     00:00
Dec 06 07:34:57 np0005548788.novalocal dnf[21404]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  34 kB/s | 4.5 kB     00:00
Dec 06 07:34:58 np0005548788.novalocal dnf[21404]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   26 kB/s | 4.1 kB     00:00
Dec 06 07:34:58 np0005548788.novalocal dnf[21404]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  32 kB/s | 4.5 kB     00:00
Dec 06 07:34:58 np0005548788.novalocal dnf[21404]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_  31 kB/s | 4.0 kB     00:00
Dec 06 07:34:58 np0005548788.novalocal dnf[21404]: Metadata cache created.
Dec 06 07:34:58 np0005548788.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 06 07:34:58 np0005548788.novalocal systemd[1]: Finished dnf makecache.
Dec 06 07:34:58 np0005548788.novalocal systemd[1]: dnf-makecache.service: Consumed 2.921s CPU time.
Dec 06 07:35:22 np0005548788.novalocal sudo[21777]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpjicvjkdpwlxsffucseucaeditgdqlx ; /usr/bin/python3
Dec 06 07:35:22 np0005548788.novalocal sudo[21777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:22 np0005548788.novalocal python3[21779]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-000000000015-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:35:24 np0005548788.novalocal sshd[21783]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:35:24 np0005548788.novalocal sshd[21783]: Received disconnect from 45.55.249.98 port 35120:11: Bye Bye [preauth]
Dec 06 07:35:24 np0005548788.novalocal sshd[21783]: Disconnected from authenticating user root 45.55.249.98 port 35120 [preauth]
Dec 06 07:35:25 np0005548788.novalocal sshd[21785]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:35:26 np0005548788.novalocal sshd[21787]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:35:27 np0005548788.novalocal sshd[21787]: Received disconnect from 151.36.70.253 port 31507:11: Bye Bye [preauth]
Dec 06 07:35:27 np0005548788.novalocal sshd[21787]: Disconnected from authenticating user root 151.36.70.253 port 31507 [preauth]
Dec 06 07:35:27 np0005548788.novalocal sshd[21785]: Received disconnect from 103.52.114.250 port 45934:11: Bye Bye [preauth]
Dec 06 07:35:27 np0005548788.novalocal sshd[21785]: Disconnected from authenticating user root 103.52.114.250 port 45934 [preauth]
Dec 06 07:35:38 np0005548788.novalocal sudo[21777]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:53 np0005548788.novalocal sudo[21803]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgqnjaspmwxhkgqfihbdxixxwdzclzmd ; /usr/bin/python3
Dec 06 07:35:53 np0005548788.novalocal sudo[21803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:53 np0005548788.novalocal python3[21805]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:35:53 np0005548788.novalocal sudo[21803]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:54 np0005548788.novalocal sudo[21851]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vofuyxrrrbtuuzgbrcpeqbgkoyxslbkn ; /usr/bin/python3
Dec 06 07:35:54 np0005548788.novalocal sudo[21851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:54 np0005548788.novalocal python3[21853]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:35:54 np0005548788.novalocal sudo[21851]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:54 np0005548788.novalocal sudo[21894]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsfrcsqgbwpfxelylfurrcqxuzuxjewv ; /usr/bin/python3
Dec 06 07:35:54 np0005548788.novalocal sudo[21894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:54 np0005548788.novalocal python3[21896]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765006553.9564602-291-280620892015955/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=3358dfc6c6ce646155135d0cad900026cb34ba08 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:35:54 np0005548788.novalocal sudo[21894]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:55 np0005548788.novalocal sudo[21924]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqrdjbkchginmtyuvesjzkyytkkikpni ; /usr/bin/python3
Dec 06 07:35:55 np0005548788.novalocal sudo[21924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:56 np0005548788.novalocal python3[21926]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:56 np0005548788.novalocal sudo[21924]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:56 np0005548788.novalocal systemd-journald[618]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Dec 06 07:35:56 np0005548788.novalocal systemd-journald[618]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 07:35:56 np0005548788.novalocal rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 07:35:56 np0005548788.novalocal rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 07:35:56 np0005548788.novalocal sudo[21945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vizzhhwpjeyvknofcfebohncgukzsdwv ; /usr/bin/python3
Dec 06 07:35:56 np0005548788.novalocal sudo[21945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:56 np0005548788.novalocal python3[21947]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:56 np0005548788.novalocal sudo[21945]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:56 np0005548788.novalocal sudo[21965]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlqrqzgzaftiwjdhjfyculjpfbtfhlcv ; /usr/bin/python3
Dec 06 07:35:56 np0005548788.novalocal sudo[21965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:56 np0005548788.novalocal python3[21967]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:56 np0005548788.novalocal sudo[21965]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:56 np0005548788.novalocal sudo[21985]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqbcwarimegmtfqzuczehkuplotgkvlg ; /usr/bin/python3
Dec 06 07:35:56 np0005548788.novalocal sudo[21985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:57 np0005548788.novalocal python3[21987]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:57 np0005548788.novalocal sudo[21985]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:57 np0005548788.novalocal sudo[22005]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfoiqhcjsinrrqkckbxaespgtevfivgq ; /usr/bin/python3
Dec 06 07:35:57 np0005548788.novalocal sudo[22005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:57 np0005548788.novalocal python3[22007]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:57 np0005548788.novalocal sudo[22005]: pam_unix(sudo:session): session closed for user root
Dec 06 07:36:00 np0005548788.novalocal sudo[22025]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioitjunlsivofupltgebgvxfrskswcpj ; /usr/bin/python3
Dec 06 07:36:00 np0005548788.novalocal sudo[22025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:36:00 np0005548788.novalocal python3[22027]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 07:36:00 np0005548788.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Dec 06 07:36:00 np0005548788.novalocal network[22030]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:00 np0005548788.novalocal network[22041]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:00 np0005548788.novalocal network[22030]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:00 np0005548788.novalocal network[22042]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:00 np0005548788.novalocal network[22030]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 07:36:00 np0005548788.novalocal network[22043]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 07:36:00 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006560.6530] audit: op="connections-reload" pid=22071 uid=0 result="success"
Dec 06 07:36:00 np0005548788.novalocal network[22030]: Bringing up loopback interface:  [  OK  ]
Dec 06 07:36:00 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006560.8495] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22159 uid=0 result="success"
Dec 06 07:36:00 np0005548788.novalocal network[22030]: Bringing up interface eth0:  [  OK  ]
Dec 06 07:36:00 np0005548788.novalocal systemd[1]: Started LSB: Bring up/down networking.
Dec 06 07:36:00 np0005548788.novalocal sudo[22025]: pam_unix(sudo:session): session closed for user root
Dec 06 07:36:01 np0005548788.novalocal sudo[22198]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zodjmowxnswzcohjuiiqcwedcfopbvqk ; /usr/bin/python3
Dec 06 07:36:01 np0005548788.novalocal sudo[22198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:36:01 np0005548788.novalocal python3[22200]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 07:36:01 np0005548788.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Dec 06 07:36:01 np0005548788.novalocal chown[22204]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 06 07:36:01 np0005548788.novalocal ovs-ctl[22209]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 06 07:36:01 np0005548788.novalocal ovs-ctl[22209]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 06 07:36:01 np0005548788.novalocal ovs-ctl[22209]: Starting ovsdb-server [  OK  ]
Dec 06 07:36:01 np0005548788.novalocal ovs-vsctl[22258]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 06 07:36:01 np0005548788.novalocal ovs-vsctl[22278]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"61ffd9e7-81c6-44c4-94c0-846d9931f97c\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Dec 06 07:36:01 np0005548788.novalocal ovs-ctl[22209]: Configuring Open vSwitch system IDs [  OK  ]
Dec 06 07:36:01 np0005548788.novalocal ovs-ctl[22209]: Enabling remote OVSDB managers [  OK  ]
Dec 06 07:36:01 np0005548788.novalocal systemd[1]: Started Open vSwitch Database Unit.
Dec 06 07:36:01 np0005548788.novalocal ovs-vsctl[22284]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005548788.novalocal
Dec 06 07:36:01 np0005548788.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 06 07:36:01 np0005548788.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 06 07:36:01 np0005548788.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 06 07:36:01 np0005548788.novalocal kernel: openvswitch: Open vSwitch switching datapath
Dec 06 07:36:01 np0005548788.novalocal ovs-ctl[22328]: Inserting openvswitch module [  OK  ]
Dec 06 07:36:01 np0005548788.novalocal ovs-ctl[22297]: Starting ovs-vswitchd [  OK  ]
Dec 06 07:36:01 np0005548788.novalocal ovs-vsctl[22345]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005548788.novalocal
Dec 06 07:36:01 np0005548788.novalocal ovs-ctl[22297]: Enabling remote OVSDB managers [  OK  ]
Dec 06 07:36:01 np0005548788.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 06 07:36:01 np0005548788.novalocal systemd[1]: Starting Open vSwitch...
Dec 06 07:36:01 np0005548788.novalocal systemd[1]: Finished Open vSwitch.
Dec 06 07:36:02 np0005548788.novalocal sudo[22198]: pam_unix(sudo:session): session closed for user root
Dec 06 07:36:05 np0005548788.novalocal sudo[22362]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcvlmofnhimiiwxcnzpydpxnqdmlkevq ; /usr/bin/python3
Dec 06 07:36:05 np0005548788.novalocal sudo[22362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:36:05 np0005548788.novalocal python3[22364]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-00000000001a-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:36:06 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006566.3070] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22522 uid=0 result="success"
Dec 06 07:36:06 np0005548788.novalocal ifup[22523]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:06 np0005548788.novalocal ifup[22524]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:06 np0005548788.novalocal ifup[22525]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:06 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006566.3327] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22531 uid=0 result="success"
Dec 06 07:36:06 np0005548788.novalocal ovs-vsctl[22533]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:0d:33:99 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Dec 06 07:36:06 np0005548788.novalocal kernel: device ovs-system entered promiscuous mode
Dec 06 07:36:06 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006566.3578] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Dec 06 07:36:06 np0005548788.novalocal systemd-udevd[22535]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:06 np0005548788.novalocal kernel: Timeout policy base is empty
Dec 06 07:36:06 np0005548788.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Dec 06 07:36:06 np0005548788.novalocal systemd-udevd[22548]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:06 np0005548788.novalocal kernel: device br-ex entered promiscuous mode
Dec 06 07:36:06 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006566.4037] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Dec 06 07:36:06 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006566.4288] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22560 uid=0 result="success"
Dec 06 07:36:06 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006566.4435] device (br-ex): carrier: link connected
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.4912] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22589 uid=0 result="success"
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.5374] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22604 uid=0 result="success"
Dec 06 07:36:09 np0005548788.novalocal NET[22629]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.6288] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.6404] dhcp4 (eth1): canceled DHCP transaction
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.6404] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.6404] dhcp4 (eth1): state changed no lease
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.6438] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22638 uid=0 result="success"
Dec 06 07:36:09 np0005548788.novalocal ifup[22639]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:09 np0005548788.novalocal ifup[22640]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:09 np0005548788.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 07:36:09 np0005548788.novalocal ifup[22642]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:09 np0005548788.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.6858] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22656 uid=0 result="success"
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.7905] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22666 uid=0 result="success"
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.7985] device (eth1): carrier: link connected
Dec 06 07:36:09 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006569.8226] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22675 uid=0 result="success"
Dec 06 07:36:09 np0005548788.novalocal ipv6_wait_tentative[22687]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Dec 06 07:36:10 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006570.8775] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22695 uid=0 result="success"
Dec 06 07:36:10 np0005548788.novalocal ovs-vsctl[22710]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Dec 06 07:36:10 np0005548788.novalocal kernel: device eth1 entered promiscuous mode
Dec 06 07:36:10 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006570.9970] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22718 uid=0 result="success"
Dec 06 07:36:11 np0005548788.novalocal ifup[22719]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:11 np0005548788.novalocal ifup[22720]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:11 np0005548788.novalocal ifup[22721]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:11 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006571.0276] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22727 uid=0 result="success"
Dec 06 07:36:11 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006571.0704] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22737 uid=0 result="success"
Dec 06 07:36:11 np0005548788.novalocal ifup[22738]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:11 np0005548788.novalocal ifup[22739]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:11 np0005548788.novalocal ifup[22740]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:11 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006571.1017] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22746 uid=0 result="success"
Dec 06 07:36:11 np0005548788.novalocal ovs-vsctl[22749]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Dec 06 07:36:11 np0005548788.novalocal kernel: device vlan44 entered promiscuous mode
Dec 06 07:36:11 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006571.1431] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Dec 06 07:36:11 np0005548788.novalocal systemd-udevd[22751]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:11 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006571.1696] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22760 uid=0 result="success"
Dec 06 07:36:11 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006571.1897] device (vlan44): carrier: link connected
Dec 06 07:36:14 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006574.2466] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22790 uid=0 result="success"
Dec 06 07:36:14 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006574.2957] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22805 uid=0 result="success"
Dec 06 07:36:14 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006574.3555] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22826 uid=0 result="success"
Dec 06 07:36:14 np0005548788.novalocal ifup[22827]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:14 np0005548788.novalocal ifup[22828]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:14 np0005548788.novalocal ifup[22829]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:14 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006574.3877] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22835 uid=0 result="success"
Dec 06 07:36:14 np0005548788.novalocal ovs-vsctl[22838]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Dec 06 07:36:14 np0005548788.novalocal systemd-udevd[22840]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:14 np0005548788.novalocal kernel: device vlan21 entered promiscuous mode
Dec 06 07:36:14 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006574.4597] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Dec 06 07:36:14 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006574.4846] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22850 uid=0 result="success"
Dec 06 07:36:14 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006574.5057] device (vlan21): carrier: link connected
Dec 06 07:36:17 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006577.5572] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22880 uid=0 result="success"
Dec 06 07:36:17 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006577.6019] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22895 uid=0 result="success"
Dec 06 07:36:17 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006577.6615] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22916 uid=0 result="success"
Dec 06 07:36:17 np0005548788.novalocal ifup[22917]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:17 np0005548788.novalocal ifup[22918]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:17 np0005548788.novalocal ifup[22919]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:17 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006577.6925] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22925 uid=0 result="success"
Dec 06 07:36:17 np0005548788.novalocal ovs-vsctl[22928]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 06 07:36:17 np0005548788.novalocal kernel: device vlan22 entered promiscuous mode
Dec 06 07:36:17 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006577.7488] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Dec 06 07:36:17 np0005548788.novalocal systemd-udevd[22930]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:17 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006577.7750] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22940 uid=0 result="success"
Dec 06 07:36:17 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006577.7962] device (vlan22): carrier: link connected
Dec 06 07:36:19 np0005548788.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 07:36:19 np0005548788.novalocal sshd[22959]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:36:19 np0005548788.novalocal sshd[22959]: Received disconnect from 45.55.249.98 port 41776:11: Bye Bye [preauth]
Dec 06 07:36:19 np0005548788.novalocal sshd[22959]: Disconnected from authenticating user root 45.55.249.98 port 41776 [preauth]
Dec 06 07:36:20 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006580.8479] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22972 uid=0 result="success"
Dec 06 07:36:20 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006580.8943] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22987 uid=0 result="success"
Dec 06 07:36:20 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006580.9566] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23008 uid=0 result="success"
Dec 06 07:36:20 np0005548788.novalocal ifup[23009]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:20 np0005548788.novalocal ifup[23010]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:20 np0005548788.novalocal ifup[23011]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:20 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006580.9894] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23017 uid=0 result="success"
Dec 06 07:36:21 np0005548788.novalocal ovs-vsctl[23020]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Dec 06 07:36:21 np0005548788.novalocal kernel: device vlan23 entered promiscuous mode
Dec 06 07:36:21 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006581.0791] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Dec 06 07:36:21 np0005548788.novalocal systemd-udevd[23022]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:21 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006581.1025] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23032 uid=0 result="success"
Dec 06 07:36:21 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006581.1255] device (vlan23): carrier: link connected
Dec 06 07:36:24 np0005548788.novalocal sshd[23053]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:36:24 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006584.1729] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23064 uid=0 result="success"
Dec 06 07:36:24 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006584.2160] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23079 uid=0 result="success"
Dec 06 07:36:24 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006584.2791] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23100 uid=0 result="success"
Dec 06 07:36:24 np0005548788.novalocal ifup[23101]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:24 np0005548788.novalocal ifup[23102]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:24 np0005548788.novalocal ifup[23103]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:24 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006584.3114] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23109 uid=0 result="success"
Dec 06 07:36:24 np0005548788.novalocal ovs-vsctl[23112]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Dec 06 07:36:24 np0005548788.novalocal kernel: device vlan20 entered promiscuous mode
Dec 06 07:36:24 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006584.3426] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Dec 06 07:36:24 np0005548788.novalocal systemd-udevd[23115]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:24 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006584.3614] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23124 uid=0 result="success"
Dec 06 07:36:24 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006584.3757] device (vlan20): carrier: link connected
Dec 06 07:36:25 np0005548788.novalocal sshd[23053]: Received disconnect from 151.36.70.253 port 31098:11: Bye Bye [preauth]
Dec 06 07:36:25 np0005548788.novalocal sshd[23053]: Disconnected from authenticating user root 151.36.70.253 port 31098 [preauth]
Dec 06 07:36:27 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006587.4237] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23154 uid=0 result="success"
Dec 06 07:36:27 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006587.4715] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23169 uid=0 result="success"
Dec 06 07:36:27 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006587.5380] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23190 uid=0 result="success"
Dec 06 07:36:27 np0005548788.novalocal ifup[23191]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:27 np0005548788.novalocal ifup[23192]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:27 np0005548788.novalocal ifup[23193]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:27 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006587.5726] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23199 uid=0 result="success"
Dec 06 07:36:27 np0005548788.novalocal ovs-vsctl[23202]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Dec 06 07:36:27 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006587.6306] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23209 uid=0 result="success"
Dec 06 07:36:28 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006588.6892] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23236 uid=0 result="success"
Dec 06 07:36:28 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006588.7352] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23251 uid=0 result="success"
Dec 06 07:36:28 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006588.7950] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23272 uid=0 result="success"
Dec 06 07:36:28 np0005548788.novalocal ifup[23273]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:28 np0005548788.novalocal ifup[23274]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:28 np0005548788.novalocal ifup[23275]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:28 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006588.8274] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23281 uid=0 result="success"
Dec 06 07:36:28 np0005548788.novalocal ovs-vsctl[23284]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Dec 06 07:36:28 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006588.8849] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23291 uid=0 result="success"
Dec 06 07:36:29 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006589.9438] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23319 uid=0 result="success"
Dec 06 07:36:29 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006589.9913] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23334 uid=0 result="success"
Dec 06 07:36:30 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006590.0491] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23355 uid=0 result="success"
Dec 06 07:36:30 np0005548788.novalocal ifup[23356]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:30 np0005548788.novalocal ifup[23357]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:30 np0005548788.novalocal ifup[23358]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:30 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006590.0825] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23364 uid=0 result="success"
Dec 06 07:36:30 np0005548788.novalocal ovs-vsctl[23367]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Dec 06 07:36:30 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006590.1372] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23374 uid=0 result="success"
Dec 06 07:36:31 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006591.1945] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23402 uid=0 result="success"
Dec 06 07:36:31 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006591.2407] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23417 uid=0 result="success"
Dec 06 07:36:31 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006591.2971] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23438 uid=0 result="success"
Dec 06 07:36:31 np0005548788.novalocal ifup[23439]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:31 np0005548788.novalocal ifup[23440]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:31 np0005548788.novalocal ifup[23441]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:31 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006591.3278] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23447 uid=0 result="success"
Dec 06 07:36:31 np0005548788.novalocal ovs-vsctl[23450]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Dec 06 07:36:31 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006591.3843] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23457 uid=0 result="success"
Dec 06 07:36:32 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006592.4348] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23485 uid=0 result="success"
Dec 06 07:36:32 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006592.4862] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23500 uid=0 result="success"
Dec 06 07:36:32 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006592.5358] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23521 uid=0 result="success"
Dec 06 07:36:32 np0005548788.novalocal ifup[23522]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:32 np0005548788.novalocal ifup[23523]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:32 np0005548788.novalocal ifup[23524]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:32 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006592.5632] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23530 uid=0 result="success"
Dec 06 07:36:32 np0005548788.novalocal ovs-vsctl[23533]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 06 07:36:32 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006592.6130] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23540 uid=0 result="success"
Dec 06 07:36:33 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006593.6656] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23568 uid=0 result="success"
Dec 06 07:36:33 np0005548788.novalocal NetworkManager[5968]: <info>  [1765006593.7131] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23583 uid=0 result="success"
Dec 06 07:36:33 np0005548788.novalocal sudo[22362]: pam_unix(sudo:session): session closed for user root
Dec 06 07:36:47 np0005548788.novalocal sshd[23601]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:36:53 np0005548788.novalocal sshd[23601]: Received disconnect from 45.78.222.109 port 43320:11: Bye Bye [preauth]
Dec 06 07:36:53 np0005548788.novalocal sshd[23601]: Disconnected from authenticating user root 45.78.222.109 port 43320 [preauth]
Dec 06 07:37:00 np0005548788.novalocal sshd[23603]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:03 np0005548788.novalocal sshd[23603]: Received disconnect from 102.140.97.134 port 60214:11: Bye Bye [preauth]
Dec 06 07:37:03 np0005548788.novalocal sshd[23603]: Disconnected from authenticating user root 102.140.97.134 port 60214 [preauth]
Dec 06 07:37:05 np0005548788.novalocal sshd[23605]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:07 np0005548788.novalocal sshd[23605]: Received disconnect from 103.52.114.250 port 54874:11: Bye Bye [preauth]
Dec 06 07:37:07 np0005548788.novalocal sshd[23605]: Disconnected from authenticating user root 103.52.114.250 port 54874 [preauth]
Dec 06 07:37:16 np0005548788.novalocal sshd[23607]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:16 np0005548788.novalocal sshd[23607]: Received disconnect from 45.55.249.98 port 45526:11: Bye Bye [preauth]
Dec 06 07:37:16 np0005548788.novalocal sshd[23607]: Disconnected from authenticating user root 45.55.249.98 port 45526 [preauth]
Dec 06 07:37:27 np0005548788.novalocal python3[23623]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-00000000001b-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:37:33 np0005548788.novalocal python3[23642]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:37:34 np0005548788.novalocal sudo[23656]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdikgxwckbjkrdcraxjoaoarxdqbxcbo ; /usr/bin/python3
Dec 06 07:37:34 np0005548788.novalocal sudo[23656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:37:34 np0005548788.novalocal python3[23658]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:37:34 np0005548788.novalocal sudo[23656]: pam_unix(sudo:session): session closed for user root
Dec 06 07:37:35 np0005548788.novalocal python3[23672]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:37:35 np0005548788.novalocal sudo[23686]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-varfyhgzauelvvyryxjjqmdxhyqcaosu ; /usr/bin/python3
Dec 06 07:37:35 np0005548788.novalocal sudo[23686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:37:36 np0005548788.novalocal python3[23688]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:37:36 np0005548788.novalocal sudo[23686]: pam_unix(sudo:session): session closed for user root
Dec 06 07:37:37 np0005548788.novalocal python3[23702]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Dec 06 07:37:37 np0005548788.novalocal python3[23717]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005548788.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-000000000022-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:37:38 np0005548788.novalocal sudo[23735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egpubjickvhunqgnzpsvbnosxqzjdhrr ; /usr/bin/python3
Dec 06 07:37:38 np0005548788.novalocal sudo[23735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:37:38 np0005548788.novalocal python3[23737]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:37:38 np0005548788.novalocal systemd[1]: Starting Hostname Service...
Dec 06 07:37:38 np0005548788.novalocal systemd[1]: Started Hostname Service.
Dec 06 07:37:38 np0005548788.localdomain systemd-hostnamed[23741]: Hostname set to <np0005548788.localdomain> (static)
Dec 06 07:37:38 np0005548788.localdomain NetworkManager[5968]: <info>  [1765006658.7094] hostname: static hostname changed from "np0005548788.novalocal" to "np0005548788.localdomain"
Dec 06 07:37:38 np0005548788.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 07:37:38 np0005548788.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 07:37:38 np0005548788.localdomain sudo[23735]: pam_unix(sudo:session): session closed for user root
Dec 06 07:37:40 np0005548788.localdomain sshd[19162]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:37:40 np0005548788.localdomain systemd[1]: session-10.scope: Deactivated successfully.
Dec 06 07:37:40 np0005548788.localdomain systemd[1]: session-10.scope: Consumed 1min 46.234s CPU time.
Dec 06 07:37:40 np0005548788.localdomain systemd-logind[765]: Session 10 logged out. Waiting for processes to exit.
Dec 06 07:37:40 np0005548788.localdomain systemd-logind[765]: Removed session 10.
Dec 06 07:37:42 np0005548788.localdomain sshd[23752]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:42 np0005548788.localdomain sshd[23752]: Accepted publickey for zuul from 38.102.83.114 port 57712 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:37:42 np0005548788.localdomain systemd-logind[765]: New session 11 of user zuul.
Dec 06 07:37:42 np0005548788.localdomain systemd[1]: Started Session 11 of User zuul.
Dec 06 07:37:43 np0005548788.localdomain sshd[23752]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:37:43 np0005548788.localdomain python3[23769]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Dec 06 07:37:45 np0005548788.localdomain sshd[23752]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:37:45 np0005548788.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Dec 06 07:37:45 np0005548788.localdomain systemd-logind[765]: Session 11 logged out. Waiting for processes to exit.
Dec 06 07:37:45 np0005548788.localdomain systemd-logind[765]: Removed session 11.
Dec 06 07:37:48 np0005548788.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 07:38:08 np0005548788.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 07:38:15 np0005548788.localdomain sshd[23775]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:38:15 np0005548788.localdomain sshd[23775]: Received disconnect from 45.55.249.98 port 60668:11: Bye Bye [preauth]
Dec 06 07:38:15 np0005548788.localdomain sshd[23775]: Disconnected from authenticating user root 45.55.249.98 port 60668 [preauth]
Dec 06 07:38:16 np0005548788.localdomain sshd[23777]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:38:17 np0005548788.localdomain sshd[23777]: Received disconnect from 151.38.137.90 port 61558:11: Bye Bye [preauth]
Dec 06 07:38:17 np0005548788.localdomain sshd[23777]: Disconnected from authenticating user root 151.38.137.90 port 61558 [preauth]
Dec 06 07:38:24 np0005548788.localdomain sshd[23779]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:38:24 np0005548788.localdomain sshd[23779]: Accepted publickey for zuul from 38.102.83.114 port 36950 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:38:24 np0005548788.localdomain systemd-logind[765]: New session 12 of user zuul.
Dec 06 07:38:24 np0005548788.localdomain systemd[1]: Started Session 12 of User zuul.
Dec 06 07:38:24 np0005548788.localdomain sshd[23779]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:38:24 np0005548788.localdomain sudo[23796]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnxjdkpldyesvwjbpoxrtreneyxaerbl ; /usr/bin/python3
Dec 06 07:38:24 np0005548788.localdomain sudo[23796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:38:24 np0005548788.localdomain python3[23798]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:38:28 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:38:28 np0005548788.localdomain systemd-sysv-generator[23842]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:38:28 np0005548788.localdomain systemd-rc-local-generator[23838]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:38:28 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:38:28 np0005548788.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 06 07:38:28 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:38:28 np0005548788.localdomain systemd-sysv-generator[23886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:38:28 np0005548788.localdomain systemd-rc-local-generator[23879]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:38:29 np0005548788.localdomain systemd-rc-local-generator[23922]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:38:29 np0005548788.localdomain systemd-sysv-generator[23926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:38:29 np0005548788.localdomain systemd-rc-local-generator[23979]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:38:29 np0005548788.localdomain systemd-sysv-generator[23983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:38:29 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:38:30 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 07:38:30 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 07:38:30 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 07:38:30 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 07:38:30 np0005548788.localdomain systemd[1]: run-r4586ecdd44574a1e850c30da15a9be97.service: Deactivated successfully.
Dec 06 07:38:30 np0005548788.localdomain systemd[1]: run-re38957aaf2b84204b5bba71b085fcc08.service: Deactivated successfully.
Dec 06 07:38:31 np0005548788.localdomain sudo[23796]: pam_unix(sudo:session): session closed for user root
Dec 06 07:38:49 np0005548788.localdomain sshd[24571]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:38:51 np0005548788.localdomain sshd[24571]: Received disconnect from 103.52.114.250 port 41858:11: Bye Bye [preauth]
Dec 06 07:38:51 np0005548788.localdomain sshd[24571]: Disconnected from authenticating user root 103.52.114.250 port 41858 [preauth]
Dec 06 07:39:03 np0005548788.localdomain sshd[24573]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:04 np0005548788.localdomain sshd[24573]: Received disconnect from 151.38.137.90 port 61071:11: Bye Bye [preauth]
Dec 06 07:39:04 np0005548788.localdomain sshd[24573]: Disconnected from authenticating user root 151.38.137.90 port 61071 [preauth]
Dec 06 07:39:15 np0005548788.localdomain sshd[24575]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:15 np0005548788.localdomain sshd[24575]: Received disconnect from 45.55.249.98 port 33222:11: Bye Bye [preauth]
Dec 06 07:39:15 np0005548788.localdomain sshd[24575]: Disconnected from authenticating user root 45.55.249.98 port 33222 [preauth]
Dec 06 07:39:22 np0005548788.localdomain sshd[24577]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:24 np0005548788.localdomain sshd[24579]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:27 np0005548788.localdomain sshd[24577]: Received disconnect from 102.140.97.134 port 51568:11: Bye Bye [preauth]
Dec 06 07:39:27 np0005548788.localdomain sshd[24577]: Disconnected from authenticating user root 102.140.97.134 port 51568 [preauth]
Dec 06 07:39:31 np0005548788.localdomain sshd[23782]: Received disconnect from 38.102.83.114 port 36950:11: disconnected by user
Dec 06 07:39:31 np0005548788.localdomain sshd[23782]: Disconnected from user zuul 38.102.83.114 port 36950
Dec 06 07:39:31 np0005548788.localdomain sshd[23779]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:39:31 np0005548788.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Dec 06 07:39:31 np0005548788.localdomain systemd[1]: session-12.scope: Consumed 5.124s CPU time.
Dec 06 07:39:31 np0005548788.localdomain systemd-logind[765]: Session 12 logged out. Waiting for processes to exit.
Dec 06 07:39:31 np0005548788.localdomain systemd-logind[765]: Removed session 12.
Dec 06 07:39:33 np0005548788.localdomain sshd[24579]: Received disconnect from 45.78.222.109 port 38358:11: Bye Bye [preauth]
Dec 06 07:39:33 np0005548788.localdomain sshd[24579]: Disconnected from authenticating user root 45.78.222.109 port 38358 [preauth]
Dec 06 07:39:51 np0005548788.localdomain sshd[24581]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:52 np0005548788.localdomain sshd[24581]: Received disconnect from 151.38.137.90 port 61709:11: Bye Bye [preauth]
Dec 06 07:39:52 np0005548788.localdomain sshd[24581]: Disconnected from authenticating user root 151.38.137.90 port 61709 [preauth]
Dec 06 07:40:16 np0005548788.localdomain sshd[24583]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:40:16 np0005548788.localdomain sshd[24583]: Received disconnect from 45.55.249.98 port 60040:11: Bye Bye [preauth]
Dec 06 07:40:16 np0005548788.localdomain sshd[24583]: Disconnected from authenticating user root 45.55.249.98 port 60040 [preauth]
Dec 06 07:40:43 np0005548788.localdomain sshd[24586]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:40:45 np0005548788.localdomain sshd[24586]: Received disconnect from 103.52.114.250 port 42384:11: Bye Bye [preauth]
Dec 06 07:40:45 np0005548788.localdomain sshd[24586]: Disconnected from authenticating user root 103.52.114.250 port 42384 [preauth]
Dec 06 07:41:16 np0005548788.localdomain sshd[24588]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:41:16 np0005548788.localdomain sshd[24588]: Received disconnect from 45.55.249.98 port 40682:11: Bye Bye [preauth]
Dec 06 07:41:16 np0005548788.localdomain sshd[24588]: Disconnected from authenticating user root 45.55.249.98 port 40682 [preauth]
Dec 06 07:41:33 np0005548788.localdomain sshd[24590]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:41:34 np0005548788.localdomain sshd[24590]: Received disconnect from 151.38.137.90 port 61677:11: Bye Bye [preauth]
Dec 06 07:41:34 np0005548788.localdomain sshd[24590]: Disconnected from authenticating user root 151.38.137.90 port 61677 [preauth]
Dec 06 07:41:42 np0005548788.localdomain sshd[24592]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:41:45 np0005548788.localdomain sshd[24592]: Received disconnect from 102.140.97.134 port 38954:11: Bye Bye [preauth]
Dec 06 07:41:45 np0005548788.localdomain sshd[24592]: Disconnected from authenticating user root 102.140.97.134 port 38954 [preauth]
Dec 06 07:42:05 np0005548788.localdomain sshd[24594]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:08 np0005548788.localdomain sshd[24594]: Received disconnect from 45.78.222.109 port 60956:11: Bye Bye [preauth]
Dec 06 07:42:08 np0005548788.localdomain sshd[24594]: Disconnected from authenticating user root 45.78.222.109 port 60956 [preauth]
Dec 06 07:42:12 np0005548788.localdomain sshd[24596]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:12 np0005548788.localdomain sshd[24596]: Received disconnect from 45.55.249.98 port 42118:11: Bye Bye [preauth]
Dec 06 07:42:12 np0005548788.localdomain sshd[24596]: Disconnected from authenticating user root 45.55.249.98 port 42118 [preauth]
Dec 06 07:42:26 np0005548788.localdomain sshd[24598]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:27 np0005548788.localdomain sshd[24598]: Received disconnect from 151.38.137.90 port 61296:11: Bye Bye [preauth]
Dec 06 07:42:27 np0005548788.localdomain sshd[24598]: Disconnected from authenticating user root 151.38.137.90 port 61296 [preauth]
Dec 06 07:42:35 np0005548788.localdomain sshd[24600]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:36 np0005548788.localdomain sshd[24600]: Received disconnect from 103.52.114.250 port 50436:11: Bye Bye [preauth]
Dec 06 07:42:36 np0005548788.localdomain sshd[24600]: Disconnected from authenticating user root 103.52.114.250 port 50436 [preauth]
Dec 06 07:43:11 np0005548788.localdomain sshd[24602]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:43:12 np0005548788.localdomain sshd[24602]: Received disconnect from 45.55.249.98 port 41992:11: Bye Bye [preauth]
Dec 06 07:43:12 np0005548788.localdomain sshd[24602]: Disconnected from authenticating user root 45.55.249.98 port 41992 [preauth]
Dec 06 07:43:21 np0005548788.localdomain sshd[24604]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:43:22 np0005548788.localdomain sshd[24604]: Received disconnect from 151.38.137.90 port 61063:11: Bye Bye [preauth]
Dec 06 07:43:22 np0005548788.localdomain sshd[24604]: Disconnected from authenticating user root 151.38.137.90 port 61063 [preauth]
Dec 06 07:44:02 np0005548788.localdomain sshd[24607]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:44:04 np0005548788.localdomain sshd[24607]: Received disconnect from 102.140.97.134 port 36822:11: Bye Bye [preauth]
Dec 06 07:44:04 np0005548788.localdomain sshd[24607]: Disconnected from authenticating user root 102.140.97.134 port 36822 [preauth]
Dec 06 07:44:10 np0005548788.localdomain sshd[24609]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:44:11 np0005548788.localdomain sshd[24609]: Received disconnect from 103.52.114.250 port 40710:11: Bye Bye [preauth]
Dec 06 07:44:11 np0005548788.localdomain sshd[24609]: Disconnected from authenticating user root 103.52.114.250 port 40710 [preauth]
Dec 06 07:44:38 np0005548788.localdomain sshd[24611]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:44:41 np0005548788.localdomain sshd[24611]: Received disconnect from 45.78.222.109 port 50784:11: Bye Bye [preauth]
Dec 06 07:44:41 np0005548788.localdomain sshd[24611]: Disconnected from authenticating user root 45.78.222.109 port 50784 [preauth]
Dec 06 07:45:52 np0005548788.localdomain sshd[24613]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:45:54 np0005548788.localdomain sshd[24613]: Received disconnect from 103.52.114.250 port 57288:11: Bye Bye [preauth]
Dec 06 07:45:54 np0005548788.localdomain sshd[24613]: Disconnected from authenticating user root 103.52.114.250 port 57288 [preauth]
Dec 06 07:46:23 np0005548788.localdomain sshd[24617]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:25 np0005548788.localdomain sshd[24617]: Received disconnect from 102.140.97.134 port 37864:11: Bye Bye [preauth]
Dec 06 07:46:25 np0005548788.localdomain sshd[24617]: Disconnected from authenticating user root 102.140.97.134 port 37864 [preauth]
Dec 06 07:47:16 np0005548788.localdomain sshd[24619]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:47:18 np0005548788.localdomain sshd[24619]: Received disconnect from 45.78.222.109 port 58878:11: Bye Bye [preauth]
Dec 06 07:47:18 np0005548788.localdomain sshd[24619]: Disconnected from authenticating user root 45.78.222.109 port 58878 [preauth]
Dec 06 07:47:39 np0005548788.localdomain sshd[24621]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:47:41 np0005548788.localdomain sshd[24621]: Received disconnect from 103.52.114.250 port 42476:11: Bye Bye [preauth]
Dec 06 07:47:41 np0005548788.localdomain sshd[24621]: Disconnected from authenticating user root 103.52.114.250 port 42476 [preauth]
Dec 06 07:48:42 np0005548788.localdomain sshd[24623]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:48:50 np0005548788.localdomain sshd[24623]: Received disconnect from 102.140.97.134 port 60138:11: Bye Bye [preauth]
Dec 06 07:48:50 np0005548788.localdomain sshd[24623]: Disconnected from authenticating user root 102.140.97.134 port 60138 [preauth]
Dec 06 07:49:19 np0005548788.localdomain sshd[24625]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:49:21 np0005548788.localdomain sshd[24625]: Received disconnect from 103.52.114.250 port 48620:11: Bye Bye [preauth]
Dec 06 07:49:21 np0005548788.localdomain sshd[24625]: Disconnected from authenticating user root 103.52.114.250 port 48620 [preauth]
Dec 06 07:49:51 np0005548788.localdomain sshd[24627]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:49:54 np0005548788.localdomain sshd[24627]: Received disconnect from 45.78.222.109 port 58662:11: Bye Bye [preauth]
Dec 06 07:49:54 np0005548788.localdomain sshd[24627]: Disconnected from authenticating user root 45.78.222.109 port 58662 [preauth]
Dec 06 07:50:59 np0005548788.localdomain sshd[24629]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:51:00 np0005548788.localdomain sshd[24629]: Received disconnect from 102.140.97.134 port 57142:11: Bye Bye [preauth]
Dec 06 07:51:00 np0005548788.localdomain sshd[24629]: Disconnected from authenticating user root 102.140.97.134 port 57142 [preauth]
Dec 06 07:51:09 np0005548788.localdomain sshd[24631]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:51:11 np0005548788.localdomain sshd[24631]: Received disconnect from 103.52.114.250 port 37600:11: Bye Bye [preauth]
Dec 06 07:51:11 np0005548788.localdomain sshd[24631]: Disconnected from authenticating user root 103.52.114.250 port 37600 [preauth]
Dec 06 07:52:27 np0005548788.localdomain sshd[24634]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:52:31 np0005548788.localdomain sshd[24634]: Received disconnect from 45.78.222.109 port 41500:11: Bye Bye [preauth]
Dec 06 07:52:31 np0005548788.localdomain sshd[24634]: Disconnected from authenticating user root 45.78.222.109 port 41500 [preauth]
Dec 06 07:52:56 np0005548788.localdomain sshd[24636]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:52:58 np0005548788.localdomain sshd[24636]: Received disconnect from 103.52.114.250 port 43076:11: Bye Bye [preauth]
Dec 06 07:52:58 np0005548788.localdomain sshd[24636]: Disconnected from authenticating user root 103.52.114.250 port 43076 [preauth]
Dec 06 07:53:15 np0005548788.localdomain sshd[24639]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:53:19 np0005548788.localdomain sshd[24639]: Received disconnect from 102.140.97.134 port 58772:11: Bye Bye [preauth]
Dec 06 07:53:19 np0005548788.localdomain sshd[24639]: Disconnected from authenticating user root 102.140.97.134 port 58772 [preauth]
Dec 06 07:54:01 np0005548788.localdomain anacron[6181]: Job `cron.monthly' started
Dec 06 07:54:01 np0005548788.localdomain anacron[6181]: Job `cron.monthly' terminated
Dec 06 07:54:01 np0005548788.localdomain anacron[6181]: Normal exit (3 jobs run)
Dec 06 07:54:37 np0005548788.localdomain sshd[24644]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:54:39 np0005548788.localdomain sshd[24644]: Received disconnect from 103.52.114.250 port 36370:11: Bye Bye [preauth]
Dec 06 07:54:39 np0005548788.localdomain sshd[24644]: Disconnected from authenticating user root 103.52.114.250 port 36370 [preauth]
Dec 06 07:55:04 np0005548788.localdomain sshd[24646]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:08 np0005548788.localdomain sshd[24646]: Received disconnect from 45.78.222.109 port 34938:11: Bye Bye [preauth]
Dec 06 07:55:08 np0005548788.localdomain sshd[24646]: Disconnected from authenticating user root 45.78.222.109 port 34938 [preauth]
Dec 06 07:55:17 np0005548788.localdomain sshd[24648]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:17 np0005548788.localdomain sshd[24648]: Accepted publickey for zuul from 192.168.122.100 port 54610 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:55:17 np0005548788.localdomain systemd-logind[765]: New session 13 of user zuul.
Dec 06 07:55:17 np0005548788.localdomain systemd[1]: Started Session 13 of User zuul.
Dec 06 07:55:17 np0005548788.localdomain sshd[24648]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:55:17 np0005548788.localdomain sudo[24694]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdeydgiftdqfuahqknpmrxzdbmuydjeb ; /usr/bin/python3
Dec 06 07:55:17 np0005548788.localdomain sudo[24694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:18 np0005548788.localdomain python3[24696]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 07:55:18 np0005548788.localdomain sudo[24694]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:19 np0005548788.localdomain sudo[24781]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyakymxbiwbhryfafiqvdlusimwnycog ; /usr/bin/python3
Dec 06 07:55:19 np0005548788.localdomain sudo[24781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:19 np0005548788.localdomain python3[24783]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:55:22 np0005548788.localdomain sudo[24781]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:22 np0005548788.localdomain sudo[24798]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smhhurastklqpdpznzfhpxonwrwafdek ; /usr/bin/python3
Dec 06 07:55:22 np0005548788.localdomain sudo[24798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:23 np0005548788.localdomain python3[24800]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 07:55:23 np0005548788.localdomain sudo[24798]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:23 np0005548788.localdomain sudo[24814]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfuidnjmfxsqopeoumegzgzbnfoqpqzy ; /usr/bin/python3
Dec 06 07:55:23 np0005548788.localdomain sudo[24814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:23 np0005548788.localdomain python3[24816]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:23 np0005548788.localdomain kernel: loop: module loaded
Dec 06 07:55:23 np0005548788.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Dec 06 07:55:23 np0005548788.localdomain sudo[24814]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:24 np0005548788.localdomain sudo[24839]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxsxuebgjgtzgshvzlyczvwhenwkprfg ; /usr/bin/python3
Dec 06 07:55:24 np0005548788.localdomain sudo[24839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:24 np0005548788.localdomain python3[24841]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:24 np0005548788.localdomain lvm[24844]: PV /dev/loop3 not used.
Dec 06 07:55:24 np0005548788.localdomain lvm[24846]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 07:55:24 np0005548788.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec 06 07:55:24 np0005548788.localdomain lvm[24850]:   1 logical volume(s) in volume group "ceph_vg0" now active
Dec 06 07:55:24 np0005548788.localdomain lvm[24857]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 07:55:24 np0005548788.localdomain lvm[24857]: VG ceph_vg0 finished
Dec 06 07:55:24 np0005548788.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec 06 07:55:24 np0005548788.localdomain sudo[24839]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:24 np0005548788.localdomain sudo[24903]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxzjaryiesxyayctmirdrmidohhkzbhm ; /usr/bin/python3
Dec 06 07:55:24 np0005548788.localdomain sudo[24903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:25 np0005548788.localdomain python3[24905]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:55:25 np0005548788.localdomain sudo[24903]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:25 np0005548788.localdomain sudo[24946]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhthmusdosoqisojbqpqkzbhmujvplde ; /usr/bin/python3
Dec 06 07:55:25 np0005548788.localdomain sudo[24946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:25 np0005548788.localdomain python3[24948]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765007724.7410376-54543-48881242736552/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:25 np0005548788.localdomain sudo[24946]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:26 np0005548788.localdomain sudo[24976]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acjwtsegqnzhxepidmbglinberuscxkd ; /usr/bin/python3
Dec 06 07:55:26 np0005548788.localdomain sudo[24976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:26 np0005548788.localdomain python3[24978]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 07:55:26 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:55:26 np0005548788.localdomain systemd-sysv-generator[25008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:55:26 np0005548788.localdomain systemd-rc-local-generator[25005]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:55:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:55:26 np0005548788.localdomain systemd[1]: Starting Ceph OSD losetup...
Dec 06 07:55:26 np0005548788.localdomain bash[25019]: /dev/loop3: [64516]:8400144 (/var/lib/ceph-osd-0.img)
Dec 06 07:55:26 np0005548788.localdomain systemd[1]: Finished Ceph OSD losetup.
Dec 06 07:55:26 np0005548788.localdomain lvm[25020]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 07:55:26 np0005548788.localdomain lvm[25020]: VG ceph_vg0 finished
Dec 06 07:55:26 np0005548788.localdomain sudo[24976]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:27 np0005548788.localdomain sudo[25034]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmhjdwxvnydxbtvpoaeubbeplvgohwfu ; /usr/bin/python3
Dec 06 07:55:27 np0005548788.localdomain sudo[25034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:27 np0005548788.localdomain python3[25036]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:55:29 np0005548788.localdomain sudo[25034]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:30 np0005548788.localdomain sudo[25051]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlelinizxfxtweysojqblaazuwlgtgjt ; /usr/bin/python3
Dec 06 07:55:30 np0005548788.localdomain sudo[25051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:30 np0005548788.localdomain python3[25053]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 07:55:30 np0005548788.localdomain sudo[25051]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:30 np0005548788.localdomain sudo[25067]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhypmeotoavmqqzhzjedrulqsvmycddm ; /usr/bin/python3
Dec 06 07:55:30 np0005548788.localdomain sudo[25067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:30 np0005548788.localdomain python3[25069]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:30 np0005548788.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Dec 06 07:55:30 np0005548788.localdomain sudo[25067]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:31 np0005548788.localdomain sudo[25089]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpelwvegfzadslmpxbfmfamcfpqyccqt ; /usr/bin/python3
Dec 06 07:55:31 np0005548788.localdomain sudo[25089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:31 np0005548788.localdomain python3[25091]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:31 np0005548788.localdomain lvm[25094]: PV /dev/loop4 not used.
Dec 06 07:55:31 np0005548788.localdomain lvm[25096]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 07:55:31 np0005548788.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Dec 06 07:55:31 np0005548788.localdomain lvm[25103]:   1 logical volume(s) in volume group "ceph_vg1" now active
Dec 06 07:55:31 np0005548788.localdomain lvm[25107]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 07:55:31 np0005548788.localdomain lvm[25107]: VG ceph_vg1 finished
Dec 06 07:55:31 np0005548788.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Dec 06 07:55:31 np0005548788.localdomain sudo[25089]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:32 np0005548788.localdomain sudo[25153]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opztiocllqzhphozpbhtzfrpfxgveadw ; /usr/bin/python3
Dec 06 07:55:32 np0005548788.localdomain sudo[25153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:32 np0005548788.localdomain python3[25155]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:55:32 np0005548788.localdomain sudo[25153]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:32 np0005548788.localdomain sudo[25196]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fakpvcnapiyfqasssbqwumiocxexjwqf ; /usr/bin/python3
Dec 06 07:55:32 np0005548788.localdomain sudo[25196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:32 np0005548788.localdomain python3[25198]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765007731.8611104-54737-28783480938285/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:32 np0005548788.localdomain sudo[25196]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:33 np0005548788.localdomain sudo[25226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbmzdaceokmzopjlvilmvuwhtiapquln ; /usr/bin/python3
Dec 06 07:55:33 np0005548788.localdomain sudo[25226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:33 np0005548788.localdomain python3[25228]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 07:55:33 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:55:33 np0005548788.localdomain systemd-rc-local-generator[25256]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:55:33 np0005548788.localdomain systemd-sysv-generator[25261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:55:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:55:33 np0005548788.localdomain systemd[1]: Starting Ceph OSD losetup...
Dec 06 07:55:33 np0005548788.localdomain bash[25269]: /dev/loop4: [64516]:8399529 (/var/lib/ceph-osd-1.img)
Dec 06 07:55:33 np0005548788.localdomain systemd[1]: Finished Ceph OSD losetup.
Dec 06 07:55:33 np0005548788.localdomain lvm[25270]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 07:55:33 np0005548788.localdomain lvm[25270]: VG ceph_vg1 finished
Dec 06 07:55:33 np0005548788.localdomain sudo[25226]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:34 np0005548788.localdomain sshd[25271]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:37 np0005548788.localdomain sshd[25271]: Received disconnect from 102.140.97.134 port 55096:11: Bye Bye [preauth]
Dec 06 07:55:37 np0005548788.localdomain sshd[25271]: Disconnected from authenticating user root 102.140.97.134 port 55096 [preauth]
Dec 06 07:55:42 np0005548788.localdomain sudo[25315]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbzkiqedfvrhyuthnsmvrokpiwdifrps ; /usr/bin/python3
Dec 06 07:55:42 np0005548788.localdomain sudo[25315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:42 np0005548788.localdomain python3[25317]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 07:55:42 np0005548788.localdomain sudo[25315]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:43 np0005548788.localdomain sudo[25335]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rygkfdltspnmwbifjmzmfmhzwilwujcn ; /usr/bin/python3
Dec 06 07:55:43 np0005548788.localdomain sudo[25335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:43 np0005548788.localdomain python3[25337]: ansible-hostname Invoked with name=np0005548788.localdomain use=None
Dec 06 07:55:43 np0005548788.localdomain systemd[1]: Starting Hostname Service...
Dec 06 07:55:43 np0005548788.localdomain systemd[1]: Started Hostname Service.
Dec 06 07:55:43 np0005548788.localdomain sudo[25335]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:45 np0005548788.localdomain sudo[25358]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxemhvonqfdmwqwswhahnlxuvjkdfknf ; /usr/bin/python3
Dec 06 07:55:45 np0005548788.localdomain sudo[25358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:45 np0005548788.localdomain python3[25360]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Dec 06 07:55:45 np0005548788.localdomain sudo[25358]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:46 np0005548788.localdomain sudo[25406]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laoanmgjfgyqmlspvrfyunjcbyfngwmn ; /usr/bin/python3
Dec 06 07:55:46 np0005548788.localdomain sudo[25406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:46 np0005548788.localdomain python3[25408]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.6xjzk3xatmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:46 np0005548788.localdomain sudo[25406]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:46 np0005548788.localdomain sudo[25436]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nflthsyxjuygtuspkbuhsrwycmkcnyzj ; /usr/bin/python3
Dec 06 07:55:46 np0005548788.localdomain sudo[25436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:46 np0005548788.localdomain python3[25438]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.6xjzk3xatmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:46 np0005548788.localdomain sudo[25436]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:47 np0005548788.localdomain sudo[25452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koltsnufalaburahftkpoqpknowbapjy ; /usr/bin/python3
Dec 06 07:55:47 np0005548788.localdomain sudo[25452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:47 np0005548788.localdomain python3[25454]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.6xjzk3xatmphosts insertbefore=BOF block=192.168.122.106 np0005548788.localdomain np0005548788
                                                         192.168.122.106 np0005548788.ctlplane.localdomain np0005548788.ctlplane
                                                         192.168.122.107 np0005548789.localdomain np0005548789
                                                         192.168.122.107 np0005548789.ctlplane.localdomain np0005548789.ctlplane
                                                         192.168.122.108 np0005548790.localdomain np0005548790
                                                         192.168.122.108 np0005548790.ctlplane.localdomain np0005548790.ctlplane
                                                         192.168.122.103 np0005548785.localdomain np0005548785
                                                         192.168.122.103 np0005548785.ctlplane.localdomain np0005548785.ctlplane
                                                         192.168.122.104 np0005548786.localdomain np0005548786
                                                         192.168.122.104 np0005548786.ctlplane.localdomain np0005548786.ctlplane
                                                         192.168.122.105 np0005548787.localdomain np0005548787
                                                         192.168.122.105 np0005548787.ctlplane.localdomain np0005548787.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:47 np0005548788.localdomain sudo[25452]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:47 np0005548788.localdomain sudo[25468]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhpgdfccbvixaphasctfmgnflsdydhpq ; /usr/bin/python3
Dec 06 07:55:47 np0005548788.localdomain sudo[25468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:47 np0005548788.localdomain python3[25470]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.6xjzk3xatmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:48 np0005548788.localdomain sudo[25468]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:48 np0005548788.localdomain sudo[25485]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxixkgqhfgudqcbhieusvmhdfmyhbbeu ; /usr/bin/python3
Dec 06 07:55:48 np0005548788.localdomain sudo[25485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:48 np0005548788.localdomain python3[25487]: ansible-file Invoked with path=/tmp/ansible.6xjzk3xatmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:48 np0005548788.localdomain sudo[25485]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:50 np0005548788.localdomain sudo[25501]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnprsedzhpvmgpgvwibetbvqakxnifzy ; /usr/bin/python3
Dec 06 07:55:50 np0005548788.localdomain sudo[25501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:50 np0005548788.localdomain python3[25503]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:50 np0005548788.localdomain sudo[25501]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:51 np0005548788.localdomain sudo[25519]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovrjbzcuccrnayhbymljsftpaafzwowc ; /usr/bin/python3
Dec 06 07:55:51 np0005548788.localdomain sudo[25519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:51 np0005548788.localdomain python3[25521]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:55:54 np0005548788.localdomain sudo[25519]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:55 np0005548788.localdomain sudo[25568]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sidrmhgixhlyfdxvrfxegnvqwdgwochg ; /usr/bin/python3
Dec 06 07:55:55 np0005548788.localdomain sudo[25568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:55 np0005548788.localdomain python3[25570]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:55:55 np0005548788.localdomain sudo[25568]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:55 np0005548788.localdomain sudo[25613]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unglcgpphsamrydmkrbhyprdwmpnfrtd ; /usr/bin/python3
Dec 06 07:55:55 np0005548788.localdomain sudo[25613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:56 np0005548788.localdomain python3[25615]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765007755.1433096-55568-8731304709499/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:56 np0005548788.localdomain sudo[25613]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:57 np0005548788.localdomain sudo[25643]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azwlpqcqtcagjhwtvrlyhlylldggxcqd ; /usr/bin/python3
Dec 06 07:55:57 np0005548788.localdomain sudo[25643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:57 np0005548788.localdomain python3[25645]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 07:55:58 np0005548788.localdomain sudo[25643]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:58 np0005548788.localdomain sudo[25661]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyuzhqwzioakdsqfjduwxksfyuikzjzk ; /usr/bin/python3
Dec 06 07:55:58 np0005548788.localdomain sudo[25661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:58 np0005548788.localdomain python3[25663]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 07:55:59 np0005548788.localdomain chronyd[767]: chronyd exiting
Dec 06 07:55:59 np0005548788.localdomain systemd[1]: Stopping NTP client/server...
Dec 06 07:55:59 np0005548788.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 06 07:55:59 np0005548788.localdomain systemd[1]: Stopped NTP client/server.
Dec 06 07:55:59 np0005548788.localdomain systemd[1]: chronyd.service: Consumed 121ms CPU time, read 1.9M from disk, written 4.0K to disk.
Dec 06 07:55:59 np0005548788.localdomain systemd[1]: Starting NTP client/server...
Dec 06 07:55:59 np0005548788.localdomain chronyd[25670]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 07:55:59 np0005548788.localdomain chronyd[25670]: Frequency -30.741 +/- 0.152 ppm read from /var/lib/chrony/drift
Dec 06 07:55:59 np0005548788.localdomain chronyd[25670]: Loaded seccomp filter (level 2)
Dec 06 07:55:59 np0005548788.localdomain systemd[1]: Started NTP client/server.
Dec 06 07:55:59 np0005548788.localdomain sudo[25661]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:59 np0005548788.localdomain sudo[25717]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwjseozqiqfzktrwmuzytxzrbpvqwghk ; /usr/bin/python3
Dec 06 07:55:59 np0005548788.localdomain sudo[25717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:59 np0005548788.localdomain python3[25719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:55:59 np0005548788.localdomain sudo[25717]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:00 np0005548788.localdomain sudo[25760]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjmgydyrnhubjgmeoflgmvwvjrjwlbbe ; /usr/bin/python3
Dec 06 07:56:00 np0005548788.localdomain sudo[25760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:00 np0005548788.localdomain python3[25762]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765007759.5394063-55758-109800971990875/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:56:00 np0005548788.localdomain sudo[25760]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:00 np0005548788.localdomain sudo[25790]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksprwedyycsqiqlfipcyddtuzjmoqjfh ; /usr/bin/python3
Dec 06 07:56:00 np0005548788.localdomain sudo[25790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:00 np0005548788.localdomain python3[25792]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 07:56:00 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:56:00 np0005548788.localdomain systemd-sysv-generator[25823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:56:00 np0005548788.localdomain systemd-rc-local-generator[25820]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:56:00 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:56:01 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:56:01 np0005548788.localdomain systemd-rc-local-generator[25859]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:56:01 np0005548788.localdomain systemd-sysv-generator[25862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:56:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:56:01 np0005548788.localdomain systemd[1]: Starting chronyd online sources service...
Dec 06 07:56:01 np0005548788.localdomain chronyc[25868]: 200 OK
Dec 06 07:56:01 np0005548788.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Dec 06 07:56:01 np0005548788.localdomain systemd[1]: Finished chronyd online sources service.
Dec 06 07:56:01 np0005548788.localdomain sudo[25790]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:01 np0005548788.localdomain sudo[25883]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twegwulhziwumnxansuyossdaikrkmrr ; /usr/bin/python3
Dec 06 07:56:01 np0005548788.localdomain sudo[25883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:02 np0005548788.localdomain python3[25885]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:56:02 np0005548788.localdomain chronyd[25670]: System clock was stepped by -0.000000 seconds
Dec 06 07:56:02 np0005548788.localdomain sudo[25883]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:02 np0005548788.localdomain sudo[25900]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncfkifbszrtevvwtqyjhjguecwjthatp ; /usr/bin/python3
Dec 06 07:56:02 np0005548788.localdomain sudo[25900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:02 np0005548788.localdomain python3[25902]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:56:03 np0005548788.localdomain chronyd[25670]: Selected source 23.133.168.245 (pool.ntp.org)
Dec 06 07:56:12 np0005548788.localdomain sudo[25900]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:12 np0005548788.localdomain sudo[25917]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpuvqngdvvdtvccastmvnyclkezfobum ; /usr/bin/python3
Dec 06 07:56:12 np0005548788.localdomain sudo[25917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:13 np0005548788.localdomain python3[25919]: ansible-timezone Invoked with name=UTC hwclock=None
Dec 06 07:56:13 np0005548788.localdomain systemd[1]: Starting Time & Date Service...
Dec 06 07:56:13 np0005548788.localdomain systemd[1]: Started Time & Date Service.
Dec 06 07:56:13 np0005548788.localdomain sudo[25917]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:13 np0005548788.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 07:56:13 np0005548788.localdomain sudo[25939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teegegwlrujsuwxqvpowvfaahzatvjgb ; /usr/bin/python3
Dec 06 07:56:13 np0005548788.localdomain sudo[25939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:14 np0005548788.localdomain python3[25941]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 07:56:14 np0005548788.localdomain chronyd[25670]: chronyd exiting
Dec 06 07:56:14 np0005548788.localdomain systemd[1]: Stopping NTP client/server...
Dec 06 07:56:14 np0005548788.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 06 07:56:14 np0005548788.localdomain systemd[1]: Stopped NTP client/server.
Dec 06 07:56:14 np0005548788.localdomain systemd[1]: Starting NTP client/server...
Dec 06 07:56:14 np0005548788.localdomain chronyd[25948]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 07:56:14 np0005548788.localdomain chronyd[25948]: Frequency -30.741 +/- 0.152 ppm read from /var/lib/chrony/drift
Dec 06 07:56:14 np0005548788.localdomain chronyd[25948]: Loaded seccomp filter (level 2)
Dec 06 07:56:14 np0005548788.localdomain systemd[1]: Started NTP client/server.
Dec 06 07:56:14 np0005548788.localdomain sudo[25939]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:18 np0005548788.localdomain chronyd[25948]: Selected source 23.133.168.245 (pool.ntp.org)
Dec 06 07:56:19 np0005548788.localdomain sshd[25950]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:56:21 np0005548788.localdomain sshd[25950]: Received disconnect from 103.52.114.250 port 46136:11: Bye Bye [preauth]
Dec 06 07:56:21 np0005548788.localdomain sshd[25950]: Disconnected from authenticating user root 103.52.114.250 port 46136 [preauth]
Dec 06 07:56:30 np0005548788.localdomain sudo[25965]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbpeczvdtuguhlvfubrqpsrutbnaqfpv ; /usr/bin/python3
Dec 06 07:56:30 np0005548788.localdomain sudo[25965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:31 np0005548788.localdomain useradd[25969]: new group: name=ceph-admin, GID=1002
Dec 06 07:56:31 np0005548788.localdomain useradd[25969]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Dec 06 07:56:31 np0005548788.localdomain sudo[25965]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:31 np0005548788.localdomain sudo[26021]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmrchdzqjnocyarxmfjhmkympdkefjqf ; /usr/bin/python3
Dec 06 07:56:31 np0005548788.localdomain sudo[26021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:31 np0005548788.localdomain sudo[26021]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:32 np0005548788.localdomain sudo[26064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oegveqooymkesiyaruxmtemfkqnsinqs ; /usr/bin/python3
Dec 06 07:56:32 np0005548788.localdomain sudo[26064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:32 np0005548788.localdomain sudo[26064]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:32 np0005548788.localdomain sudo[26094]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbcwnhgeonybjpkveenmxqqudelkvdop ; /usr/bin/python3
Dec 06 07:56:32 np0005548788.localdomain sudo[26094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:32 np0005548788.localdomain sudo[26094]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:32 np0005548788.localdomain sudo[26110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-coihobfesqoiltlitsjtcisjamyvovpr ; /usr/bin/python3
Dec 06 07:56:32 np0005548788.localdomain sudo[26110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:33 np0005548788.localdomain sudo[26110]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:33 np0005548788.localdomain sudo[26126]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kthrewynrwuokhkyllnxkgqrbpdhdyck ; /usr/bin/python3
Dec 06 07:56:33 np0005548788.localdomain sudo[26126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:33 np0005548788.localdomain sudo[26126]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:33 np0005548788.localdomain sudo[26142]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roguapxkxykkmlvdycnmchiqnwoyydkh ; /usr/bin/python3
Dec 06 07:56:33 np0005548788.localdomain sudo[26142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:34 np0005548788.localdomain sudo[26142]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:43 np0005548788.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 07:57:54 np0005548788.localdomain sshd[26147]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:57:56 np0005548788.localdomain sshd[26147]: Received disconnect from 102.140.97.134 port 53614:11: Bye Bye [preauth]
Dec 06 07:57:56 np0005548788.localdomain sshd[26147]: Disconnected from authenticating user root 102.140.97.134 port 53614 [preauth]
Dec 06 07:58:10 np0005548788.localdomain sshd[26149]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:11 np0005548788.localdomain sshd[26149]: Received disconnect from 103.52.114.250 port 42890:11: Bye Bye [preauth]
Dec 06 07:58:11 np0005548788.localdomain sshd[26149]: Disconnected from authenticating user root 103.52.114.250 port 42890 [preauth]
Dec 06 07:58:18 np0005548788.localdomain sshd[26151]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:18 np0005548788.localdomain sshd[26151]: Accepted publickey for ceph-admin from 192.168.122.103 port 55274 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:18 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 1002.
Dec 06 07:58:18 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Dec 06 07:58:18 np0005548788.localdomain systemd-logind[765]: New session 14 of user ceph-admin.
Dec 06 07:58:18 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Dec 06 07:58:19 np0005548788.localdomain systemd[1]: Starting User Manager for UID 1002...
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:19 np0005548788.localdomain sshd[26168]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Queued start job for default target Main User Target.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Created slice User Application Slice.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Reached target Paths.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Reached target Timers.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Starting D-Bus User Message Bus Socket...
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Starting Create User's Volatile Files and Directories...
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Finished Create User's Volatile Files and Directories.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Listening on D-Bus User Message Bus Socket.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Reached target Sockets.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Reached target Basic System.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Reached target Main User Target.
Dec 06 07:58:19 np0005548788.localdomain systemd[26155]: Startup finished in 121ms.
Dec 06 07:58:19 np0005548788.localdomain systemd[1]: Started User Manager for UID 1002.
Dec 06 07:58:19 np0005548788.localdomain systemd[1]: Started Session 14 of User ceph-admin.
Dec 06 07:58:19 np0005548788.localdomain sshd[26151]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:19 np0005548788.localdomain sshd[26168]: Accepted publickey for ceph-admin from 192.168.122.103 port 55278 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:19 np0005548788.localdomain systemd-logind[765]: New session 16 of user ceph-admin.
Dec 06 07:58:19 np0005548788.localdomain systemd[1]: Started Session 16 of User ceph-admin.
Dec 06 07:58:19 np0005548788.localdomain sshd[26168]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:19 np0005548788.localdomain sudo[26175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:19 np0005548788.localdomain sudo[26175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:19 np0005548788.localdomain sudo[26175]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:19 np0005548788.localdomain sshd[26190]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:19 np0005548788.localdomain sshd[26190]: Accepted publickey for ceph-admin from 192.168.122.103 port 55294 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:19 np0005548788.localdomain systemd-logind[765]: New session 17 of user ceph-admin.
Dec 06 07:58:19 np0005548788.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Dec 06 07:58:19 np0005548788.localdomain sshd[26190]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:19 np0005548788.localdomain sudo[26194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005548788.localdomain
Dec 06 07:58:19 np0005548788.localdomain sudo[26194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:19 np0005548788.localdomain sudo[26194]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:19 np0005548788.localdomain sshd[26209]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:20 np0005548788.localdomain sshd[26209]: Accepted publickey for ceph-admin from 192.168.122.103 port 55296 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:20 np0005548788.localdomain systemd-logind[765]: New session 18 of user ceph-admin.
Dec 06 07:58:20 np0005548788.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Dec 06 07:58:20 np0005548788.localdomain sshd[26209]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:20 np0005548788.localdomain sudo[26213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Dec 06 07:58:20 np0005548788.localdomain sudo[26213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:20 np0005548788.localdomain sudo[26213]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:20 np0005548788.localdomain sshd[26228]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:20 np0005548788.localdomain sshd[26228]: Accepted publickey for ceph-admin from 192.168.122.103 port 55306 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:20 np0005548788.localdomain systemd-logind[765]: New session 19 of user ceph-admin.
Dec 06 07:58:20 np0005548788.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Dec 06 07:58:20 np0005548788.localdomain sshd[26228]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:20 np0005548788.localdomain sudo[26232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 07:58:20 np0005548788.localdomain sudo[26232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:20 np0005548788.localdomain sudo[26232]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:20 np0005548788.localdomain sshd[26247]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:20 np0005548788.localdomain sshd[26247]: Accepted publickey for ceph-admin from 192.168.122.103 port 55320 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:20 np0005548788.localdomain systemd-logind[765]: New session 20 of user ceph-admin.
Dec 06 07:58:20 np0005548788.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Dec 06 07:58:20 np0005548788.localdomain sshd[26247]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:20 np0005548788.localdomain sudo[26251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 07:58:20 np0005548788.localdomain sudo[26251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:20 np0005548788.localdomain sudo[26251]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:21 np0005548788.localdomain sshd[26266]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:21 np0005548788.localdomain sshd[26266]: Accepted publickey for ceph-admin from 192.168.122.103 port 55336 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:21 np0005548788.localdomain systemd-logind[765]: New session 21 of user ceph-admin.
Dec 06 07:58:21 np0005548788.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Dec 06 07:58:21 np0005548788.localdomain sshd[26266]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:21 np0005548788.localdomain sudo[26270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Dec 06 07:58:21 np0005548788.localdomain sudo[26270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:21 np0005548788.localdomain sudo[26270]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:21 np0005548788.localdomain sshd[26285]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:21 np0005548788.localdomain sshd[26285]: Accepted publickey for ceph-admin from 192.168.122.103 port 55346 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:21 np0005548788.localdomain systemd-logind[765]: New session 22 of user ceph-admin.
Dec 06 07:58:21 np0005548788.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Dec 06 07:58:21 np0005548788.localdomain sshd[26285]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:21 np0005548788.localdomain sudo[26289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 07:58:21 np0005548788.localdomain sudo[26289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:21 np0005548788.localdomain sudo[26289]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:21 np0005548788.localdomain sshd[26304]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:21 np0005548788.localdomain sshd[26304]: Accepted publickey for ceph-admin from 192.168.122.103 port 55352 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:21 np0005548788.localdomain systemd-logind[765]: New session 23 of user ceph-admin.
Dec 06 07:58:21 np0005548788.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Dec 06 07:58:22 np0005548788.localdomain sshd[26304]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:22 np0005548788.localdomain sudo[26308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Dec 06 07:58:22 np0005548788.localdomain sudo[26308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:22 np0005548788.localdomain sudo[26308]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:22 np0005548788.localdomain sshd[26323]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:22 np0005548788.localdomain sshd[26323]: Accepted publickey for ceph-admin from 192.168.122.103 port 55356 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:22 np0005548788.localdomain systemd-logind[765]: New session 24 of user ceph-admin.
Dec 06 07:58:22 np0005548788.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Dec 06 07:58:22 np0005548788.localdomain sshd[26323]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:22 np0005548788.localdomain sshd[26340]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:22 np0005548788.localdomain sshd[26340]: Accepted publickey for ceph-admin from 192.168.122.103 port 55358 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:22 np0005548788.localdomain systemd-logind[765]: New session 25 of user ceph-admin.
Dec 06 07:58:22 np0005548788.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Dec 06 07:58:22 np0005548788.localdomain sshd[26340]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:23 np0005548788.localdomain sudo[26344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Dec 06 07:58:23 np0005548788.localdomain sudo[26344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:23 np0005548788.localdomain sudo[26344]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:23 np0005548788.localdomain sshd[26359]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:23 np0005548788.localdomain sshd[26359]: Accepted publickey for ceph-admin from 192.168.122.103 port 55360 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:23 np0005548788.localdomain systemd-logind[765]: New session 26 of user ceph-admin.
Dec 06 07:58:23 np0005548788.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Dec 06 07:58:23 np0005548788.localdomain sshd[26359]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:23 np0005548788.localdomain sudo[26363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005548788.localdomain
Dec 06 07:58:23 np0005548788.localdomain sudo[26363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:23 np0005548788.localdomain sudo[26363]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548788.localdomain sudo[26398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 07:58:47 np0005548788.localdomain sudo[26398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:47 np0005548788.localdomain sudo[26398]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548788.localdomain sudo[26413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:47 np0005548788.localdomain sudo[26413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:47 np0005548788.localdomain sudo[26413]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548788.localdomain sudo[26428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 07:58:47 np0005548788.localdomain sudo[26428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:47 np0005548788.localdomain sudo[26428]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548788.localdomain sudo[26464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:47 np0005548788.localdomain sudo[26464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:47 np0005548788.localdomain sudo[26464]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548788.localdomain sudo[26479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 07:58:47 np0005548788.localdomain sudo[26479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:48 np0005548788.localdomain sudo[26479]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:48 np0005548788.localdomain sudo[26531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:48 np0005548788.localdomain sudo[26531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:48 np0005548788.localdomain sudo[26531]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:48 np0005548788.localdomain sudo[26546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 07:58:48 np0005548788.localdomain sudo[26546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:48 np0005548788.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26574 (sysctl)
Dec 06 07:58:48 np0005548788.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 06 07:58:48 np0005548788.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 06 07:58:49 np0005548788.localdomain sudo[26546]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:49 np0005548788.localdomain sudo[26596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:49 np0005548788.localdomain sudo[26596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:49 np0005548788.localdomain sudo[26596]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:49 np0005548788.localdomain sudo[26611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 07:58:49 np0005548788.localdomain sudo[26611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:49 np0005548788.localdomain sudo[26611]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:49 np0005548788.localdomain sudo[26644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:49 np0005548788.localdomain sudo[26644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:49 np0005548788.localdomain sudo[26644]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:49 np0005548788.localdomain sudo[26659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 07:58:49 np0005548788.localdomain sudo[26659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:53 np0005548788.localdomain kernel: VFS: idmapped mount is not enabled.
Dec 06 07:59:14 np0005548788.localdomain podman[26713]: 
Dec 06 07:59:14 np0005548788.localdomain podman[26713]: 2025-12-06 07:58:50.463161685 +0000 UTC m=+0.044253951 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:14 np0005548788.localdomain podman[26713]: 2025-12-06 07:59:14.673073056 +0000 UTC m=+24.254165282 container create a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_yalow, release=1763362218, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-type=git, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Dec 06 07:59:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck1730662974-merged.mount: Deactivated successfully.
Dec 06 07:59:14 np0005548788.localdomain systemd[1]: Created slice Slice /machine.
Dec 06 07:59:14 np0005548788.localdomain systemd[1]: Started libpod-conmon-a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab.scope.
Dec 06 07:59:14 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:14 np0005548788.localdomain podman[26713]: 2025-12-06 07:59:14.772034639 +0000 UTC m=+24.353126885 container init a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_yalow, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.41.4, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, ceph=True, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main)
Dec 06 07:59:14 np0005548788.localdomain podman[26713]: 2025-12-06 07:59:14.783282657 +0000 UTC m=+24.364374913 container start a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_yalow, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.buildah.version=1.41.4, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z)
Dec 06 07:59:14 np0005548788.localdomain podman[26713]: 2025-12-06 07:59:14.783589986 +0000 UTC m=+24.364682232 container attach a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_yalow, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, distribution-scope=public)
Dec 06 07:59:14 np0005548788.localdomain strange_yalow[26826]: 167 167
Dec 06 07:59:14 np0005548788.localdomain systemd[1]: libpod-a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab.scope: Deactivated successfully.
Dec 06 07:59:14 np0005548788.localdomain podman[26713]: 2025-12-06 07:59:14.786923023 +0000 UTC m=+24.368015229 container died a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_yalow, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, RELEASE=main, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4)
Dec 06 07:59:14 np0005548788.localdomain podman[26831]: 2025-12-06 07:59:14.882827738 +0000 UTC m=+0.083306089 container remove a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_yalow, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 07:59:14 np0005548788.localdomain systemd[1]: libpod-conmon-a1191f440efbc80d01f34197450b2c7f285df2b56a9da036860a6e032f9518ab.scope: Deactivated successfully.
Dec 06 07:59:15 np0005548788.localdomain podman[26852]: 
Dec 06 07:59:15 np0005548788.localdomain podman[26852]: 2025-12-06 07:59:15.080197259 +0000 UTC m=+0.052736018 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-298c13f8804599a3f80938b1bce5ea97fec96b013ac14d85787746db8a6ef186-merged.mount: Deactivated successfully.
Dec 06 07:59:18 np0005548788.localdomain podman[26852]: 2025-12-06 07:59:18.940307929 +0000 UTC m=+3.912846678 container create 063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_wilson, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, ceph=True, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 06 07:59:18 np0005548788.localdomain systemd[1]: Started libpod-conmon-063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d.scope.
Dec 06 07:59:18 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db214de53b562a26cb5c9ef8078107614df8b71a61a28f15191f8da9efeff0c5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db214de53b562a26cb5c9ef8078107614df8b71a61a28f15191f8da9efeff0c5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:19 np0005548788.localdomain podman[26852]: 2025-12-06 07:59:19.031514587 +0000 UTC m=+4.004053346 container init 063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_wilson, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, release=1763362218, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 07:59:19 np0005548788.localdomain podman[26852]: 2025-12-06 07:59:19.043833416 +0000 UTC m=+4.016372175 container start 063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_wilson, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 07:59:19 np0005548788.localdomain podman[26852]: 2025-12-06 07:59:19.044124464 +0000 UTC m=+4.016663213 container attach 063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_wilson, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git)
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]: [
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:     {
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         "available": false,
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         "ceph_device": false,
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         "lsm_data": {},
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         "lvs": [],
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         "path": "/dev/sr0",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         "rejected_reasons": [
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "Has a FileSystem",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "Insufficient space (<5GB)"
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         ],
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         "sys_api": {
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "actuators": null,
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "device_nodes": "sr0",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "human_readable_size": "482.00 KB",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "id_bus": "ata",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "model": "QEMU DVD-ROM",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "nr_requests": "2",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "partitions": {},
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "path": "/dev/sr0",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "removable": "1",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "rev": "2.5+",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "ro": "0",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "rotational": "1",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "sas_address": "",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "sas_device_handle": "",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "scheduler_mode": "mq-deadline",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "sectors": 0,
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "sectorsize": "2048",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "size": 493568.0,
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "support_discard": "0",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "type": "disk",
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:             "vendor": "QEMU"
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:         }
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]:     }
Dec 06 07:59:19 np0005548788.localdomain practical_wilson[27122]: ]
Dec 06 07:59:19 np0005548788.localdomain systemd[1]: libpod-063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d.scope: Deactivated successfully.
Dec 06 07:59:19 np0005548788.localdomain podman[26852]: 2025-12-06 07:59:19.917887075 +0000 UTC m=+4.890425834 container died 063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_wilson, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main)
Dec 06 07:59:19 np0005548788.localdomain systemd[1]: tmp-crun.GFWIfi.mount: Deactivated successfully.
Dec 06 07:59:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-db214de53b562a26cb5c9ef8078107614df8b71a61a28f15191f8da9efeff0c5-merged.mount: Deactivated successfully.
Dec 06 07:59:19 np0005548788.localdomain podman[28508]: 2025-12-06 07:59:19.998056451 +0000 UTC m=+0.069468665 container remove 063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_wilson, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: libpod-conmon-063ad3e568da84d02ea595d4ce8506c6225ae63cedc3acad36e9092ab65cd28d.scope: Deactivated successfully.
Dec 06 07:59:20 np0005548788.localdomain sudo[26659]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:20 np0005548788.localdomain sudo[28520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:59:20 np0005548788.localdomain sudo[28520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:20 np0005548788.localdomain sudo[28520]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:20 np0005548788.localdomain sudo[28535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 _orch set-coredump-overrides --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 --coredump-max-size=32G
Dec 06 07:59:20 np0005548788.localdomain sudo[28535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: Closed Process Core Dump Socket.
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: Stopping Process Core Dump Socket...
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: Listening on Process Core Dump Socket.
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:59:20 np0005548788.localdomain systemd-rc-local-generator[28586]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:20 np0005548788.localdomain systemd-sysv-generator[28593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:20 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:59:20 np0005548788.localdomain systemd-rc-local-generator[28627]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:20 np0005548788.localdomain systemd-sysv-generator[28630]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:21 np0005548788.localdomain sudo[28535]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:41 np0005548788.localdomain sudo[28639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:59:41 np0005548788.localdomain sudo[28639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:41 np0005548788.localdomain sudo[28639]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:41 np0005548788.localdomain sudo[28654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 07:59:41 np0005548788.localdomain sudo[28654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:41 np0005548788.localdomain podman[28711]: 
Dec 06 07:59:41 np0005548788.localdomain podman[28711]: 2025-12-06 07:59:41.823267171 +0000 UTC m=+0.069736030 container create 3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_jones, RELEASE=main, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, vcs-type=git)
Dec 06 07:59:41 np0005548788.localdomain systemd[1]: Started libpod-conmon-3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294.scope.
Dec 06 07:59:41 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:41 np0005548788.localdomain podman[28711]: 2025-12-06 07:59:41.890750434 +0000 UTC m=+0.137219283 container init 3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_jones, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, name=rhceph, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7)
Dec 06 07:59:41 np0005548788.localdomain podman[28711]: 2025-12-06 07:59:41.793991965 +0000 UTC m=+0.040460814 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:41 np0005548788.localdomain podman[28711]: 2025-12-06 07:59:41.900468341 +0000 UTC m=+0.146937200 container start 3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_jones, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, version=7, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, name=rhceph)
Dec 06 07:59:41 np0005548788.localdomain podman[28711]: 2025-12-06 07:59:41.900799916 +0000 UTC m=+0.147268775 container attach 3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_jones, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, distribution-scope=public, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.41.4)
Dec 06 07:59:41 np0005548788.localdomain eager_jones[28726]: 167 167
Dec 06 07:59:41 np0005548788.localdomain systemd[1]: libpod-3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294.scope: Deactivated successfully.
Dec 06 07:59:41 np0005548788.localdomain podman[28711]: 2025-12-06 07:59:41.904582654 +0000 UTC m=+0.151051543 container died 3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_jones, GIT_BRANCH=main, name=rhceph, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 07:59:41 np0005548788.localdomain podman[28731]: 2025-12-06 07:59:41.988985483 +0000 UTC m=+0.073669995 container remove 3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_jones, CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4)
Dec 06 07:59:41 np0005548788.localdomain systemd[1]: libpod-conmon-3ecf19c42f20f393d422535db007d916886d6302cf0e9eaa6e0574ec56ca3294.scope: Deactivated successfully.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:59:42 np0005548788.localdomain systemd-rc-local-generator[28771]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:42 np0005548788.localdomain systemd-sysv-generator[28777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:59:42 np0005548788.localdomain systemd-sysv-generator[28811]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:42 np0005548788.localdomain systemd-rc-local-generator[28806]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: Reached target All Ceph clusters and services.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:59:42 np0005548788.localdomain systemd-sysv-generator[28848]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:42 np0005548788.localdomain systemd-rc-local-generator[28845]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: Reached target Ceph cluster 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:59:42 np0005548788.localdomain systemd-rc-local-generator[28889]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:42 np0005548788.localdomain systemd-sysv-generator[28892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:42 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 07:59:43 np0005548788.localdomain systemd-rc-local-generator[28926]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:43 np0005548788.localdomain systemd-sysv-generator[28932]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: Created slice Slice /system/ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: Reached target System Time Set.
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: Reached target System Time Synchronized.
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: Starting Ceph crash.np0005548788 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:43 np0005548788.localdomain podman[28994]: 
Dec 06 07:59:43 np0005548788.localdomain podman[28994]: 2025-12-06 07:59:43.635756753 +0000 UTC m=+0.062592654 container create 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.tags=rhceph ceph)
Dec 06 07:59:43 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5229b104d766631a911ee231d428be07b3c90de2ece894a033c7983e42cf30be/merged/etc/ceph/ceph.client.crash.np0005548788.keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:43 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5229b104d766631a911ee231d428be07b3c90de2ece894a033c7983e42cf30be/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:43 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5229b104d766631a911ee231d428be07b3c90de2ece894a033c7983e42cf30be/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:43 np0005548788.localdomain podman[28994]: 2025-12-06 07:59:43.620733247 +0000 UTC m=+0.047569148 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:43 np0005548788.localdomain podman[28994]: 2025-12-06 07:59:43.729251479 +0000 UTC m=+0.156087380 container init 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7)
Dec 06 07:59:43 np0005548788.localdomain podman[28994]: 2025-12-06 07:59:43.740745419 +0000 UTC m=+0.167581310 container start 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, CEPH_POINT_RELEASE=, release=1763362218, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 07:59:43 np0005548788.localdomain bash[28994]: 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09
Dec 06 07:59:43 np0005548788.localdomain systemd[1]: Started Ceph crash.np0005548788 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 07:59:43 np0005548788.localdomain sudo[28654]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: INFO:ceph-crash:pinging cluster to exercise our key
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: 2025-12-06T07:59:43.923+0000 7f1477a8c640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: 2025-12-06T07:59:43.923+0000 7f1477a8c640 -1 AuthRegistry(0x7f1470067c70) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: 2025-12-06T07:59:43.925+0000 7f1477a8c640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: 2025-12-06T07:59:43.925+0000 7f1477a8c640 -1 AuthRegistry(0x7f1477a8b000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: 2025-12-06T07:59:43.932+0000 7f1475000640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: 2025-12-06T07:59:43.933+0000 7f1475801640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: 2025-12-06T07:59:43.936+0000 7f1476002640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: 2025-12-06T07:59:43.936+0000 7f1477a8c640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec 06 07:59:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788[29007]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec 06 07:59:44 np0005548788.localdomain systemd[1]: tmp-crun.o38sbZ.mount: Deactivated successfully.
Dec 06 07:59:47 np0005548788.localdomain sshd[29024]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:59:48 np0005548788.localdomain sshd[29024]: Received disconnect from 103.52.114.250 port 36852:11: Bye Bye [preauth]
Dec 06 07:59:48 np0005548788.localdomain sshd[29024]: Disconnected from authenticating user root 103.52.114.250 port 36852 [preauth]
Dec 06 07:59:52 np0005548788.localdomain sudo[29026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:59:52 np0005548788.localdomain sudo[29026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:52 np0005548788.localdomain sudo[29026]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:52 np0005548788.localdomain sudo[29041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Dec 06 07:59:52 np0005548788.localdomain sudo[29041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:53 np0005548788.localdomain podman[29094]: 
Dec 06 07:59:53 np0005548788.localdomain podman[29094]: 2025-12-06 07:59:53.147232324 +0000 UTC m=+0.134573528 container create 9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_clarke, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.component=rhceph-container, release=1763362218)
Dec 06 07:59:53 np0005548788.localdomain podman[29094]: 2025-12-06 07:59:53.057133847 +0000 UTC m=+0.044475051 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:53 np0005548788.localdomain systemd[1]: Started libpod-conmon-9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77.scope.
Dec 06 07:59:53 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:53 np0005548788.localdomain podman[29094]: 2025-12-06 07:59:53.226251438 +0000 UTC m=+0.213592642 container init 9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_clarke, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1763362218, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main)
Dec 06 07:59:53 np0005548788.localdomain podman[29094]: 2025-12-06 07:59:53.234261955 +0000 UTC m=+0.221603159 container start 9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_clarke, io.openshift.expose-services=, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True)
Dec 06 07:59:53 np0005548788.localdomain podman[29094]: 2025-12-06 07:59:53.234517587 +0000 UTC m=+0.221858791 container attach 9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_clarke, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, RELEASE=main)
Dec 06 07:59:53 np0005548788.localdomain systemd[1]: libpod-9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77.scope: Deactivated successfully.
Dec 06 07:59:53 np0005548788.localdomain epic_clarke[29109]: 167 167
Dec 06 07:59:53 np0005548788.localdomain podman[29094]: 2025-12-06 07:59:53.239709371 +0000 UTC m=+0.227050595 container died 9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_clarke, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, release=1763362218, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 07:59:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-945dd7e37817ad2d62179fe24d1415b56acf2c4171751d9fd2c12497cfa1f4d6-merged.mount: Deactivated successfully.
Dec 06 07:59:53 np0005548788.localdomain podman[29114]: 2025-12-06 07:59:53.335804199 +0000 UTC m=+0.082993884 container remove 9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_clarke, release=1763362218, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, ceph=True)
Dec 06 07:59:53 np0005548788.localdomain systemd[1]: libpod-conmon-9d8019c641d32600f7f5d85b412632367e2cf95f44d08cc67cb44157e89b5b77.scope: Deactivated successfully.
Dec 06 07:59:53 np0005548788.localdomain podman[29136]: 
Dec 06 07:59:53 np0005548788.localdomain podman[29136]: 2025-12-06 07:59:53.538042666 +0000 UTC m=+0.081326994 container create 444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_hertz, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 07:59:53 np0005548788.localdomain systemd[1]: Started libpod-conmon-444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041.scope.
Dec 06 07:59:53 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:53 np0005548788.localdomain podman[29136]: 2025-12-06 07:59:53.506679142 +0000 UTC m=+0.049963470 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:53 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a63b0167ec66f0f7806d5db7f63896d640075d6baa2dfa721e2225c9f2949c17/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a63b0167ec66f0f7806d5db7f63896d640075d6baa2dfa721e2225c9f2949c17/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a63b0167ec66f0f7806d5db7f63896d640075d6baa2dfa721e2225c9f2949c17/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a63b0167ec66f0f7806d5db7f63896d640075d6baa2dfa721e2225c9f2949c17/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a63b0167ec66f0f7806d5db7f63896d640075d6baa2dfa721e2225c9f2949c17/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548788.localdomain podman[29136]: 2025-12-06 07:59:53.667051291 +0000 UTC m=+0.210335599 container init 444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_hertz, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, release=1763362218, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Dec 06 07:59:53 np0005548788.localdomain podman[29136]: 2025-12-06 07:59:53.677135566 +0000 UTC m=+0.220419894 container start 444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_hertz, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, release=1763362218)
Dec 06 07:59:53 np0005548788.localdomain podman[29136]: 2025-12-06 07:59:53.677453691 +0000 UTC m=+0.220738069 container attach 444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_hertz, RELEASE=main, com.redhat.component=rhceph-container, release=1763362218, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z)
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: --> passed data devices: 0 physical, 2 LVM
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: --> relative data size: 1.0
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new c2e51dbc-fb9d-4d4f-bccd-00fbff3a3611
Dec 06 07:59:54 np0005548788.localdomain lvm[29206]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 07:59:54 np0005548788.localdomain lvm[29206]: VG ceph_vg0 finished
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec 06 07:59:54 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Dec 06 07:59:55 np0005548788.localdomain agitated_hertz[29152]:  stderr: got monmap epoch 3
Dec 06 07:59:55 np0005548788.localdomain agitated_hertz[29152]: --> Creating keyring file for osd.2
Dec 06 07:59:55 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Dec 06 07:59:55 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Dec 06 07:59:55 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid c2e51dbc-fb9d-4d4f-bccd-00fbff3a3611 --setuser ceph --setgroup ceph
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]:  stderr: 2025-12-06T07:59:55.371+0000 7ff232bd9a80 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]:  stderr: 2025-12-06T07:59:55.371+0000 7ff232bd9a80 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 07:59:57 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: --> ceph-volume lvm activate successful for osd ID: 2
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 0f10fd1b-8fb0-45b2-b45d-bb7bc8e56c93
Dec 06 07:59:58 np0005548788.localdomain lvm[30148]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 07:59:58 np0005548788.localdomain lvm[30148]: VG ceph_vg1 finished
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-5
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Dec 06 07:59:58 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-5/activate.monmap
Dec 06 07:59:59 np0005548788.localdomain agitated_hertz[29152]:  stderr: got monmap epoch 3
Dec 06 07:59:59 np0005548788.localdomain agitated_hertz[29152]: --> Creating keyring file for osd.5
Dec 06 07:59:59 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/keyring
Dec 06 07:59:59 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/
Dec 06 07:59:59 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 5 --monmap /var/lib/ceph/osd/ceph-5/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-5/ --osd-uuid 0f10fd1b-8fb0-45b2-b45d-bb7bc8e56c93 --setuser ceph --setgroup ceph
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]:  stderr: 2025-12-06T07:59:59.218+0000 7ff589312a80 -1 bluestore(/var/lib/ceph/osd/ceph-5//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]:  stderr: 2025-12-06T07:59:59.218+0000 7ff589312a80 -1 bluestore(/var/lib/ceph/osd/ceph-5/) _read_fsid unparsable uuid
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-5 --no-mon-config
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: --> ceph-volume lvm activate successful for osd ID: 5
Dec 06 08:00:01 np0005548788.localdomain agitated_hertz[29152]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Dec 06 08:00:01 np0005548788.localdomain systemd[1]: libpod-444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041.scope: Deactivated successfully.
Dec 06 08:00:01 np0005548788.localdomain systemd[1]: libpod-444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041.scope: Consumed 3.839s CPU time.
Dec 06 08:00:01 np0005548788.localdomain podman[29136]: 2025-12-06 08:00:01.838825687 +0000 UTC m=+8.382110025 container died 444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_hertz, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Dec 06 08:00:01 np0005548788.localdomain systemd[1]: tmp-crun.8nlKHx.mount: Deactivated successfully.
Dec 06 08:00:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a63b0167ec66f0f7806d5db7f63896d640075d6baa2dfa721e2225c9f2949c17-merged.mount: Deactivated successfully.
Dec 06 08:00:01 np0005548788.localdomain podman[31058]: 2025-12-06 08:00:01.956154533 +0000 UTC m=+0.102535552 container remove 444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_hertz, release=1763362218, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, RELEASE=main, version=7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container)
Dec 06 08:00:01 np0005548788.localdomain systemd[1]: libpod-conmon-444f9fa1cabb7ef8699fcc6231f05be2564f2a3f131716baec40edb890220041.scope: Deactivated successfully.
Dec 06 08:00:02 np0005548788.localdomain sudo[29041]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:02 np0005548788.localdomain sudo[31070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:02 np0005548788.localdomain sudo[31070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:02 np0005548788.localdomain sudo[31070]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:02 np0005548788.localdomain sudo[31085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- lvm list --format json
Dec 06 08:00:02 np0005548788.localdomain sudo[31085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:02 np0005548788.localdomain podman[31141]: 
Dec 06 08:00:02 np0005548788.localdomain podman[31141]: 2025-12-06 08:00:02.728476253 +0000 UTC m=+0.064915563 container create 69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_ellis, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, name=rhceph, version=7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.)
Dec 06 08:00:02 np0005548788.localdomain systemd[1]: Started libpod-conmon-69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1.scope.
Dec 06 08:00:02 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:02 np0005548788.localdomain podman[31141]: 2025-12-06 08:00:02.790816173 +0000 UTC m=+0.127255513 container init 69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_ellis, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, CEPH_POINT_RELEASE=, name=rhceph)
Dec 06 08:00:02 np0005548788.localdomain podman[31141]: 2025-12-06 08:00:02.800906858 +0000 UTC m=+0.137346158 container start 69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_ellis, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, release=1763362218, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:02 np0005548788.localdomain podman[31141]: 2025-12-06 08:00:02.801088366 +0000 UTC m=+0.137527716 container attach 69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_ellis, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:02 np0005548788.localdomain goofy_ellis[31156]: 167 167
Dec 06 08:00:02 np0005548788.localdomain systemd[1]: libpod-69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1.scope: Deactivated successfully.
Dec 06 08:00:02 np0005548788.localdomain podman[31141]: 2025-12-06 08:00:02.80477633 +0000 UTC m=+0.141215710 container died 69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_ellis, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.41.4, GIT_BRANCH=main, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, ceph=True)
Dec 06 08:00:02 np0005548788.localdomain podman[31141]: 2025-12-06 08:00:02.705972475 +0000 UTC m=+0.042411805 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:02 np0005548788.localdomain podman[31161]: 2025-12-06 08:00:02.895064384 +0000 UTC m=+0.081033471 container remove 69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_ellis, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-type=git, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, version=7)
Dec 06 08:00:02 np0005548788.localdomain systemd[1]: libpod-conmon-69f798334272dc52e0f9dad8a34f56a5999dd052185205214cd743eb57f17fa1.scope: Deactivated successfully.
Dec 06 08:00:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2773365ad430761aaf055a2d7cfb3240f5c2b71beb4a08f337f9d9ec0112d4c7-merged.mount: Deactivated successfully.
Dec 06 08:00:03 np0005548788.localdomain podman[31181]: 
Dec 06 08:00:03 np0005548788.localdomain podman[31181]: 2025-12-06 08:00:03.111578933 +0000 UTC m=+0.078980724 container create 126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bohr, CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1763362218, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, RELEASE=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 08:00:03 np0005548788.localdomain systemd[1]: Started libpod-conmon-126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196.scope.
Dec 06 08:00:03 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:03 np0005548788.localdomain podman[31181]: 2025-12-06 08:00:03.078901747 +0000 UTC m=+0.046303548 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:03 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eabc159f970d7933dbcc782a626c823496bcd3fd349d254b623c992f5c0c49a9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:03 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eabc159f970d7933dbcc782a626c823496bcd3fd349d254b623c992f5c0c49a9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:03 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eabc159f970d7933dbcc782a626c823496bcd3fd349d254b623c992f5c0c49a9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:03 np0005548788.localdomain podman[31181]: 2025-12-06 08:00:03.216890364 +0000 UTC m=+0.184292206 container init 126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bohr, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main)
Dec 06 08:00:03 np0005548788.localdomain podman[31181]: 2025-12-06 08:00:03.227465332 +0000 UTC m=+0.194867143 container start 126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bohr, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 08:00:03 np0005548788.localdomain podman[31181]: 2025-12-06 08:00:03.227787387 +0000 UTC m=+0.195189168 container attach 126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bohr, release=1763362218, io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=)
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]: {
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:     "2": [
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:         {
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "devices": [
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "/dev/loop3"
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             ],
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_name": "ceph_lv0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_size": "7511998464",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=25M8WV-2xWV-KUIs-dD6b-ijve-s62F-QFzzeh,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=1939e851-b10c-5c3b-9bb7-8e7f380233e8,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=c2e51dbc-fb9d-4d4f-bccd-00fbff3a3611,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_uuid": "25M8WV-2xWV-KUIs-dD6b-ijve-s62F-QFzzeh",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "name": "ceph_lv0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "tags": {
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.block_uuid": "25M8WV-2xWV-KUIs-dD6b-ijve-s62F-QFzzeh",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.cephx_lockbox_secret": "",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.cluster_fsid": "1939e851-b10c-5c3b-9bb7-8e7f380233e8",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.cluster_name": "ceph",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.crush_device_class": "",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.encrypted": "0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.osd_fsid": "c2e51dbc-fb9d-4d4f-bccd-00fbff3a3611",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.osd_id": "2",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.type": "block",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.vdo": "0"
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             },
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "type": "block",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "vg_name": "ceph_vg0"
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:         }
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:     ],
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:     "5": [
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:         {
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "devices": [
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "/dev/loop4"
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             ],
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_name": "ceph_lv1",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_size": "7511998464",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=obaY8C-n6ot-8uf1-vVlT-nBQt-sYC8-8LIeWM,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=1939e851-b10c-5c3b-9bb7-8e7f380233e8,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=0f10fd1b-8fb0-45b2-b45d-bb7bc8e56c93,ceph.osd_id=5,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "lv_uuid": "obaY8C-n6ot-8uf1-vVlT-nBQt-sYC8-8LIeWM",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "name": "ceph_lv1",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "tags": {
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.block_uuid": "obaY8C-n6ot-8uf1-vVlT-nBQt-sYC8-8LIeWM",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.cephx_lockbox_secret": "",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.cluster_fsid": "1939e851-b10c-5c3b-9bb7-8e7f380233e8",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.cluster_name": "ceph",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.crush_device_class": "",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.encrypted": "0",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.osd_fsid": "0f10fd1b-8fb0-45b2-b45d-bb7bc8e56c93",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.osd_id": "5",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.type": "block",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:                 "ceph.vdo": "0"
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             },
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "type": "block",
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:             "vg_name": "ceph_vg1"
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:         }
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]:     ]
Dec 06 08:00:03 np0005548788.localdomain affectionate_bohr[31197]: }
Dec 06 08:00:03 np0005548788.localdomain systemd[1]: libpod-126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196.scope: Deactivated successfully.
Dec 06 08:00:03 np0005548788.localdomain podman[31206]: 2025-12-06 08:00:03.657193155 +0000 UTC m=+0.054667921 container died 126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bohr, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True)
Dec 06 08:00:03 np0005548788.localdomain podman[31206]: 2025-12-06 08:00:03.684601913 +0000 UTC m=+0.082076679 container remove 126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bohr, io.buildah.version=1.41.4, name=rhceph, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, release=1763362218, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 08:00:03 np0005548788.localdomain systemd[1]: libpod-conmon-126a6579657d68d44bf8e5e3207a5cd17904f2a90df1fa26f7445811da706196.scope: Deactivated successfully.
Dec 06 08:00:03 np0005548788.localdomain sudo[31085]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:03 np0005548788.localdomain sudo[31220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:03 np0005548788.localdomain sudo[31220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:03 np0005548788.localdomain sudo[31220]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:03 np0005548788.localdomain sudo[31235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 08:00:03 np0005548788.localdomain sudo[31235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-eabc159f970d7933dbcc782a626c823496bcd3fd349d254b623c992f5c0c49a9-merged.mount: Deactivated successfully.
Dec 06 08:00:04 np0005548788.localdomain podman[31291]: 
Dec 06 08:00:04 np0005548788.localdomain podman[31291]: 2025-12-06 08:00:04.431900067 +0000 UTC m=+0.056678225 container create 3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_black, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z)
Dec 06 08:00:04 np0005548788.localdomain systemd[1]: Started libpod-conmon-3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa.scope.
Dec 06 08:00:04 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:04 np0005548788.localdomain podman[31291]: 2025-12-06 08:00:04.499290245 +0000 UTC m=+0.124068393 container init 3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_black, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, RELEASE=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:04 np0005548788.localdomain podman[31291]: 2025-12-06 08:00:04.509245052 +0000 UTC m=+0.134023211 container start 3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_black, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Dec 06 08:00:04 np0005548788.localdomain podman[31291]: 2025-12-06 08:00:04.509381619 +0000 UTC m=+0.134159777 container attach 3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_black, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=1763362218, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, version=7, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git)
Dec 06 08:00:04 np0005548788.localdomain stupefied_black[31306]: 167 167
Dec 06 08:00:04 np0005548788.localdomain systemd[1]: libpod-3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa.scope: Deactivated successfully.
Dec 06 08:00:04 np0005548788.localdomain podman[31291]: 2025-12-06 08:00:04.513149527 +0000 UTC m=+0.137927685 container died 3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_black, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2025-11-26T19:44:28Z, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 08:00:04 np0005548788.localdomain podman[31291]: 2025-12-06 08:00:04.417621156 +0000 UTC m=+0.042399324 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:04 np0005548788.localdomain podman[31311]: 2025-12-06 08:00:04.598828544 +0000 UTC m=+0.073707736 container remove 3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_black, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.41.4, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:04 np0005548788.localdomain systemd[1]: libpod-conmon-3a5429b44d11648cb6fffe2deb2863b7a3d8ff4f6f88f0cf759fdeee64054faa.scope: Deactivated successfully.
Dec 06 08:00:04 np0005548788.localdomain podman[31340]: 
Dec 06 08:00:04 np0005548788.localdomain podman[31340]: 2025-12-06 08:00:04.921553767 +0000 UTC m=+0.070060884 container create 1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True)
Dec 06 08:00:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e2cbe0618356e0b5ac0733f310f1b035ad650c9015b725797d26b9a4be471a7b-merged.mount: Deactivated successfully.
Dec 06 08:00:04 np0005548788.localdomain systemd[1]: Started libpod-conmon-1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676.scope.
Dec 06 08:00:04 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:04 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd7124b08b933513939e212f736ac702398ca642961df749bff8fc341965acae/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:04 np0005548788.localdomain podman[31340]: 2025-12-06 08:00:04.895981524 +0000 UTC m=+0.044488631 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:05 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd7124b08b933513939e212f736ac702398ca642961df749bff8fc341965acae/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:05 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd7124b08b933513939e212f736ac702398ca642961df749bff8fc341965acae/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:05 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd7124b08b933513939e212f736ac702398ca642961df749bff8fc341965acae/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:05 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd7124b08b933513939e212f736ac702398ca642961df749bff8fc341965acae/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:05 np0005548788.localdomain podman[31340]: 2025-12-06 08:00:05.051247765 +0000 UTC m=+0.199754872 container init 1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:00:05 np0005548788.localdomain podman[31340]: 2025-12-06 08:00:05.060420526 +0000 UTC m=+0.208927603 container start 1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 08:00:05 np0005548788.localdomain podman[31340]: 2025-12-06 08:00:05.060624845 +0000 UTC m=+0.209131982 container attach 1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test, vendor=Red Hat, Inc., version=7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, release=1763362218, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Dec 06 08:00:05 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test[31355]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 06 08:00:05 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test[31355]:                             [--no-systemd] [--no-tmpfs]
Dec 06 08:00:05 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test[31355]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 06 08:00:05 np0005548788.localdomain podman[31340]: 2025-12-06 08:00:05.268040176 +0000 UTC m=+0.416547273 container died 1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:05 np0005548788.localdomain systemd[1]: libpod-1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676.scope: Deactivated successfully.
Dec 06 08:00:05 np0005548788.localdomain systemd-journald[618]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Dec 06 08:00:05 np0005548788.localdomain systemd-journald[618]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 08:00:05 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:00:05 np0005548788.localdomain podman[31360]: 2025-12-06 08:00:05.371404516 +0000 UTC m=+0.093142710 container remove 1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate-test, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, release=1763362218, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph)
Dec 06 08:00:05 np0005548788.localdomain systemd[1]: libpod-conmon-1bc4c69c4bceafdac71adea030949a52eedec4a0277376cbbcfbc6fe8b259676.scope: Deactivated successfully.
Dec 06 08:00:05 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:00:05 np0005548788.localdomain systemd-sysv-generator[31421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:00:05 np0005548788.localdomain systemd-rc-local-generator[31416]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:00:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:00:05 np0005548788.localdomain systemd[1]: tmp-crun.5Jg3FU.mount: Deactivated successfully.
Dec 06 08:00:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bd7124b08b933513939e212f736ac702398ca642961df749bff8fc341965acae-merged.mount: Deactivated successfully.
Dec 06 08:00:05 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:00:06 np0005548788.localdomain systemd-sysv-generator[31463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:00:06 np0005548788.localdomain systemd-rc-local-generator[31460]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:00:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:00:06 np0005548788.localdomain systemd[1]: Starting Ceph osd.2 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 08:00:06 np0005548788.localdomain podman[31523]: 
Dec 06 08:00:06 np0005548788.localdomain podman[31523]: 2025-12-06 08:00:06.62814996 +0000 UTC m=+0.080965937 container create dde0d91b1cef3561e39787f088879e36a83f04a61d60c752d9fd319bfbaa7fa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, release=1763362218, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:06 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:06 np0005548788.localdomain podman[31523]: 2025-12-06 08:00:06.595727906 +0000 UTC m=+0.048543913 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:06 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d07c31c11d2b8d3dbbc79c06fa33efb71e1c5d77baf5a1063f25ec4080b6c177/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d07c31c11d2b8d3dbbc79c06fa33efb71e1c5d77baf5a1063f25ec4080b6c177/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d07c31c11d2b8d3dbbc79c06fa33efb71e1c5d77baf5a1063f25ec4080b6c177/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d07c31c11d2b8d3dbbc79c06fa33efb71e1c5d77baf5a1063f25ec4080b6c177/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d07c31c11d2b8d3dbbc79c06fa33efb71e1c5d77baf5a1063f25ec4080b6c177/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548788.localdomain podman[31523]: 2025-12-06 08:00:06.76705051 +0000 UTC m=+0.219866497 container init dde0d91b1cef3561e39787f088879e36a83f04a61d60c752d9fd319bfbaa7fa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, release=1763362218, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git)
Dec 06 08:00:06 np0005548788.localdomain systemd[1]: tmp-crun.qvThg2.mount: Deactivated successfully.
Dec 06 08:00:06 np0005548788.localdomain podman[31523]: 2025-12-06 08:00:06.782033655 +0000 UTC m=+0.234849632 container start dde0d91b1cef3561e39787f088879e36a83f04a61d60c752d9fd319bfbaa7fa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:00:06 np0005548788.localdomain podman[31523]: 2025-12-06 08:00:06.782399932 +0000 UTC m=+0.235215919 container attach dde0d91b1cef3561e39787f088879e36a83f04a61d60c752d9fd319bfbaa7fa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, ceph=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_CLEAN=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.openshift.expose-services=)
Dec 06 08:00:07 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate[31538]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 06 08:00:07 np0005548788.localdomain bash[31523]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 06 08:00:07 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate[31538]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec 06 08:00:07 np0005548788.localdomain bash[31523]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec 06 08:00:07 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate[31538]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec 06 08:00:07 np0005548788.localdomain bash[31523]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec 06 08:00:07 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate[31538]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 08:00:07 np0005548788.localdomain bash[31523]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 08:00:07 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate[31538]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:07 np0005548788.localdomain bash[31523]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:07 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate[31538]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 06 08:00:07 np0005548788.localdomain bash[31523]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 06 08:00:07 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate[31538]: --> ceph-volume raw activate successful for osd ID: 2
Dec 06 08:00:07 np0005548788.localdomain bash[31523]: --> ceph-volume raw activate successful for osd ID: 2
Dec 06 08:00:07 np0005548788.localdomain systemd[1]: libpod-dde0d91b1cef3561e39787f088879e36a83f04a61d60c752d9fd319bfbaa7fa7.scope: Deactivated successfully.
Dec 06 08:00:07 np0005548788.localdomain podman[31523]: 2025-12-06 08:00:07.598649717 +0000 UTC m=+1.051465724 container died dde0d91b1cef3561e39787f088879e36a83f04a61d60c752d9fd319bfbaa7fa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, vendor=Red Hat, Inc., release=1763362218, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Dec 06 08:00:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d07c31c11d2b8d3dbbc79c06fa33efb71e1c5d77baf5a1063f25ec4080b6c177-merged.mount: Deactivated successfully.
Dec 06 08:00:07 np0005548788.localdomain podman[31653]: 2025-12-06 08:00:07.689464567 +0000 UTC m=+0.078454090 container remove dde0d91b1cef3561e39787f088879e36a83f04a61d60c752d9fd319bfbaa7fa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2-activate, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, ceph=True, name=rhceph, vendor=Red Hat, Inc., release=1763362218, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git)
Dec 06 08:00:07 np0005548788.localdomain podman[31713]: 
Dec 06 08:00:08 np0005548788.localdomain podman[31713]: 2025-12-06 08:00:08.003876228 +0000 UTC m=+0.073170721 container create 82985aff03ff388f88e58a9a38b8d4f0c2476ab5e29ff0f8396a197b5ec591d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Dec 06 08:00:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa98dd7d044bf41e6a65460b6f23721834027e25749ff96c2c713a5550678574/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa98dd7d044bf41e6a65460b6f23721834027e25749ff96c2c713a5550678574/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548788.localdomain podman[31713]: 2025-12-06 08:00:07.975329496 +0000 UTC m=+0.044624039 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa98dd7d044bf41e6a65460b6f23721834027e25749ff96c2c713a5550678574/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa98dd7d044bf41e6a65460b6f23721834027e25749ff96c2c713a5550678574/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa98dd7d044bf41e6a65460b6f23721834027e25749ff96c2c713a5550678574/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548788.localdomain podman[31713]: 2025-12-06 08:00:08.113536834 +0000 UTC m=+0.182831317 container init 82985aff03ff388f88e58a9a38b8d4f0c2476ab5e29ff0f8396a197b5ec591d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, RELEASE=main, version=7, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64)
Dec 06 08:00:08 np0005548788.localdomain podman[31713]: 2025-12-06 08:00:08.123989345 +0000 UTC m=+0.193283838 container start 82985aff03ff388f88e58a9a38b8d4f0c2476ab5e29ff0f8396a197b5ec591d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Dec 06 08:00:08 np0005548788.localdomain bash[31713]: 82985aff03ff388f88e58a9a38b8d4f0c2476ab5e29ff0f8396a197b5ec591d1
Dec 06 08:00:08 np0005548788.localdomain systemd[1]: Started Ceph osd.2 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: pidfile_write: ignore empty --pid-file
Dec 06 08:00:08 np0005548788.localdomain sudo[31235]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) close
Dec 06 08:00:08 np0005548788.localdomain sudo[31744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:08 np0005548788.localdomain sudo[31744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:08 np0005548788.localdomain sudo[31744]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:08 np0005548788.localdomain sudo[31759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 08:00:08 np0005548788.localdomain sudo[31759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) close
Dec 06 08:00:08 np0005548788.localdomain systemd[1]: tmp-crun.n1YIaG.mount: Deactivated successfully.
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: load: jerasure load: lrc 
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) close
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:08 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) close
Dec 06 08:00:09 np0005548788.localdomain podman[31823]: 
Dec 06 08:00:09 np0005548788.localdomain podman[31823]: 2025-12-06 08:00:09.077067243 +0000 UTC m=+0.081214010 container create 16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_galois, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: Started libpod-conmon-16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37.scope.
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:09 np0005548788.localdomain podman[31823]: 2025-12-06 08:00:09.043577928 +0000 UTC m=+0.047724755 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:09 np0005548788.localdomain podman[31823]: 2025-12-06 08:00:09.144625699 +0000 UTC m=+0.148772516 container init 16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_galois, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1763362218, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Dec 06 08:00:09 np0005548788.localdomain podman[31823]: 2025-12-06 08:00:09.15464182 +0000 UTC m=+0.158788607 container start 16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_galois, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, ceph=True, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.component=rhceph-container, release=1763362218, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=)
Dec 06 08:00:09 np0005548788.localdomain podman[31823]: 2025-12-06 08:00:09.154909203 +0000 UTC m=+0.159056050 container attach 16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_galois, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1763362218, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc.)
Dec 06 08:00:09 np0005548788.localdomain interesting_galois[31842]: 167 167
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: libpod-16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37.scope: Deactivated successfully.
Dec 06 08:00:09 np0005548788.localdomain podman[31823]: 2025-12-06 08:00:09.157524946 +0000 UTC m=+0.161671733 container died 16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_galois, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=1763362218, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 08:00:09 np0005548788.localdomain podman[31847]: 2025-12-06 08:00:09.239871107 +0000 UTC m=+0.075917951 container remove 16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_galois, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: libpod-conmon-16adc5dc1ba69b01ee9643559010a9b4dddbdc8f92e1b41db347ff7e931a6c37.scope: Deactivated successfully.
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3590e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs mount
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs mount shared_bdev_used = 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: RocksDB version: 7.9.2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Git sha 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: DB SUMMARY
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: DB Session ID:  Q0ANI4D63878XQSYK17R
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: CURRENT file:  CURRENT
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                         Options.error_if_exists: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.create_if_missing: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                                     Options.env: 0x5648d3824cb0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                                Options.info_log: 0x5648d4522b80
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                              Options.statistics: (nil)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.use_fsync: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                              Options.db_log_dir: 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.write_buffer_manager: 0x5648d357a140
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.unordered_write: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.row_cache: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                              Options.wal_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.two_write_queues: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.wal_compression: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.atomic_flush: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.max_background_jobs: 4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.max_background_compactions: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.max_subcompactions: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.max_open_files: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Compression algorithms supported:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kZSTD supported: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kXpressCompression supported: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kBZip2Compression supported: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kLZ4Compression supported: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kZlibCompression supported: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kSnappyCompression supported: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3568850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3568850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3568850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3568850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3568850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3568850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3568850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4522f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4a13d298-1e87-448d-a7e5-5579c9279581
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009295690, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009295957, "job": 1, "event": "recovery_finished"}
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: freelist init
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: freelist _read_cfg
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs umount
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) close
Dec 06 08:00:09 np0005548788.localdomain podman[32070]: 
Dec 06 08:00:09 np0005548788.localdomain podman[32070]: 2025-12-06 08:00:09.496163005 +0000 UTC m=+0.075967152 container create 38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bdev(0x5648d3591180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs mount
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluefs mount shared_bdev_used = 4718592
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: RocksDB version: 7.9.2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Git sha 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: DB SUMMARY
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: DB Session ID:  Q0ANI4D63878XQSYK17Q
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: CURRENT file:  CURRENT
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                         Options.error_if_exists: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.create_if_missing: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                                     Options.env: 0x5648d36b6690
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                                Options.info_log: 0x5648d4544780
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                              Options.statistics: (nil)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.use_fsync: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: Started libpod-conmon-38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b.scope.
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                              Options.db_log_dir: 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.write_buffer_manager: 0x5648d357b5e0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.unordered_write: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.row_cache: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                              Options.wal_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.two_write_queues: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.wal_compression: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.atomic_flush: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.max_background_jobs: 4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.max_background_compactions: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.max_subcompactions: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.max_open_files: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Compression algorithms supported:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kZSTD supported: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kXpressCompression supported: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kBZip2Compression supported: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kLZ4Compression supported: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kZlibCompression supported: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         kSnappyCompression supported: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d45449e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d45449e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d45449e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d45449e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d45449e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d45449e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c2cc268e50aa0d5dc2f1b22585da985a43a38a33c01425b344156db549406f5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 06 08:00:09 np0005548788.localdomain podman[32070]: 2025-12-06 08:00:09.464170142 +0000 UTC m=+0.043974289 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d45449e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d35682d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4545d00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3569610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4545d00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3569610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c2cc268e50aa0d5dc2f1b22585da985a43a38a33c01425b344156db549406f5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c2cc268e50aa0d5dc2f1b22585da985a43a38a33c01425b344156db549406f5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648d4545d00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x5648d3569610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4a13d298-1e87-448d-a7e5-5579c9279581
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009581825, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009591612, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008009, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4a13d298-1e87-448d-a7e5-5579c9279581", "db_session_id": "Q0ANI4D63878XQSYK17Q", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c2cc268e50aa0d5dc2f1b22585da985a43a38a33c01425b344156db549406f5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009596869, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1608, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 467, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008009, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4a13d298-1e87-448d-a7e5-5579c9279581", "db_session_id": "Q0ANI4D63878XQSYK17Q", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009601295, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008009, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4a13d298-1e87-448d-a7e5-5579c9279581", "db_session_id": "Q0ANI4D63878XQSYK17Q", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009607001, "job": 1, "event": "recovery_finished"}
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 06 08:00:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c2cc268e50aa0d5dc2f1b22585da985a43a38a33c01425b344156db549406f5/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:09 np0005548788.localdomain podman[32070]: 2025-12-06 08:00:09.614013287 +0000 UTC m=+0.193817414 container init 38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, version=7)
Dec 06 08:00:09 np0005548788.localdomain podman[32070]: 2025-12-06 08:00:09.624993872 +0000 UTC m=+0.204797999 container start 38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-11-26T19:44:28Z, version=7)
Dec 06 08:00:09 np0005548788.localdomain podman[32070]: 2025-12-06 08:00:09.625320878 +0000 UTC m=+0.205125035 container attach 38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, release=1763362218, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph)
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5648d35d0700
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: DB pointer 0x5648d447fa00
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: _get_class not permitted to load lua
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: _get_class not permitted to load sdk
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: _get_class not permitted to load test_remote_reads
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: osd.2 0 load_pgs
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: osd.2 0 load_pgs opened 0 pgs
Dec 06 08:00:09 np0005548788.localdomain ceph-osd[31731]: osd.2 0 log_to_monitors true
Dec 06 08:00:09 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2[31727]: 2025-12-06T08:00:09.650+0000 7f5593cd6a80 -1 osd.2 0 log_to_monitors true
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6f1158686c9f17fe930cb8ea2d21363a627997a81092eeef13c7dbb6a82ec2ca-merged.mount: Deactivated successfully.
Dec 06 08:00:09 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test[32088]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 06 08:00:09 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test[32088]:                             [--no-systemd] [--no-tmpfs]
Dec 06 08:00:09 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test[32088]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: libpod-38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b.scope: Deactivated successfully.
Dec 06 08:00:09 np0005548788.localdomain podman[32070]: 2025-12-06 08:00:09.871519452 +0000 UTC m=+0.451323659 container died 38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1763362218, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: tmp-crun.Zpleid.mount: Deactivated successfully.
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-1c2cc268e50aa0d5dc2f1b22585da985a43a38a33c01425b344156db549406f5-merged.mount: Deactivated successfully.
Dec 06 08:00:09 np0005548788.localdomain podman[32306]: 2025-12-06 08:00:09.972534822 +0000 UTC m=+0.088236010 container remove 38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate-test, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph)
Dec 06 08:00:09 np0005548788.localdomain systemd[1]: libpod-conmon-38c975faccc94e7c68152cf2790e1e5f2d33c1de7916b3aa017fc175ac61e61b.scope: Deactivated successfully.
Dec 06 08:00:10 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:00:10 np0005548788.localdomain systemd-rc-local-generator[32363]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:00:10 np0005548788.localdomain systemd-sysv-generator[32367]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:00:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:00:10 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:00:10 np0005548788.localdomain systemd-rc-local-generator[32403]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:00:10 np0005548788.localdomain systemd-sysv-generator[32406]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:00:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: osd.2 0 done with init, starting boot process
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: osd.2 0 start_boot
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 06 08:00:10 np0005548788.localdomain ceph-osd[31731]: osd.2 0  bench count 12288000 bsize 4 KiB
Dec 06 08:00:10 np0005548788.localdomain systemd[1]: Starting Ceph osd.5 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 08:00:11 np0005548788.localdomain podman[32465]: 
Dec 06 08:00:11 np0005548788.localdomain podman[32465]: 2025-12-06 08:00:11.099883023 +0000 UTC m=+0.075781674 container create 47f5341a0aeb3ef721caa500271cf485b77e44905f6f86f979108261cd139120 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Dec 06 08:00:11 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:11 np0005548788.localdomain podman[32465]: 2025-12-06 08:00:11.070914641 +0000 UTC m=+0.046813262 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:11 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/499ad56a10c431abdf77ab7a5eb4101f87087753c32fd464e80700cf8666a883/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/499ad56a10c431abdf77ab7a5eb4101f87087753c32fd464e80700cf8666a883/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/499ad56a10c431abdf77ab7a5eb4101f87087753c32fd464e80700cf8666a883/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/499ad56a10c431abdf77ab7a5eb4101f87087753c32fd464e80700cf8666a883/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/499ad56a10c431abdf77ab7a5eb4101f87087753c32fd464e80700cf8666a883/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548788.localdomain podman[32465]: 2025-12-06 08:00:11.239523508 +0000 UTC m=+0.215422139 container init 47f5341a0aeb3ef721caa500271cf485b77e44905f6f86f979108261cd139120 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z)
Dec 06 08:00:11 np0005548788.localdomain podman[32465]: 2025-12-06 08:00:11.261589445 +0000 UTC m=+0.237488106 container start 47f5341a0aeb3ef721caa500271cf485b77e44905f6f86f979108261cd139120 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, release=1763362218, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 08:00:11 np0005548788.localdomain podman[32465]: 2025-12-06 08:00:11.262144791 +0000 UTC m=+0.238043422 container attach 47f5341a0aeb3ef721caa500271cf485b77e44905f6f86f979108261cd139120 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=)
Dec 06 08:00:11 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate[32479]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Dec 06 08:00:11 np0005548788.localdomain bash[32465]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Dec 06 08:00:11 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate[32479]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec 06 08:00:11 np0005548788.localdomain bash[32465]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec 06 08:00:11 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate[32479]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec 06 08:00:11 np0005548788.localdomain bash[32465]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec 06 08:00:11 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate[32479]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 06 08:00:11 np0005548788.localdomain bash[32465]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 06 08:00:11 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate[32479]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:11 np0005548788.localdomain bash[32465]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:11 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate[32479]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Dec 06 08:00:11 np0005548788.localdomain bash[32465]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Dec 06 08:00:11 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate[32479]: --> ceph-volume raw activate successful for osd ID: 5
Dec 06 08:00:11 np0005548788.localdomain bash[32465]: --> ceph-volume raw activate successful for osd ID: 5
Dec 06 08:00:11 np0005548788.localdomain systemd[1]: libpod-47f5341a0aeb3ef721caa500271cf485b77e44905f6f86f979108261cd139120.scope: Deactivated successfully.
Dec 06 08:00:11 np0005548788.localdomain podman[32465]: 2025-12-06 08:00:11.981624557 +0000 UTC m=+0.957523198 container died 47f5341a0aeb3ef721caa500271cf485b77e44905f6f86f979108261cd139120 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate, GIT_BRANCH=main, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-499ad56a10c431abdf77ab7a5eb4101f87087753c32fd464e80700cf8666a883-merged.mount: Deactivated successfully.
Dec 06 08:00:12 np0005548788.localdomain podman[32610]: 2025-12-06 08:00:12.144987137 +0000 UTC m=+0.151081954 container remove 47f5341a0aeb3ef721caa500271cf485b77e44905f6f86f979108261cd139120 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5-activate, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, name=rhceph, ceph=True, version=7, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, vcs-type=git)
Dec 06 08:00:12 np0005548788.localdomain podman[32672]: 
Dec 06 08:00:12 np0005548788.localdomain podman[32672]: 2025-12-06 08:00:12.434200413 +0000 UTC m=+0.050509205 container create b43a598bce0147bece1221bc6e837dc7aa935201b03970150cbe028060bcae4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7)
Dec 06 08:00:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbaf36c5f97459d5c6f1d99f2acb8b40b0dab0cfa037e0d23965d9302f17a861/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:12 np0005548788.localdomain podman[32672]: 2025-12-06 08:00:12.415275243 +0000 UTC m=+0.031584045 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbaf36c5f97459d5c6f1d99f2acb8b40b0dab0cfa037e0d23965d9302f17a861/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbaf36c5f97459d5c6f1d99f2acb8b40b0dab0cfa037e0d23965d9302f17a861/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbaf36c5f97459d5c6f1d99f2acb8b40b0dab0cfa037e0d23965d9302f17a861/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbaf36c5f97459d5c6f1d99f2acb8b40b0dab0cfa037e0d23965d9302f17a861/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:12 np0005548788.localdomain podman[32672]: 2025-12-06 08:00:12.570633618 +0000 UTC m=+0.186942430 container init b43a598bce0147bece1221bc6e837dc7aa935201b03970150cbe028060bcae4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5, version=7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_CLEAN=True)
Dec 06 08:00:12 np0005548788.localdomain podman[32672]: 2025-12-06 08:00:12.600539434 +0000 UTC m=+0.216848246 container start b43a598bce0147bece1221bc6e837dc7aa935201b03970150cbe028060bcae4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, distribution-scope=public, ceph=True)
Dec 06 08:00:12 np0005548788.localdomain bash[32672]: b43a598bce0147bece1221bc6e837dc7aa935201b03970150cbe028060bcae4c
Dec 06 08:00:12 np0005548788.localdomain systemd[1]: Started Ceph osd.5 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: pidfile_write: ignore empty --pid-file
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) close
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) close
Dec 06 08:00:12 np0005548788.localdomain sudo[31759]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:12 np0005548788.localdomain sudo[32706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:12 np0005548788.localdomain sudo[32706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:12 np0005548788.localdomain sudo[32706]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:12 np0005548788.localdomain sudo[32721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- raw list --format json
Dec 06 08:00:12 np0005548788.localdomain sudo[32721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: starting osd.5 osd_data /var/lib/ceph/osd/ceph-5 /var/lib/ceph/osd/ceph-5/journal
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: load: jerasure load: lrc 
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:12 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) close
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) close
Dec 06 08:00:13 np0005548788.localdomain podman[32782]: 
Dec 06 08:00:13 np0005548788.localdomain podman[32782]: 2025-12-06 08:00:13.460442421 +0000 UTC m=+0.091243421 container create fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_shamir, description=Red Hat Ceph Storage 7, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: osd.5:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2ae00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs mount
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs mount shared_bdev_used = 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: RocksDB version: 7.9.2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Git sha 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: DB SUMMARY
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: DB Session ID:  TT655CN6CX0UV0EMT2BH
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: CURRENT file:  CURRENT
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                         Options.error_if_exists: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.create_if_missing: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                                     Options.env: 0x55c5371becb0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                                Options.info_log: 0x55c537ec8860
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                              Options.statistics: (nil)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.use_fsync: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                              Options.db_log_dir: 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.write_buffer_manager: 0x55c536f14140
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.unordered_write: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.row_cache: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                              Options.wal_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.two_write_queues: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.wal_compression: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.atomic_flush: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.max_background_jobs: 4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.max_background_compactions: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.max_subcompactions: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.max_open_files: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Compression algorithms supported:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kZSTD supported: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kXpressCompression supported: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kBZip2Compression supported: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kLZ4Compression supported: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kZlibCompression supported: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kSnappyCompression supported: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8a20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f02850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8a20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f02850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8a20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f02850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8a20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f02850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8a20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f02850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8a20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f02850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8a20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f02850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8c40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8c40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537ec8c40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0a726209-ec9e-460a-b4bb-25ecccb9270f
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008013492141, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008013492406, "job": 1, "event": "recovery_finished"}
Dec 06 08:00:13 np0005548788.localdomain podman[32782]: 2025-12-06 08:00:13.412079067 +0000 UTC m=+0.042880077 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old nid_max 1025
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old blobid_max 10240
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta min_alloc_size 0x1000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: freelist init
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: freelist _read_cfg
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs umount
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) close
Dec 06 08:00:13 np0005548788.localdomain systemd[1]: Started libpod-conmon-fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971.scope.
Dec 06 08:00:13 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:13 np0005548788.localdomain podman[32782]: 2025-12-06 08:00:13.592742431 +0000 UTC m=+0.223543441 container init fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_shamir, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 08:00:13 np0005548788.localdomain pensive_shamir[32992]: 167 167
Dec 06 08:00:13 np0005548788.localdomain systemd[1]: libpod-fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971.scope: Deactivated successfully.
Dec 06 08:00:13 np0005548788.localdomain podman[32782]: 2025-12-06 08:00:13.624731244 +0000 UTC m=+0.255532244 container start fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_shamir, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, version=7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, name=rhceph, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Dec 06 08:00:13 np0005548788.localdomain podman[32782]: 2025-12-06 08:00:13.624995037 +0000 UTC m=+0.255796047 container attach fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_shamir, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:00:13 np0005548788.localdomain podman[32782]: 2025-12-06 08:00:13.626230225 +0000 UTC m=+0.257031235 container died fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_shamir, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, release=1763362218, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.component=rhceph-container, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7)
Dec 06 08:00:13 np0005548788.localdomain systemd[1]: tmp-crun.kS2lEY.mount: Deactivated successfully.
Dec 06 08:00:13 np0005548788.localdomain podman[32997]: 2025-12-06 08:00:13.731215021 +0000 UTC m=+0.108371476 container remove fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_shamir, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_BRANCH=main, release=1763362218, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:00:13 np0005548788.localdomain systemd[1]: libpod-conmon-fead019ea0c8ce098473cc10de81bab51c36a5193721eca6057af34af2371971.scope: Deactivated successfully.
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bdev(0x55c536f2b180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs mount
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluefs mount shared_bdev_used = 4718592
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: RocksDB version: 7.9.2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Git sha 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: DB SUMMARY
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: DB Session ID:  TT655CN6CX0UV0EMT2BG
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: CURRENT file:  CURRENT
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                         Options.error_if_exists: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.create_if_missing: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                                     Options.env: 0x55c536f4e690
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                                Options.info_log: 0x55c537d5c460
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                              Options.statistics: (nil)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.use_fsync: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                              Options.db_log_dir: 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.write_buffer_manager: 0x55c536f155e0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.unordered_write: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.row_cache: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                              Options.wal_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.two_write_queues: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.wal_compression: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.atomic_flush: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.max_background_jobs: 4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.max_background_compactions: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.max_subcompactions: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.max_open_files: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Compression algorithms supported:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kZSTD supported: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kXpressCompression supported: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kBZip2Compression supported: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kLZ4Compression supported: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kZlibCompression supported: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         kSnappyCompression supported: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5c6c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5c6c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5c6c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5c6c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5c6c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5c6c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5c6c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f022d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5d7e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f03610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5d7e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f03610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c537d5d7e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55c536f03610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0a726209-ec9e-460a-b4bb-25ecccb9270f
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008013767675, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008013791197, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008013, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0a726209-ec9e-460a-b4bb-25ecccb9270f", "db_session_id": "TT655CN6CX0UV0EMT2BG", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008013819724, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008013, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0a726209-ec9e-460a-b4bb-25ecccb9270f", "db_session_id": "TT655CN6CX0UV0EMT2BG", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008013824392, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008013, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0a726209-ec9e-460a-b4bb-25ecccb9270f", "db_session_id": "TT655CN6CX0UV0EMT2BG", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008013828180, "job": 1, "event": "recovery_finished"}
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55c537d56700
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: DB pointer 0x55c537e19a00
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super from 4, latest 4
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super done
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: _get_class not permitted to load lua
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: _get_class not permitted to load sdk
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: _get_class not permitted to load test_remote_reads
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: osd.5 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: osd.5 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: osd.5 0 load_pgs
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: osd.5 0 load_pgs opened 0 pgs
Dec 06 08:00:13 np0005548788.localdomain ceph-osd[32690]: osd.5 0 log_to_monitors true
Dec 06 08:00:13 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5[32686]: 2025-12-06T08:00:13.913+0000 7f46e9993a80 -1 osd.5 0 log_to_monitors true
Dec 06 08:00:13 np0005548788.localdomain podman[33198]: 
Dec 06 08:00:13 np0005548788.localdomain podman[33198]: 2025-12-06 08:00:13.931276427 +0000 UTC m=+0.084075414 container create 13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_wilbur, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1763362218, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:00:13 np0005548788.localdomain systemd[1]: Started libpod-conmon-13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf.scope.
Dec 06 08:00:14 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:14 np0005548788.localdomain podman[33198]: 2025-12-06 08:00:13.911245404 +0000 UTC m=+0.064044361 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:14 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/300b2a4d181b7059a591717ac6a4e625adb2f712c32fbbfb3c6c04aba7c39d1e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:14 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/300b2a4d181b7059a591717ac6a4e625adb2f712c32fbbfb3c6c04aba7c39d1e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:14 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/300b2a4d181b7059a591717ac6a4e625adb2f712c32fbbfb3c6c04aba7c39d1e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:14 np0005548788.localdomain podman[33198]: 2025-12-06 08:00:14.054632696 +0000 UTC m=+0.207431653 container init 13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_wilbur, io.openshift.expose-services=, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Dec 06 08:00:14 np0005548788.localdomain podman[33198]: 2025-12-06 08:00:14.066667962 +0000 UTC m=+0.219466929 container start 13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_wilbur, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, version=7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, distribution-scope=public, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 08:00:14 np0005548788.localdomain podman[33198]: 2025-12-06 08:00:14.066802578 +0000 UTC m=+0.219601535 container attach 13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_wilbur, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, version=7, vcs-type=git, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main)
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 24.155 iops: 6183.646 elapsed_sec: 0.485
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [WRN] : OSD bench result of 6183.646281 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 0 waiting for initial osdmap
Dec 06 08:00:14 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2[31727]: 2025-12-06T08:00:14.391+0000 7f558fc55640 -1 osd.2 0 waiting for initial osdmap
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 11 crush map has features 288514050185494528, adjusting msgr requires for clients
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 11 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 11 crush map has features 3314932999778484224, adjusting msgr requires for osds
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 11 check_osdmap_features require_osd_release unknown -> reef
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 11 set_numa_affinity not setting numa affinity
Dec 06 08:00:14 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-2[31727]: 2025-12-06T08:00:14.407+0000 7f558b27f640 -1 osd.2 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 11 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Dec 06 08:00:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-1e4bff1d1989516ad1a851f255ebef7577d80ae8dcfc34af47d5220715bed47b-merged.mount: Deactivated successfully.
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]: {
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:     "0f10fd1b-8fb0-45b2-b45d-bb7bc8e56c93": {
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "ceph_fsid": "1939e851-b10c-5c3b-9bb7-8e7f380233e8",
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "osd_id": 5,
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "osd_uuid": "0f10fd1b-8fb0-45b2-b45d-bb7bc8e56c93",
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "type": "bluestore"
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:     },
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:     "c2e51dbc-fb9d-4d4f-bccd-00fbff3a3611": {
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "ceph_fsid": "1939e851-b10c-5c3b-9bb7-8e7f380233e8",
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "osd_id": 2,
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "osd_uuid": "c2e51dbc-fb9d-4d4f-bccd-00fbff3a3611",
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:         "type": "bluestore"
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]:     }
Dec 06 08:00:14 np0005548788.localdomain eloquent_wilbur[33246]: }
Dec 06 08:00:14 np0005548788.localdomain podman[33198]: 2025-12-06 08:00:14.638473864 +0000 UTC m=+0.791272811 container died 13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_wilbur, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, distribution-scope=public, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Dec 06 08:00:14 np0005548788.localdomain systemd[1]: libpod-13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf.scope: Deactivated successfully.
Dec 06 08:00:14 np0005548788.localdomain systemd[1]: tmp-crun.WQAAv6.mount: Deactivated successfully.
Dec 06 08:00:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-300b2a4d181b7059a591717ac6a4e625adb2f712c32fbbfb3c6c04aba7c39d1e-merged.mount: Deactivated successfully.
Dec 06 08:00:14 np0005548788.localdomain podman[33283]: 2025-12-06 08:00:14.712341677 +0000 UTC m=+0.061775256 container remove 13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_wilbur, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, version=7, com.redhat.component=rhceph-container, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1763362218, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:14 np0005548788.localdomain systemd[1]: libpod-conmon-13367e06a726e45acb1562d8ac3167c9789e40aa83ab338e91a51869b7869edf.scope: Deactivated successfully.
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[31731]: osd.2 12 state: booting -> active
Dec 06 08:00:14 np0005548788.localdomain sudo[32721]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 06 08:00:14 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 06 08:00:15 np0005548788.localdomain sudo[33296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:00:15 np0005548788.localdomain sudo[33296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:15 np0005548788.localdomain sudo[33296]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:15 np0005548788.localdomain sshd[33311]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:00:15 np0005548788.localdomain sudo[33313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:15 np0005548788.localdomain sudo[33313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:15 np0005548788.localdomain sudo[33313]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:15 np0005548788.localdomain sudo[33328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:00:15 np0005548788.localdomain sudo[33328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:15 np0005548788.localdomain ceph-osd[32690]: osd.5 0 done with init, starting boot process
Dec 06 08:00:15 np0005548788.localdomain ceph-osd[32690]: osd.5 0 start_boot
Dec 06 08:00:15 np0005548788.localdomain ceph-osd[32690]: osd.5 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 06 08:00:15 np0005548788.localdomain ceph-osd[32690]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 06 08:00:15 np0005548788.localdomain ceph-osd[32690]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 06 08:00:15 np0005548788.localdomain ceph-osd[32690]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 06 08:00:15 np0005548788.localdomain ceph-osd[32690]: osd.5 0  bench count 12288000 bsize 4 KiB
Dec 06 08:00:16 np0005548788.localdomain systemd[1]: tmp-crun.d11g9V.mount: Deactivated successfully.
Dec 06 08:00:16 np0005548788.localdomain podman[33411]: 2025-12-06 08:00:16.228107319 +0000 UTC m=+0.092411016 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, version=7, name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:00:16 np0005548788.localdomain podman[33411]: 2025-12-06 08:00:16.350837528 +0000 UTC m=+0.215141225 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public)
Dec 06 08:00:16 np0005548788.localdomain sudo[33328]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:16 np0005548788.localdomain sudo[33477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:16 np0005548788.localdomain sudo[33477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:16 np0005548788.localdomain sudo[33477]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:16 np0005548788.localdomain ceph-osd[31731]: osd.2 14 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 06 08:00:16 np0005548788.localdomain ceph-osd[31731]: osd.2 14 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Dec 06 08:00:16 np0005548788.localdomain ceph-osd[31731]: osd.2 14 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 06 08:00:16 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 14 pg[1.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=14) [2] r=0 lpr=14 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:00:16 np0005548788.localdomain sudo[33492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:00:16 np0005548788.localdomain sudo[33492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:17 np0005548788.localdomain sshd[33527]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:00:17 np0005548788.localdomain sudo[33492]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:17 np0005548788.localdomain sudo[33541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:17 np0005548788.localdomain sudo[33541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:17 np0005548788.localdomain sudo[33541]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:17 np0005548788.localdomain sudo[33556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 08:00:17 np0005548788.localdomain sudo[33556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:17 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 15 pg[1.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=14) [2] r=0 lpr=14 crt=0'0 mlcod 0'0 undersized+peered mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:00:18 np0005548788.localdomain podman[33609]: 
Dec 06 08:00:18 np0005548788.localdomain podman[33609]: 2025-12-06 08:00:18.282610738 +0000 UTC m=+0.065901380 container create 1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_stonebraker, com.redhat.component=rhceph-container, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, GIT_BRANCH=main, architecture=x86_64)
Dec 06 08:00:18 np0005548788.localdomain podman[33609]: 2025-12-06 08:00:18.251454913 +0000 UTC m=+0.034745575 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:18 np0005548788.localdomain systemd[1]: Started libpod-conmon-1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7.scope.
Dec 06 08:00:18 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:18 np0005548788.localdomain podman[33609]: 2025-12-06 08:00:18.442168299 +0000 UTC m=+0.225458921 container init 1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_stonebraker, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, version=7, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Dec 06 08:00:18 np0005548788.localdomain busy_stonebraker[33625]: 167 167
Dec 06 08:00:18 np0005548788.localdomain systemd[1]: libpod-1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7.scope: Deactivated successfully.
Dec 06 08:00:18 np0005548788.localdomain podman[33609]: 2025-12-06 08:00:18.542561519 +0000 UTC m=+0.325852151 container start 1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_stonebraker, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:00:18 np0005548788.localdomain podman[33609]: 2025-12-06 08:00:18.542972479 +0000 UTC m=+0.326263291 container attach 1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_stonebraker, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=1763362218, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.41.4, RELEASE=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container)
Dec 06 08:00:18 np0005548788.localdomain podman[33609]: 2025-12-06 08:00:18.546334306 +0000 UTC m=+0.329624928 container died 1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_stonebraker, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Dec 06 08:00:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f6255e43fa2434cbfce2631424de26e90b6f22f6b73f6b8612b16ffb59ba4302-merged.mount: Deactivated successfully.
Dec 06 08:00:18 np0005548788.localdomain podman[33630]: 2025-12-06 08:00:18.740558547 +0000 UTC m=+0.268962066 container remove 1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_stonebraker, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, release=1763362218, RELEASE=main, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 08:00:18 np0005548788.localdomain systemd[1]: libpod-conmon-1f5286af179cef8d8f0b06894c72d67970826edc4724762b386fbcac37ab30c7.scope: Deactivated successfully.
Dec 06 08:00:18 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=16 pruub=14.969466209s) [3,4,2] r=2 lpr=16 pi=[14,16)/0 crt=0'0 mlcod 0'0 peered pruub 24.235929489s@ mbc={}] start_peering_interval up [2] -> [3,4,2], acting [2] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:00:18 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=16 pruub=14.969334602s) [3,4,2] r=2 lpr=16 pi=[14,16)/0 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 24.235929489s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:00:18 np0005548788.localdomain podman[33650]: 
Dec 06 08:00:18 np0005548788.localdomain podman[33650]: 2025-12-06 08:00:18.940455086 +0000 UTC m=+0.061038111 container create 4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, release=1763362218, architecture=x86_64, name=rhceph, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public)
Dec 06 08:00:18 np0005548788.localdomain systemd[1]: Started libpod-conmon-4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2.scope.
Dec 06 08:00:18 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:19 np0005548788.localdomain podman[33650]: 2025-12-06 08:00:18.909966752 +0000 UTC m=+0.030549797 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3a4cece1c854aaf8a72c8af948878e08cd84acf10db7b6e4aa16e60e3b300b8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3a4cece1c854aaf8a72c8af948878e08cd84acf10db7b6e4aa16e60e3b300b8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3a4cece1c854aaf8a72c8af948878e08cd84acf10db7b6e4aa16e60e3b300b8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:19 np0005548788.localdomain podman[33650]: 2025-12-06 08:00:19.047124181 +0000 UTC m=+0.167707236 container init 4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, version=7, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 08:00:19 np0005548788.localdomain podman[33650]: 2025-12-06 08:00:19.081265486 +0000 UTC m=+0.201848541 container start 4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, com.redhat.component=rhceph-container)
Dec 06 08:00:19 np0005548788.localdomain podman[33650]: 2025-12-06 08:00:19.081613342 +0000 UTC m=+0.202196437 container attach 4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 08:00:19 np0005548788.localdomain systemd[26155]: Starting Mark boot as successful...
Dec 06 08:00:19 np0005548788.localdomain systemd[26155]: Finished Mark boot as successful.
Dec 06 08:00:19 np0005548788.localdomain sshd[33311]: Received disconnect from 102.140.97.134 port 36910:11: Bye Bye [preauth]
Dec 06 08:00:19 np0005548788.localdomain sshd[33311]: Disconnected from authenticating user root 102.140.97.134 port 36910 [preauth]
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]: [
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:     {
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         "available": false,
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         "ceph_device": false,
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         "lsm_data": {},
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         "lvs": [],
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         "path": "/dev/sr0",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         "rejected_reasons": [
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "Insufficient space (<5GB)",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "Has a FileSystem"
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         ],
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         "sys_api": {
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "actuators": null,
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "device_nodes": "sr0",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "human_readable_size": "482.00 KB",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "id_bus": "ata",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "model": "QEMU DVD-ROM",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "nr_requests": "2",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "partitions": {},
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "path": "/dev/sr0",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "removable": "1",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "rev": "2.5+",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "ro": "0",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "rotational": "1",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "sas_address": "",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "sas_device_handle": "",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "scheduler_mode": "mq-deadline",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "sectors": 0,
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "sectorsize": "2048",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "size": 493568.0,
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "support_discard": "0",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "type": "disk",
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:             "vendor": "QEMU"
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:         }
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]:     }
Dec 06 08:00:19 np0005548788.localdomain funny_chaum[33665]: ]
Dec 06 08:00:19 np0005548788.localdomain systemd[1]: libpod-4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2.scope: Deactivated successfully.
Dec 06 08:00:19 np0005548788.localdomain podman[33650]: 2025-12-06 08:00:19.983501143 +0000 UTC m=+1.104084208 container died 4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, version=7)
Dec 06 08:00:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a3a4cece1c854aaf8a72c8af948878e08cd84acf10db7b6e4aa16e60e3b300b8-merged.mount: Deactivated successfully.
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.655 iops: 5031.753 elapsed_sec: 0.596
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [WRN] : OSD bench result of 5031.753383 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.5. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 0 waiting for initial osdmap
Dec 06 08:00:20 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5[32686]: 2025-12-06T08:00:20.065+0000 7f46e5912640 -1 osd.5 0 waiting for initial osdmap
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 17 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 17 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 17 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 17 check_osdmap_features require_osd_release unknown -> reef
Dec 06 08:00:20 np0005548788.localdomain podman[35139]: 2025-12-06 08:00:20.097534744 +0000 UTC m=+0.101825468 container remove 4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1763362218, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:20 np0005548788.localdomain systemd[1]: libpod-conmon-4218ae2bc63bdcad777504261453bd2f9a66a1719752db54bffef718122a21a2.scope: Deactivated successfully.
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 17 set_numa_affinity not setting numa affinity
Dec 06 08:00:20 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-5[32686]: 2025-12-06T08:00:20.103+0000 7f46e0f3c640 -1 osd.5 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 08:00:20 np0005548788.localdomain ceph-osd[32690]: osd.5 17 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Dec 06 08:00:20 np0005548788.localdomain sudo[33556]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:20 np0005548788.localdomain sshd[33527]: Received disconnect from 45.78.222.109 port 59284:11: Bye Bye [preauth]
Dec 06 08:00:20 np0005548788.localdomain sshd[33527]: Disconnected from authenticating user root 45.78.222.109 port 59284 [preauth]
Dec 06 08:00:21 np0005548788.localdomain ceph-osd[32690]: osd.5 17 tick checking mon for new map
Dec 06 08:00:21 np0005548788.localdomain ceph-osd[32690]: osd.5 18 state: booting -> active
Dec 06 08:00:21 np0005548788.localdomain sudo[35155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:00:21 np0005548788.localdomain sudo[35155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:21 np0005548788.localdomain sudo[35155]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:27 np0005548788.localdomain sudo[35170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:27 np0005548788.localdomain sudo[35170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:27 np0005548788.localdomain sudo[35170]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:28 np0005548788.localdomain sudo[35185]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:00:28 np0005548788.localdomain sudo[35185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:28 np0005548788.localdomain podman[35272]: 2025-12-06 08:00:28.763390691 +0000 UTC m=+0.083780117 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:00:28 np0005548788.localdomain podman[35272]: 2025-12-06 08:00:28.89456454 +0000 UTC m=+0.214953936 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, ceph=True, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7)
Dec 06 08:00:29 np0005548788.localdomain sudo[35185]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:29 np0005548788.localdomain sudo[35339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:00:29 np0005548788.localdomain sudo[35339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:29 np0005548788.localdomain sudo[35339]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:01 np0005548788.localdomain CROND[35355]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 08:01:01 np0005548788.localdomain run-parts[35358]: (/etc/cron.hourly) starting 0anacron
Dec 06 08:01:01 np0005548788.localdomain run-parts[35364]: (/etc/cron.hourly) finished 0anacron
Dec 06 08:01:01 np0005548788.localdomain CROND[35354]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 08:01:22 np0005548788.localdomain sshd[35365]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:01:24 np0005548788.localdomain sshd[35365]: Received disconnect from 103.52.114.250 port 59796:11: Bye Bye [preauth]
Dec 06 08:01:24 np0005548788.localdomain sshd[35365]: Disconnected from authenticating user root 103.52.114.250 port 59796 [preauth]
Dec 06 08:01:29 np0005548788.localdomain sudo[35367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:01:29 np0005548788.localdomain sudo[35367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:29 np0005548788.localdomain sudo[35367]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:29 np0005548788.localdomain sudo[35382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:01:29 np0005548788.localdomain sudo[35382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:30 np0005548788.localdomain podman[35468]: 2025-12-06 08:01:30.629804957 +0000 UTC m=+0.102952882 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=)
Dec 06 08:01:30 np0005548788.localdomain podman[35468]: 2025-12-06 08:01:30.766915421 +0000 UTC m=+0.240063356 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container)
Dec 06 08:01:31 np0005548788.localdomain sudo[35382]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:31 np0005548788.localdomain sudo[35534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:01:31 np0005548788.localdomain sudo[35534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:31 np0005548788.localdomain sudo[35534]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:31 np0005548788.localdomain sudo[35549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:01:31 np0005548788.localdomain sudo[35549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:31 np0005548788.localdomain sudo[35549]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:32 np0005548788.localdomain sudo[35595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:01:32 np0005548788.localdomain sudo[35595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:32 np0005548788.localdomain sudo[35595]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:34 np0005548788.localdomain sshd[24651]: Received disconnect from 192.168.122.100 port 54610:11: disconnected by user
Dec 06 08:01:34 np0005548788.localdomain sshd[24651]: Disconnected from user zuul 192.168.122.100 port 54610
Dec 06 08:01:34 np0005548788.localdomain sshd[24648]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:01:34 np0005548788.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Dec 06 08:01:34 np0005548788.localdomain systemd[1]: session-13.scope: Consumed 22.167s CPU time.
Dec 06 08:01:34 np0005548788.localdomain systemd-logind[765]: Session 13 logged out. Waiting for processes to exit.
Dec 06 08:01:34 np0005548788.localdomain systemd-logind[765]: Removed session 13.
Dec 06 08:02:32 np0005548788.localdomain sudo[35610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:02:32 np0005548788.localdomain sudo[35610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:02:32 np0005548788.localdomain sudo[35610]: pam_unix(sudo:session): session closed for user root
Dec 06 08:02:32 np0005548788.localdomain sudo[35625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:02:32 np0005548788.localdomain sudo[35625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:02:33 np0005548788.localdomain sudo[35625]: pam_unix(sudo:session): session closed for user root
Dec 06 08:02:34 np0005548788.localdomain sudo[35672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:02:34 np0005548788.localdomain sudo[35672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:02:34 np0005548788.localdomain sudo[35672]: pam_unix(sudo:session): session closed for user root
Dec 06 08:02:34 np0005548788.localdomain sshd[35687]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:02:38 np0005548788.localdomain sshd[35687]: Received disconnect from 102.140.97.134 port 49572:11: Bye Bye [preauth]
Dec 06 08:02:38 np0005548788.localdomain sshd[35687]: Disconnected from authenticating user root 102.140.97.134 port 49572 [preauth]
Dec 06 08:02:53 np0005548788.localdomain sshd[35689]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:03:08 np0005548788.localdomain sshd[35690]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:03:10 np0005548788.localdomain sshd[35690]: Received disconnect from 103.52.114.250 port 52160:11: Bye Bye [preauth]
Dec 06 08:03:10 np0005548788.localdomain sshd[35690]: Disconnected from authenticating user root 103.52.114.250 port 52160 [preauth]
Dec 06 08:03:21 np0005548788.localdomain systemd[26155]: Created slice User Background Tasks Slice.
Dec 06 08:03:21 np0005548788.localdomain systemd[26155]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 08:03:21 np0005548788.localdomain systemd[26155]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 08:03:34 np0005548788.localdomain sudo[35693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:03:34 np0005548788.localdomain sudo[35693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:03:34 np0005548788.localdomain sudo[35693]: pam_unix(sudo:session): session closed for user root
Dec 06 08:03:34 np0005548788.localdomain sudo[35708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:03:34 np0005548788.localdomain sudo[35708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:03:34 np0005548788.localdomain sudo[35708]: pam_unix(sudo:session): session closed for user root
Dec 06 08:03:35 np0005548788.localdomain sudo[35754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:03:35 np0005548788.localdomain sudo[35754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:03:35 np0005548788.localdomain sudo[35754]: pam_unix(sudo:session): session closed for user root
Dec 06 08:03:42 np0005548788.localdomain sshd[35689]: Connection closed by 45.78.222.109 port 37764 [preauth]
Dec 06 08:04:35 np0005548788.localdomain sudo[35771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:04:35 np0005548788.localdomain sudo[35771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:04:35 np0005548788.localdomain sudo[35771]: pam_unix(sudo:session): session closed for user root
Dec 06 08:04:35 np0005548788.localdomain sudo[35786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:04:35 np0005548788.localdomain sudo[35786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:04:36 np0005548788.localdomain sudo[35786]: pam_unix(sudo:session): session closed for user root
Dec 06 08:04:36 np0005548788.localdomain sudo[35833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:04:36 np0005548788.localdomain sudo[35833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:04:36 np0005548788.localdomain sudo[35833]: pam_unix(sudo:session): session closed for user root
Dec 06 08:04:55 np0005548788.localdomain sshd[35848]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:04:57 np0005548788.localdomain sshd[35848]: Received disconnect from 102.140.97.134 port 48102:11: Bye Bye [preauth]
Dec 06 08:04:57 np0005548788.localdomain sshd[35848]: Disconnected from authenticating user root 102.140.97.134 port 48102 [preauth]
Dec 06 08:05:14 np0005548788.localdomain sshd[35850]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:14 np0005548788.localdomain sshd[35850]: Accepted publickey for zuul from 192.168.122.100 port 60464 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:05:14 np0005548788.localdomain systemd-logind[765]: New session 27 of user zuul.
Dec 06 08:05:14 np0005548788.localdomain systemd[1]: Started Session 27 of User zuul.
Dec 06 08:05:14 np0005548788.localdomain sshd[35850]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 08:05:14 np0005548788.localdomain sudo[35896]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jomhghczkjhsbjgtxsejltdsbbbclgdl ; /usr/bin/python3
Dec 06 08:05:14 np0005548788.localdomain sudo[35896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:14 np0005548788.localdomain python3[35898]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 06 08:05:14 np0005548788.localdomain sudo[35896]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:15 np0005548788.localdomain sudo[35941]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgozbuczyjmwdbeynwomohiqzhqwweuk ; /usr/bin/python3
Dec 06 08:05:15 np0005548788.localdomain sudo[35941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:15 np0005548788.localdomain python3[35943]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 08:05:15 np0005548788.localdomain sudo[35941]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:15 np0005548788.localdomain sudo[35961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trukgbnjfrtluelwqzumuvqjptobmxek ; /usr/bin/python3
Dec 06 08:05:15 np0005548788.localdomain sudo[35961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:15 np0005548788.localdomain python3[35963]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548788.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 06 08:05:16 np0005548788.localdomain useradd[35965]: new group: name=tripleo-admin, GID=1003
Dec 06 08:05:16 np0005548788.localdomain useradd[35965]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Dec 06 08:05:16 np0005548788.localdomain sudo[35961]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:16 np0005548788.localdomain sudo[36017]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tajkbzeqqmopbxvblnvpzeqajxqucsyt ; /usr/bin/python3
Dec 06 08:05:16 np0005548788.localdomain sudo[36017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:16 np0005548788.localdomain python3[36019]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:05:16 np0005548788.localdomain sudo[36017]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:16 np0005548788.localdomain sudo[36060]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eofmhfemjntuydhxbgxyheqbmybwkyrj ; /usr/bin/python3
Dec 06 08:05:16 np0005548788.localdomain sudo[36060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:16 np0005548788.localdomain python3[36062]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765008316.2815766-66338-59653666687612/source _original_basename=tmphzucgg_4 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:16 np0005548788.localdomain sudo[36060]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:17 np0005548788.localdomain sudo[36090]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-degjtnzryurxhgomqzyunlywoqyvdtpt ; /usr/bin/python3
Dec 06 08:05:17 np0005548788.localdomain sudo[36090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:17 np0005548788.localdomain python3[36092]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:17 np0005548788.localdomain sudo[36090]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:17 np0005548788.localdomain sudo[36106]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qprosgisffapmzugluczbslfkmiptser ; /usr/bin/python3
Dec 06 08:05:17 np0005548788.localdomain sudo[36106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:17 np0005548788.localdomain python3[36108]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:17 np0005548788.localdomain sudo[36106]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:18 np0005548788.localdomain sudo[36122]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnagkhaaagwmwgeobsnfvrlitaovuruy ; /usr/bin/python3
Dec 06 08:05:18 np0005548788.localdomain sudo[36122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:18 np0005548788.localdomain python3[36124]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:18 np0005548788.localdomain sudo[36122]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:18 np0005548788.localdomain sudo[36138]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuzpydxesnacslrworsqqqffbfrkluqz ; /usr/bin/python3
Dec 06 08:05:18 np0005548788.localdomain sudo[36138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:18 np0005548788.localdomain python3[36140]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:18 np0005548788.localdomain sudo[36138]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:19 np0005548788.localdomain python3[36154]: ansible-ping Invoked with data=pong
Dec 06 08:05:30 np0005548788.localdomain sshd[36155]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:30 np0005548788.localdomain sshd[36155]: Accepted publickey for tripleo-admin from 192.168.122.100 port 37056 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:05:30 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 06 08:05:30 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 06 08:05:30 np0005548788.localdomain systemd-logind[765]: New session 28 of user tripleo-admin.
Dec 06 08:05:30 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 06 08:05:30 np0005548788.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Queued start job for default target Main User Target.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Created slice User Application Slice.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Reached target Paths.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Reached target Timers.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Starting D-Bus User Message Bus Socket...
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Starting Create User's Volatile Files and Directories...
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Finished Create User's Volatile Files and Directories.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Reached target Sockets.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Reached target Basic System.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Reached target Main User Target.
Dec 06 08:05:30 np0005548788.localdomain systemd[36159]: Startup finished in 98ms.
Dec 06 08:05:30 np0005548788.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 06 08:05:30 np0005548788.localdomain systemd[1]: Started Session 28 of User tripleo-admin.
Dec 06 08:05:30 np0005548788.localdomain sshd[36155]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 08:05:31 np0005548788.localdomain sudo[36219]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwoixilgzlxsrivmezoggffqzoyleksq ; /usr/bin/python3
Dec 06 08:05:31 np0005548788.localdomain sudo[36219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:31 np0005548788.localdomain python3[36221]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 08:05:31 np0005548788.localdomain sudo[36219]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:32 np0005548788.localdomain sshd[36226]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:34 np0005548788.localdomain sshd[36226]: Connection closed by authenticating user root 47.237.163.130 port 59876 [preauth]
Dec 06 08:05:36 np0005548788.localdomain sudo[36241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztgoyybbtamvcmbxfruriwsfkvjyhxqo ; /usr/bin/python3
Dec 06 08:05:36 np0005548788.localdomain sudo[36241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:36 np0005548788.localdomain python3[36243]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Dec 06 08:05:36 np0005548788.localdomain sudo[36241]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:37 np0005548788.localdomain sudo[36244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:05:37 np0005548788.localdomain sudo[36244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:05:37 np0005548788.localdomain sudo[36244]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:37 np0005548788.localdomain sudo[36274]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqsbiqeutavnjcyxxgbinaycfsniihlc ; /usr/bin/python3
Dec 06 08:05:37 np0005548788.localdomain sudo[36274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:37 np0005548788.localdomain sudo[36272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:05:37 np0005548788.localdomain sudo[36272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:05:37 np0005548788.localdomain python3[36288]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Dec 06 08:05:37 np0005548788.localdomain sudo[36274]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:37 np0005548788.localdomain sudo[36352]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmfbatfdehsrlrmysnhdeyatdqbzifmx ; /usr/bin/python3
Dec 06 08:05:37 np0005548788.localdomain sudo[36352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:37 np0005548788.localdomain sudo[36272]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:37 np0005548788.localdomain python3[36356]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.dg91mzhktmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:37 np0005548788.localdomain sudo[36352]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:38 np0005548788.localdomain sudo[36398]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esjcixudgmnqdzqomathyrzzkwpdjsmt ; /usr/bin/python3
Dec 06 08:05:38 np0005548788.localdomain sudo[36398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:38 np0005548788.localdomain sudo[36401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:05:38 np0005548788.localdomain sudo[36401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:05:38 np0005548788.localdomain sudo[36401]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:38 np0005548788.localdomain python3[36400]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.dg91mzhktmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:38 np0005548788.localdomain sudo[36398]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:39 np0005548788.localdomain sudo[36429]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axoolerluayscbekzyaegtevjyfyqrnt ; /usr/bin/python3
Dec 06 08:05:39 np0005548788.localdomain sudo[36429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:39 np0005548788.localdomain python3[36431]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.dg91mzhktmphosts insertbefore=BOF block=172.17.0.106 np0005548788.localdomain np0005548788
                                                         172.18.0.106 np0005548788.storage.localdomain np0005548788.storage
                                                         172.20.0.106 np0005548788.storagemgmt.localdomain np0005548788.storagemgmt
                                                         172.17.0.106 np0005548788.internalapi.localdomain np0005548788.internalapi
                                                         172.19.0.106 np0005548788.tenant.localdomain np0005548788.tenant
                                                         192.168.122.106 np0005548788.ctlplane.localdomain np0005548788.ctlplane
                                                         172.17.0.107 np0005548789.localdomain np0005548789
                                                         172.18.0.107 np0005548789.storage.localdomain np0005548789.storage
                                                         172.20.0.107 np0005548789.storagemgmt.localdomain np0005548789.storagemgmt
                                                         172.17.0.107 np0005548789.internalapi.localdomain np0005548789.internalapi
                                                         172.19.0.107 np0005548789.tenant.localdomain np0005548789.tenant
                                                         192.168.122.107 np0005548789.ctlplane.localdomain np0005548789.ctlplane
                                                         172.17.0.108 np0005548790.localdomain np0005548790
                                                         172.18.0.108 np0005548790.storage.localdomain np0005548790.storage
                                                         172.20.0.108 np0005548790.storagemgmt.localdomain np0005548790.storagemgmt
                                                         172.17.0.108 np0005548790.internalapi.localdomain np0005548790.internalapi
                                                         172.19.0.108 np0005548790.tenant.localdomain np0005548790.tenant
                                                         192.168.122.108 np0005548790.ctlplane.localdomain np0005548790.ctlplane
                                                         172.17.0.103 np0005548785.localdomain np0005548785
                                                         172.18.0.103 np0005548785.storage.localdomain np0005548785.storage
                                                         172.20.0.103 np0005548785.storagemgmt.localdomain np0005548785.storagemgmt
                                                         172.17.0.103 np0005548785.internalapi.localdomain np0005548785.internalapi
                                                         172.19.0.103 np0005548785.tenant.localdomain np0005548785.tenant
                                                         192.168.122.103 np0005548785.ctlplane.localdomain np0005548785.ctlplane
                                                         172.17.0.104 np0005548786.localdomain np0005548786
                                                         172.18.0.104 np0005548786.storage.localdomain np0005548786.storage
                                                         172.20.0.104 np0005548786.storagemgmt.localdomain np0005548786.storagemgmt
                                                         172.17.0.104 np0005548786.internalapi.localdomain np0005548786.internalapi
                                                         172.19.0.104 np0005548786.tenant.localdomain np0005548786.tenant
                                                         192.168.122.104 np0005548786.ctlplane.localdomain np0005548786.ctlplane
                                                         172.17.0.105 np0005548787.localdomain np0005548787
                                                         172.18.0.105 np0005548787.storage.localdomain np0005548787.storage
                                                         172.20.0.105 np0005548787.storagemgmt.localdomain np0005548787.storagemgmt
                                                         172.17.0.105 np0005548787.internalapi.localdomain np0005548787.internalapi
                                                         172.19.0.105 np0005548787.tenant.localdomain np0005548787.tenant
                                                         192.168.122.105 np0005548787.ctlplane.localdomain np0005548787.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.250  overcloud.storage.localdomain
                                                         172.20.0.140  overcloud.storagemgmt.localdomain
                                                         172.17.0.168  overcloud.internalapi.localdomain
                                                         172.21.0.196  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:39 np0005548788.localdomain sudo[36429]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:39 np0005548788.localdomain sudo[36445]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qthecqpomaroftrfvpauwihzovglkhdf ; /usr/bin/python3
Dec 06 08:05:39 np0005548788.localdomain sudo[36445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:39 np0005548788.localdomain python3[36447]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.dg91mzhktmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:05:39 np0005548788.localdomain sudo[36445]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:40 np0005548788.localdomain sudo[36462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wctgqbnsaocoigucevbviodzrmlcedcg ; /usr/bin/python3
Dec 06 08:05:40 np0005548788.localdomain sudo[36462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:40 np0005548788.localdomain python3[36464]: ansible-file Invoked with path=/tmp/ansible.dg91mzhktmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:40 np0005548788.localdomain sudo[36462]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:41 np0005548788.localdomain sudo[36478]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfncnnwzmturzolplxkvermlksdhwrik ; /usr/bin/python3
Dec 06 08:05:41 np0005548788.localdomain sudo[36478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:41 np0005548788.localdomain python3[36480]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:05:41 np0005548788.localdomain sudo[36478]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:41 np0005548788.localdomain sudo[36495]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwkquqkbqrusqznoikvjvqmkxknujfew ; /usr/bin/python3
Dec 06 08:05:41 np0005548788.localdomain sudo[36495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:42 np0005548788.localdomain python3[36497]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:05:45 np0005548788.localdomain sudo[36495]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:47 np0005548788.localdomain sudo[36514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byvvfqqujrnpljeoipmkaslsbpgwyili ; /usr/bin/python3
Dec 06 08:05:47 np0005548788.localdomain sudo[36514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:47 np0005548788.localdomain python3[36516]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:05:47 np0005548788.localdomain sudo[36514]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:47 np0005548788.localdomain sudo[36531]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yylhpnnqlsxcmeupxdneauqykfvhzclx ; /usr/bin/python3
Dec 06 08:05:47 np0005548788.localdomain sudo[36531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:48 np0005548788.localdomain python3[36533]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:06:03 np0005548788.localdomain groupadd[36705]: group added to /etc/group: name=puppet, GID=52
Dec 06 08:06:03 np0005548788.localdomain groupadd[36705]: group added to /etc/gshadow: name=puppet
Dec 06 08:06:03 np0005548788.localdomain groupadd[36705]: new group: name=puppet, GID=52
Dec 06 08:06:03 np0005548788.localdomain useradd[36712]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Dec 06 08:06:38 np0005548788.localdomain sudo[37449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:06:38 np0005548788.localdomain sudo[37449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:06:38 np0005548788.localdomain sudo[37449]: pam_unix(sudo:session): session closed for user root
Dec 06 08:06:38 np0005548788.localdomain sudo[37466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:06:38 np0005548788.localdomain sudo[37466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:06:39 np0005548788.localdomain sudo[37466]: pam_unix(sudo:session): session closed for user root
Dec 06 08:06:39 np0005548788.localdomain sudo[37514]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:06:39 np0005548788.localdomain sudo[37514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:06:39 np0005548788.localdomain sudo[37514]: pam_unix(sudo:session): session closed for user root
Dec 06 08:06:57 np0005548788.localdomain kernel: SELinux:  Converting 2700 SID table entries...
Dec 06 08:06:57 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:06:57 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:06:57 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:06:57 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:06:57 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:06:57 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:06:57 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:06:57 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 06 08:06:57 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:06:57 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:06:57 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:06:57 np0005548788.localdomain systemd-rc-local-generator[37685]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:06:57 np0005548788.localdomain systemd-sysv-generator[37691]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:06:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:06:57 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:06:58 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:06:58 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:06:58 np0005548788.localdomain systemd[1]: run-r41fc16ab217046db97fdd6e1c5994113.service: Deactivated successfully.
Dec 06 08:06:59 np0005548788.localdomain sudo[36531]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:01 np0005548788.localdomain sudo[38129]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewyoyhpypnhazichsvakmtnlqqyxejml ; /usr/bin/python3
Dec 06 08:07:01 np0005548788.localdomain sudo[38129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:01 np0005548788.localdomain python3[38131]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:02 np0005548788.localdomain sudo[38129]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:03 np0005548788.localdomain sudo[38268]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvsbwpqgyhxpbjbnafkhpulqswgphlky ; /usr/bin/python3
Dec 06 08:07:03 np0005548788.localdomain sudo[38268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:03 np0005548788.localdomain python3[38270]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:07:03 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:07:03 np0005548788.localdomain systemd-rc-local-generator[38291]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:07:03 np0005548788.localdomain systemd-sysv-generator[38299]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:07:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:07:03 np0005548788.localdomain sudo[38268]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:05 np0005548788.localdomain sudo[38321]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgttcbqvlazznquaxrfhllnmbmqxjerk ; /usr/bin/python3
Dec 06 08:07:05 np0005548788.localdomain sudo[38321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:05 np0005548788.localdomain python3[38323]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:05 np0005548788.localdomain sudo[38321]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:05 np0005548788.localdomain sudo[38337]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guhkthyugqubuuukracyhcxirbuecefe ; /usr/bin/python3
Dec 06 08:07:05 np0005548788.localdomain sudo[38337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:05 np0005548788.localdomain python3[38339]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:05 np0005548788.localdomain sudo[38337]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:06 np0005548788.localdomain sudo[38354]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbgsbwrkgmpnnsglncoqzgbvqxjzthbl ; /usr/bin/python3
Dec 06 08:07:06 np0005548788.localdomain sudo[38354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:06 np0005548788.localdomain python3[38356]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 08:07:06 np0005548788.localdomain sudo[38354]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:06 np0005548788.localdomain sudo[38372]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yioxfzvdxmncjbaydupzdrtorgaavvgf ; /usr/bin/python3
Dec 06 08:07:06 np0005548788.localdomain sudo[38372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:07 np0005548788.localdomain python3[38374]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:07 np0005548788.localdomain sudo[38372]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:07 np0005548788.localdomain sudo[38390]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csijlruqkfwvdmyqxdvxzdlvjgsvxcbc ; /usr/bin/python3
Dec 06 08:07:07 np0005548788.localdomain sudo[38390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:07 np0005548788.localdomain python3[38392]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:07 np0005548788.localdomain sudo[38390]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:07 np0005548788.localdomain sudo[38408]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpzomvsrwxocykjicfbrscqhfukfimse ; /usr/bin/python3
Dec 06 08:07:07 np0005548788.localdomain sudo[38408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:08 np0005548788.localdomain python3[38410]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 08:07:08 np0005548788.localdomain systemd[1]: Reloading Network Manager...
Dec 06 08:07:08 np0005548788.localdomain NetworkManager[5968]: <info>  [1765008428.1382] audit: op="reload" arg="0" pid=38413 uid=0 result="success"
Dec 06 08:07:08 np0005548788.localdomain NetworkManager[5968]: <info>  [1765008428.1392] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Dec 06 08:07:08 np0005548788.localdomain NetworkManager[5968]: <info>  [1765008428.1392] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Dec 06 08:07:08 np0005548788.localdomain systemd[1]: Reloaded Network Manager.
Dec 06 08:07:08 np0005548788.localdomain sudo[38408]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:08 np0005548788.localdomain sudo[38427]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdokshzewhukjfgghkfszjwbeytbipou ; /usr/bin/python3
Dec 06 08:07:08 np0005548788.localdomain sudo[38427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:08 np0005548788.localdomain python3[38429]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:08 np0005548788.localdomain sudo[38427]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:08 np0005548788.localdomain sudo[38444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiwmhkorzcyvurfqelrlgeatakkpqabc ; /usr/bin/python3
Dec 06 08:07:08 np0005548788.localdomain sudo[38444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:09 np0005548788.localdomain python3[38446]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:09 np0005548788.localdomain sudo[38444]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:09 np0005548788.localdomain sudo[38462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfqogjlzmvhaaigngffcwezsjjhvsdzv ; /usr/bin/python3
Dec 06 08:07:09 np0005548788.localdomain sudo[38462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:09 np0005548788.localdomain python3[38464]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:09 np0005548788.localdomain sudo[38462]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:09 np0005548788.localdomain sudo[38478]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcamzaodvqlkzwgtwytucvqwmryhmxdj ; /usr/bin/python3
Dec 06 08:07:09 np0005548788.localdomain sudo[38478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:09 np0005548788.localdomain python3[38480]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:09 np0005548788.localdomain sudo[38478]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:10 np0005548788.localdomain sudo[38494]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfhxszkijjononrshurnzcqcunjgsoge ; /usr/bin/python3
Dec 06 08:07:10 np0005548788.localdomain sudo[38494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:10 np0005548788.localdomain python3[38496]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 06 08:07:10 np0005548788.localdomain sudo[38494]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:10 np0005548788.localdomain sudo[38510]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jexqaietccfyeecfviknfizxscqanqzu ; /usr/bin/python3
Dec 06 08:07:10 np0005548788.localdomain sudo[38510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:11 np0005548788.localdomain python3[38512]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:11 np0005548788.localdomain sudo[38510]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:11 np0005548788.localdomain sudo[38526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acnmnqqxpfugegzxyglbipyqoappispi ; /usr/bin/python3
Dec 06 08:07:11 np0005548788.localdomain sudo[38526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:11 np0005548788.localdomain python3[38528]: ansible-blockinfile Invoked with path=/tmp/ansible.kwh5hrp2 block=[192.168.122.106]*,[np0005548788.ctlplane.localdomain]*,[172.17.0.106]*,[np0005548788.internalapi.localdomain]*,[172.18.0.106]*,[np0005548788.storage.localdomain]*,[172.20.0.106]*,[np0005548788.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005548788.tenant.localdomain]*,[np0005548788.localdomain]*,[np0005548788]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCxIoAQH9YZnGrAxYR5prFQwo6HY5mwdDjndb+bp2pwvtVLM4ABIdCi+K1wpbhOpoO7BsYOf/tdBqemvSDleNo/ZLh3v3MmoVtoTtQZqLWsAQWFgJCjcGUGB+H3CHhtbp706coVQMlGD+UQqpCBy8WamMB/Ldy+hSHbLHwzuMzj8tO90vUbEyuKgOuu/X3ZFa+Yjo/asQ+PTrVfirh1QvRQ9aK22xH89KbThA/1an4OjnNGLCP752auSQ894B21QLKfqaMGPlpbjU8Wr6MP4zKV9lUzpQiFr6IU6cd4CeIsJDj7FnAZuBSmi8ewgm/r4ZWkmCSlqw8OpMC5soJnm8Q4PJTIFvT9eyyFCh9xmQkMhzE8P332LtYjZ+vXhYFU14e04mOQx5UrtHN8uWJVbOAwtLNAcenHyRtCQGkAZ6f9q0OvSuYr+o3FhHhN5ABu32AKAD8YpkjLypi+PbaiKNQW8XzPAHHbV8CGZ4B09ZWeQY49VA0bPxIYBXd1mEBlXSE=
                                                         [192.168.122.107]*,[np0005548789.ctlplane.localdomain]*,[172.17.0.107]*,[np0005548789.internalapi.localdomain]*,[172.18.0.107]*,[np0005548789.storage.localdomain]*,[172.20.0.107]*,[np0005548789.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005548789.tenant.localdomain]*,[np0005548789.localdomain]*,[np0005548789]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwH3rhRTvOINLmLdbeRXeXOiMzz+IXEuW2cXYAe50Wcc3ikH2RVGirWQrwLc8hAoA7UFCXADqEMxPg6/fLsQkbP7kLOpUtam8nuXvgt8VHM4RFl5wh9EOgZ7DWgjA7s3r2eQMcBhv82CjVMLY/YjnLuRNXCsJAqeG32qcKedKH/huEFvkb49U/UnNlxi5BfNrMlY9n5UQXE2rd6EKwP58aP/qQ1ie3p8nwHc36/MJcfEIABlLaoHK/LxnadOFTh93OkqVi7A0VQsKSmKD64nABiN7ML0NReoyRIQI5r3Dawe8v2K9jCBh5jY88TVsYUJqgwoZSSU73sYGHX4uF+PY8wL7qwn6mCzA17GGYeB8Dy0N8qwDqah6kUjpcLwGp7YaKf0FIZPBKcLVMrX6Tnwxer1j3kOIt3tgLZoz3mMfstWfCyvt9t+GEW5MCE+MBkY4Eree3uK7pI+wJ3vFQS9XVP00hjNiLWYmoaaW6rl8xtw7QtGhzmjcWbOxaZvHWE5E=
                                                         [192.168.122.108]*,[np0005548790.ctlplane.localdomain]*,[172.17.0.108]*,[np0005548790.internalapi.localdomain]*,[172.18.0.108]*,[np0005548790.storage.localdomain]*,[172.20.0.108]*,[np0005548790.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005548790.tenant.localdomain]*,[np0005548790.localdomain]*,[np0005548790]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDmdMCy44p73Ui+o09YQitqR9FILqoJ6AGYYutFVH6wn5m1j6oEoI4XgVFPR3UpG3SXdoiG7m0DRxC/WZZMpZbaQ3ZHbJJioRh1hV5uQtK5k2gtmS8uePng5UprbLncMXf+HIxNRvirU3r6zdgNGAroK0rN0nWESi/FNb2flu9Aw9JAsgIAAouW4IUoeyMGZ1AflhRhsWsQMstM9UEeGU+iTqV7al1URVCSq1finY99m+QC+Pftpd2C/+agboOIiVa63+D/RqqfYqh4C/PYfDbssYjcZzk3P90+HQ6uMKexX3HRnFbyje4eLSBHC0pjr/4pNfk/eSpdHeyMAPsP+QlBztdcPj9OnjcmT9ymeJRKF7GwNIWg3Pn9L2yY50d8l9Zu6rNIDW786XNcbm88yHdCHA5FE1A8XTWQRQ3eUSUsmsvf03pExAouRM4Fj8dvCu6wzG2SuyWqmdT5yCNrUG0e1CeE6PcfTLBeS5CJAwn5HM8aUndQQldWmaUbMPL5Jis=
                                                         [192.168.122.103]*,[np0005548785.ctlplane.localdomain]*,[172.17.0.103]*,[np0005548785.internalapi.localdomain]*,[172.18.0.103]*,[np0005548785.storage.localdomain]*,[172.20.0.103]*,[np0005548785.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005548785.tenant.localdomain]*,[np0005548785.localdomain]*,[np0005548785]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC89JzJHuRLDUgmU66VPdPVwYLrvslBwa5i2QfiUzrnpt1lKz8ayq6QMRy5y5GgfjQQhX/YZiAjUSoogVsYDkoDaImXdtfQHFlFMLTlJPiYcA/cGAwMAE/vifpWoztBHUXkJ5YWUojkXzGoR8d7ESx/tTLG/9QrQDsW6JcV18mcFCQZdeWYWGWdLn6ynmQOZ0N4U6mYK1FqE+GKgP6L9PEjkC1ePo81AnYcdQ5Z1IETdcCcJytdvvxH/Zie1PiAaMAgMYhsqu7+DZRRTvg+cEMw3mRVuodIyQEbpZs8MjR3itViRfZ+UqYi6uKDnz1viLL0aACaYhOLzrE7bQ6Sl4j1MnMrWncUOv3Sq2fus+Y6oYmed84E6HUNljte7vVP9jwPclbCAmj5WuC/Av9dSqqHEpPRbKJ4tAuBrO2LBKS7J62FjRYiY807V1viyxUgjK5FmsQyfVr3/YOirluSx54e4XwxxDrAjtrd0x68H7/Mt6HP/79cWKaVbC7XUckYRmE=
                                                         [192.168.122.104]*,[np0005548786.ctlplane.localdomain]*,[172.17.0.104]*,[np0005548786.internalapi.localdomain]*,[172.18.0.104]*,[np0005548786.storage.localdomain]*,[172.20.0.104]*,[np0005548786.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005548786.tenant.localdomain]*,[np0005548786.localdomain]*,[np0005548786]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDURzBA/aIGrwPgaIApy0UCTi4wdQhfDEx0QfkSAIn0ZptZcOkaR8BWtl9GijRPEp++Ep4qU04JcwHO1ZULd2UnCdDeg1Imwnf7x9HQBjAr0mH+tE0t4MBLtBbrk8Ep5ggyKATK1CvEl3NuGIS4gSSUWxzkR74Iju/GtrEMuVnMSsOw+auBofiv1ne4zyXqQWZORiK32DSolw1KyXGLyqG+JOpl3Kza5o79S1KUghfRzskZMm/AxFYciPmg4EQK/jL9Izj7qq3v8MaL8baeyqNlPaaRKCh+pkZlYtoPzDhe+vn/jwnDmQgqC1Bh+dkNiKEVlWz3mxoiMoeLY3jP/tMF2M4M8puGakPc2sqJxk1++Tv/lFRO3zBS+V2kECKI5DtQI6XThfLYXxIQl5SHr4yGEoxhMNt6YNQPLp6lg30kHO24YyNNA7LPFYYoOGUCaq5ZVUCF9lagMxcgkN0Bs+ZZqeni+53RqxoutiRZ0m9pIiqxGjrJjbNFXmofgfDBcUE=
                                                         [192.168.122.105]*,[np0005548787.ctlplane.localdomain]*,[172.17.0.105]*,[np0005548787.internalapi.localdomain]*,[172.18.0.105]*,[np0005548787.storage.localdomain]*,[172.20.0.105]*,[np0005548787.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005548787.tenant.localdomain]*,[np0005548787.localdomain]*,[np0005548787]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXe0UZ2kJKcvYaHSnjIOf3QqkGhArLo32nvDm8Pl8ZVNWfdRV8R+e17etAicDq//fxWC+U9jiHp4qI6/0Jm64rPocmJKaA+r79sNpv+598NlGtVUfTYQ34Ze9bgaPkjAwKfPNrzjSDChyfkys4Hm0J7ttog5rvMcuRelxkFmoonOcuzBC+9ufI6qld7br5w4WDookwamkefbMCiwAZxrw2bSjoTu7/TEFbt7SM0lUIdqP5WvxpWK52OkjnakQ0BL4QHdRYz1kBx/vS0TFxXb2pMO291dfkxDl3H2oXXZZYK/LWy3nZyJEX+mD5J6WOEs5HC5GQQ+CNEV0wa2e/gJA7KBsyL5T6RBtH8id22sBHZkzcaDhUz1ZABGAiOx4rdrr4YFFFy/u00nX3ZCuRBPXYh37Pafl7GXcSKyhTmkCZI0591RdNmb1duh9ZIObRmPVp2+WIheAFvS7EU4B0+ZjAEbDJgiSa9VlUrlRFX0ajcFHR8FnwNRcoERO3A3h4/Tc=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:11 np0005548788.localdomain sudo[38526]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:12 np0005548788.localdomain sudo[38542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tprkyodljadchozjgwhadylvlavhwzab ; /usr/bin/python3
Dec 06 08:07:12 np0005548788.localdomain sudo[38542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:12 np0005548788.localdomain python3[38544]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.kwh5hrp2' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:12 np0005548788.localdomain sudo[38542]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:12 np0005548788.localdomain sudo[38560]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayaiazzvdjuownxtosldmtdgrqcmvqrh ; /usr/bin/python3
Dec 06 08:07:12 np0005548788.localdomain sudo[38560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:12 np0005548788.localdomain python3[38562]: ansible-file Invoked with path=/tmp/ansible.kwh5hrp2 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:12 np0005548788.localdomain sudo[38560]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:13 np0005548788.localdomain sudo[38576]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnwhjmeuegohppcjujddfjkxtvyvuwbu ; /usr/bin/python3
Dec 06 08:07:13 np0005548788.localdomain sudo[38576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:13 np0005548788.localdomain python3[38578]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:07:13 np0005548788.localdomain sudo[38576]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:13 np0005548788.localdomain sudo[38592]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kddqshmpgqymqxxohzjrjsewnyabdgyb ; /usr/bin/python3
Dec 06 08:07:13 np0005548788.localdomain sudo[38592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:13 np0005548788.localdomain python3[38594]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:13 np0005548788.localdomain sudo[38592]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:14 np0005548788.localdomain sudo[38610]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbyfkiijwgwmsyobrbjeqhjgjxfeebqb ; /usr/bin/python3
Dec 06 08:07:14 np0005548788.localdomain sudo[38610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:14 np0005548788.localdomain python3[38612]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:14 np0005548788.localdomain sudo[38610]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:14 np0005548788.localdomain sudo[38629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljhcqopqgjuofuloigybrjyycqwaabxa ; /usr/bin/python3
Dec 06 08:07:14 np0005548788.localdomain sudo[38629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:14 np0005548788.localdomain python3[38631]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Dec 06 08:07:14 np0005548788.localdomain sudo[38629]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:14 np0005548788.localdomain sudo[38645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rezdnvksjcjwfzprgjxknfhhfqiomhee ; /usr/bin/python3
Dec 06 08:07:14 np0005548788.localdomain sudo[38645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:15 np0005548788.localdomain sudo[38645]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:15 np0005548788.localdomain sudo[38693]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scfkgfmlaeohgjjgskyujtpyocktssmu ; /usr/bin/python3
Dec 06 08:07:15 np0005548788.localdomain sudo[38693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:15 np0005548788.localdomain sudo[38693]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:15 np0005548788.localdomain sudo[38736]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cedlpmmobrftjzquszjzyztclmppbgbv ; /usr/bin/python3
Dec 06 08:07:15 np0005548788.localdomain sudo[38736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:15 np0005548788.localdomain sudo[38736]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:16 np0005548788.localdomain sshd[38753]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:07:17 np0005548788.localdomain sudo[38768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otbvdekxzpwaklphxvmvnzhahwioqiwq ; /usr/bin/python3
Dec 06 08:07:17 np0005548788.localdomain sudo[38768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:17 np0005548788.localdomain python3[38770]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:17 np0005548788.localdomain sudo[38768]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:17 np0005548788.localdomain sudo[38785]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-endtxlsihhkguwnoncplradbvdsydxhy ; /usr/bin/python3
Dec 06 08:07:17 np0005548788.localdomain sudo[38785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:17 np0005548788.localdomain python3[38787]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:07:18 np0005548788.localdomain sshd[38753]: Received disconnect from 102.140.97.134 port 48684:11: Bye Bye [preauth]
Dec 06 08:07:18 np0005548788.localdomain sshd[38753]: Disconnected from authenticating user root 102.140.97.134 port 48684 [preauth]
Dec 06 08:07:21 np0005548788.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:07:21 np0005548788.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:07:21 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:07:21 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:07:21 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:07:21 np0005548788.localdomain systemd-rc-local-generator[38856]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:07:21 np0005548788.localdomain systemd-sysv-generator[38862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:07:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:07:21 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:07:21 np0005548788.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 06 08:07:22 np0005548788.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 06 08:07:22 np0005548788.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 06 08:07:22 np0005548788.localdomain systemd[1]: tuned.service: Consumed 2.044s CPU time.
Dec 06 08:07:22 np0005548788.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 08:07:22 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:07:22 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:07:22 np0005548788.localdomain systemd[1]: run-r57c3dfdbfec542589e40deda56a3dac0.service: Deactivated successfully.
Dec 06 08:07:23 np0005548788.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 08:07:23 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:07:23 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:07:23 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:07:23 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:07:23 np0005548788.localdomain systemd[1]: run-ra16e0379ecea41d2a91762adb82c55d5.service: Deactivated successfully.
Dec 06 08:07:24 np0005548788.localdomain sudo[38785]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:24 np0005548788.localdomain sudo[39222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymucfpgifkfqpmndgjuqwksopkruogcz ; /usr/bin/python3
Dec 06 08:07:24 np0005548788.localdomain sudo[39222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:24 np0005548788.localdomain python3[39224]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:07:24 np0005548788.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 06 08:07:24 np0005548788.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 06 08:07:24 np0005548788.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 06 08:07:24 np0005548788.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 08:07:26 np0005548788.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 08:07:26 np0005548788.localdomain sudo[39222]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:26 np0005548788.localdomain sudo[39417]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtfarszghelbqrrnrfjyklvqgqfirqnt ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 06 08:07:26 np0005548788.localdomain sudo[39417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:26 np0005548788.localdomain python3[39419]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:26 np0005548788.localdomain sudo[39417]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:27 np0005548788.localdomain sudo[39434]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfkwtfzrmlfnxtxqlgbhmjvfetssrthe ; /usr/bin/python3
Dec 06 08:07:27 np0005548788.localdomain sudo[39434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:27 np0005548788.localdomain python3[39436]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Dec 06 08:07:27 np0005548788.localdomain sudo[39434]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:27 np0005548788.localdomain sudo[39450]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpubhqvbvyhragpuozfqtlaxraydplzu ; /usr/bin/python3
Dec 06 08:07:27 np0005548788.localdomain sudo[39450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:27 np0005548788.localdomain python3[39452]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:27 np0005548788.localdomain sudo[39450]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:27 np0005548788.localdomain sudo[39466]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wiuxyaotmgbspepoldoswfyfuwbpqlhm ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 06 08:07:27 np0005548788.localdomain sudo[39466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:28 np0005548788.localdomain python3[39468]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:29 np0005548788.localdomain sudo[39466]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:29 np0005548788.localdomain sudo[39486]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajzfbacwxcozkqrpqohoxlpztwokscoq ; /usr/bin/python3
Dec 06 08:07:29 np0005548788.localdomain sudo[39486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:29 np0005548788.localdomain python3[39488]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:29 np0005548788.localdomain sudo[39486]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:30 np0005548788.localdomain sudo[39503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgwpkyaohoiiwrxiglrwfiemlmzzggsk ; /usr/bin/python3
Dec 06 08:07:30 np0005548788.localdomain sudo[39503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:30 np0005548788.localdomain python3[39505]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:30 np0005548788.localdomain sudo[39503]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:32 np0005548788.localdomain sudo[39519]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsibwvuuauurgvuskzfmceatzwvnbhnp ; /usr/bin/python3
Dec 06 08:07:32 np0005548788.localdomain sudo[39519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:32 np0005548788.localdomain python3[39521]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:32 np0005548788.localdomain sudo[39519]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:37 np0005548788.localdomain sudo[39535]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyzlfbttttbpkkuvorqouyylxpvkxinw ; /usr/bin/python3
Dec 06 08:07:37 np0005548788.localdomain sudo[39535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:37 np0005548788.localdomain python3[39537]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:37 np0005548788.localdomain sudo[39535]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:37 np0005548788.localdomain sudo[39583]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhcqxwykxijygbeocqgkpxxzulltespj ; /usr/bin/python3
Dec 06 08:07:37 np0005548788.localdomain sudo[39583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:38 np0005548788.localdomain python3[39585]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:38 np0005548788.localdomain sudo[39583]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:38 np0005548788.localdomain sudo[39628]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxhkxqmhewmxyiiermgpeeupdtjrbjsx ; /usr/bin/python3
Dec 06 08:07:38 np0005548788.localdomain sudo[39628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:38 np0005548788.localdomain python3[39630]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008457.8245826-71085-222169865197941/source _original_basename=tmpw29r3b3x follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:38 np0005548788.localdomain sudo[39628]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:38 np0005548788.localdomain sudo[39658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lseyduzligqyvghrbvogytfqotdagluh ; /usr/bin/python3
Dec 06 08:07:38 np0005548788.localdomain sudo[39658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:38 np0005548788.localdomain python3[39660]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:38 np0005548788.localdomain sudo[39658]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:39 np0005548788.localdomain sudo[39706]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqhpnvlofcmybqitswfqntvyspittqlr ; /usr/bin/python3
Dec 06 08:07:39 np0005548788.localdomain sudo[39706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:39 np0005548788.localdomain python3[39708]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:39 np0005548788.localdomain sudo[39706]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:39 np0005548788.localdomain sudo[39749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpuutrynibzcquaglofpfygcenpxpwaq ; /usr/bin/python3
Dec 06 08:07:39 np0005548788.localdomain sudo[39749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:39 np0005548788.localdomain python3[39751]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008459.3163774-71178-191467630384722/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=c7cc1670a1e268d7901b4353362279cc1f651214 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:39 np0005548788.localdomain sudo[39749]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:40 np0005548788.localdomain sudo[39763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:07:40 np0005548788.localdomain sudo[39763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:07:40 np0005548788.localdomain sudo[39763]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:40 np0005548788.localdomain sudo[39781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:07:40 np0005548788.localdomain sudo[39781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:07:40 np0005548788.localdomain sudo[39841]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywoghxrahpphkezmigaaynqpxnnibdfk ; /usr/bin/python3
Dec 06 08:07:40 np0005548788.localdomain sudo[39841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:40 np0005548788.localdomain python3[39843]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:40 np0005548788.localdomain sudo[39841]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:40 np0005548788.localdomain sudo[39912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eciuqtaojnrpmojoqogzbybzbfipmrwu ; /usr/bin/python3
Dec 06 08:07:40 np0005548788.localdomain sudo[39912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:40 np0005548788.localdomain sudo[39781]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:40 np0005548788.localdomain python3[39918]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008460.1461399-71235-24744498145195/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=8c98a1379d65c02b867387467a21d26fe82a1c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:40 np0005548788.localdomain sudo[39912]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:41 np0005548788.localdomain sudo[39965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:07:41 np0005548788.localdomain sudo[39965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:07:41 np0005548788.localdomain sudo[39965]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:41 np0005548788.localdomain sudo[39993]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfmfrgbhuafeqvsmxwrddsthqqeoddbe ; /usr/bin/python3
Dec 06 08:07:41 np0005548788.localdomain sudo[39993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:41 np0005548788.localdomain python3[39995]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:41 np0005548788.localdomain sudo[39993]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:41 np0005548788.localdomain sudo[40036]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsyvqsvzhzbmusufnauwkzwfjgjetphk ; /usr/bin/python3
Dec 06 08:07:41 np0005548788.localdomain sudo[40036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:41 np0005548788.localdomain python3[40038]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008461.0151143-71235-275896586927633/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=2906872dac8eb33feea0b6fc0243b65109687e47 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:41 np0005548788.localdomain sudo[40036]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:42 np0005548788.localdomain sudo[40098]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfsbvxtuezfevblcgnybzewlgzxjvgdr ; /usr/bin/python3
Dec 06 08:07:42 np0005548788.localdomain sudo[40098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:42 np0005548788.localdomain python3[40100]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:42 np0005548788.localdomain sudo[40098]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:42 np0005548788.localdomain sudo[40141]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usojloyxboadiwlhvfigwwycctzcuatg ; /usr/bin/python3
Dec 06 08:07:42 np0005548788.localdomain sudo[40141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:42 np0005548788.localdomain python3[40143]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008461.9451256-71235-77185414033742/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=68b5a56a66cb10764ef3288009ad5e9b7e8faf12 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:42 np0005548788.localdomain sudo[40141]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:43 np0005548788.localdomain sudo[40203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmioqthtzlhsnptepsztdryscmljfppc ; /usr/bin/python3
Dec 06 08:07:43 np0005548788.localdomain sudo[40203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:43 np0005548788.localdomain python3[40205]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:43 np0005548788.localdomain sudo[40203]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:43 np0005548788.localdomain sudo[40246]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jramlaccwwznbhuvhqrozedrudqdeqmg ; /usr/bin/python3
Dec 06 08:07:43 np0005548788.localdomain sudo[40246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:43 np0005548788.localdomain python3[40248]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008462.9609184-71235-254275953321865/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:43 np0005548788.localdomain sudo[40246]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:43 np0005548788.localdomain sudo[40308]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cclfbfgbmghyvmhljddyjtcpchuuadqu ; /usr/bin/python3
Dec 06 08:07:43 np0005548788.localdomain sudo[40308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:44 np0005548788.localdomain python3[40310]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:44 np0005548788.localdomain sudo[40308]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:44 np0005548788.localdomain sudo[40351]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbthdlrhatvwhdshdbvkkamyudtnmlfb ; /usr/bin/python3
Dec 06 08:07:44 np0005548788.localdomain sudo[40351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:44 np0005548788.localdomain python3[40353]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008463.7889118-71235-156680756162467/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=3ab2614d1ef7406fa9b5a05718303d5ebb288f64 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:44 np0005548788.localdomain sudo[40351]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:44 np0005548788.localdomain sudo[40413]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vchcbympvcnriuvbpxgnmsvauafxbmgb ; /usr/bin/python3
Dec 06 08:07:44 np0005548788.localdomain sudo[40413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:44 np0005548788.localdomain python3[40415]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:44 np0005548788.localdomain sudo[40413]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:45 np0005548788.localdomain sudo[40456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdanbrgxromvfmtreesefylrnjgutfju ; /usr/bin/python3
Dec 06 08:07:45 np0005548788.localdomain sudo[40456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:45 np0005548788.localdomain python3[40458]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008464.589149-71235-187375293090203/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:45 np0005548788.localdomain sudo[40456]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:45 np0005548788.localdomain sudo[40518]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnccswkkkfbywfywaufhktskglpdzlmr ; /usr/bin/python3
Dec 06 08:07:45 np0005548788.localdomain sudo[40518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:45 np0005548788.localdomain python3[40520]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:45 np0005548788.localdomain sudo[40518]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:46 np0005548788.localdomain sudo[40561]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrkytygcpckygtoqxjrwjftzkznwjkxk ; /usr/bin/python3
Dec 06 08:07:46 np0005548788.localdomain sudo[40561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:46 np0005548788.localdomain python3[40563]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008465.4683867-71235-136239499879787/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=955531133cc86a259eb018c78aadbdeb821782e0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:46 np0005548788.localdomain sudo[40561]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:46 np0005548788.localdomain sudo[40623]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfwcwaiwmxkzmwkolwbdsiusfrvrafld ; /usr/bin/python3
Dec 06 08:07:46 np0005548788.localdomain sudo[40623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:46 np0005548788.localdomain python3[40625]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:46 np0005548788.localdomain sudo[40623]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:46 np0005548788.localdomain sudo[40666]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrwysmcvftuitpmrpqwkrykxclhzltey ; /usr/bin/python3
Dec 06 08:07:46 np0005548788.localdomain sudo[40666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:47 np0005548788.localdomain python3[40668]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008466.354341-71235-222548211131446/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:47 np0005548788.localdomain sudo[40666]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:47 np0005548788.localdomain sudo[40728]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssecueguqbtbgggqzkiwztvydopozjln ; /usr/bin/python3
Dec 06 08:07:47 np0005548788.localdomain sudo[40728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:47 np0005548788.localdomain python3[40730]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:47 np0005548788.localdomain sudo[40728]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:47 np0005548788.localdomain sudo[40771]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udgrxaqgumoraygnqvbtixhdetfhwfcv ; /usr/bin/python3
Dec 06 08:07:47 np0005548788.localdomain sudo[40771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:47 np0005548788.localdomain python3[40773]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008467.2035425-71235-75639784879036/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:47 np0005548788.localdomain sudo[40771]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:48 np0005548788.localdomain sudo[40833]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaxptiptsqkbzowghzwvapelqtbwueqf ; /usr/bin/python3
Dec 06 08:07:48 np0005548788.localdomain sudo[40833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:48 np0005548788.localdomain python3[40835]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:48 np0005548788.localdomain sudo[40833]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:48 np0005548788.localdomain sudo[40876]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnqoqueiywxgkfqoozwauwbtuwshusym ; /usr/bin/python3
Dec 06 08:07:48 np0005548788.localdomain sudo[40876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:48 np0005548788.localdomain python3[40878]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008468.0985718-71235-137289557159190/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=d0ab0ad0a4d26d457fdb3ac7c2b24afa624b4802 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:48 np0005548788.localdomain sudo[40876]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:49 np0005548788.localdomain sudo[40906]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhowygmhmaddvivssncjrqarvrqkhkyc ; /usr/bin/python3
Dec 06 08:07:49 np0005548788.localdomain sudo[40906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:49 np0005548788.localdomain python3[40908]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:49 np0005548788.localdomain sudo[40906]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:49 np0005548788.localdomain sudo[40954]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeenmeppfnzpemlzisujbzgrdqvhjrxf ; /usr/bin/python3
Dec 06 08:07:49 np0005548788.localdomain sudo[40954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:50 np0005548788.localdomain python3[40956]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:50 np0005548788.localdomain sudo[40954]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:50 np0005548788.localdomain sudo[40997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imnvknjucxjszpzdnrouwwdykvamqxzb ; /usr/bin/python3
Dec 06 08:07:50 np0005548788.localdomain sudo[40997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:50 np0005548788.localdomain python3[40999]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008469.7312622-71863-257650071548316/source _original_basename=tmpn2rnd_t5 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:50 np0005548788.localdomain sudo[40997]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:54 np0005548788.localdomain sudo[41027]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvuqvgblyafouzxzlplixhongkbosutw ; /usr/bin/python3
Dec 06 08:07:54 np0005548788.localdomain sudo[41027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:55 np0005548788.localdomain python3[41029]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 08:07:55 np0005548788.localdomain sudo[41027]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:55 np0005548788.localdomain sudo[41088]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzlcdvctptmpqtanacowgijhgyyqftuc ; /usr/bin/python3
Dec 06 08:07:55 np0005548788.localdomain sudo[41088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:55 np0005548788.localdomain python3[41090]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:59 np0005548788.localdomain sudo[41088]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:59 np0005548788.localdomain sudo[41105]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rncbwziopycacpyhcdkwnpgcuwmbuhht ; /usr/bin/python3
Dec 06 08:07:59 np0005548788.localdomain sudo[41105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:00 np0005548788.localdomain python3[41107]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:04 np0005548788.localdomain sudo[41105]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:05 np0005548788.localdomain sudo[41122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuucwdxdzrmpxcbatqgncjbvndlicqrz ; /usr/bin/python3
Dec 06 08:08:05 np0005548788.localdomain sudo[41122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:05 np0005548788.localdomain python3[41124]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:05 np0005548788.localdomain sudo[41122]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:05 np0005548788.localdomain sudo[41145]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyiaqpqddlmsopbyruchofzahriyvlpr ; /usr/bin/python3
Dec 06 08:08:05 np0005548788.localdomain sudo[41145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:06 np0005548788.localdomain python3[41147]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:06 np0005548788.localdomain sudo[41145]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:06 np0005548788.localdomain sudo[41168]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyxhbbbtaljwddxxlgvrkbyypiojqesm ; /usr/bin/python3
Dec 06 08:08:06 np0005548788.localdomain sudo[41168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:06 np0005548788.localdomain python3[41170]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:06 np0005548788.localdomain sudo[41168]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:07 np0005548788.localdomain sudo[41191]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfainmoscthvkjuehmcfkiyttsrbiccf ; /usr/bin/python3
Dec 06 08:08:07 np0005548788.localdomain sudo[41191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:07 np0005548788.localdomain python3[41193]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:07 np0005548788.localdomain sudo[41191]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:07 np0005548788.localdomain sudo[41214]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnhgafmujamgmjlzfuwuaawrcjemzjqz ; /usr/bin/python3
Dec 06 08:08:07 np0005548788.localdomain sudo[41214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:07 np0005548788.localdomain python3[41216]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:07 np0005548788.localdomain sudo[41214]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:21 np0005548788.localdomain systemd[36159]: Starting Mark boot as successful...
Dec 06 08:08:21 np0005548788.localdomain systemd[36159]: Finished Mark boot as successful.
Dec 06 08:08:41 np0005548788.localdomain sudo[41225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:08:41 np0005548788.localdomain sudo[41225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:08:41 np0005548788.localdomain sudo[41225]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:41 np0005548788.localdomain sudo[41240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:08:41 np0005548788.localdomain sudo[41240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:08:42 np0005548788.localdomain sudo[41240]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:44 np0005548788.localdomain sudo[41287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:08:44 np0005548788.localdomain sudo[41287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:08:44 np0005548788.localdomain sudo[41287]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:50 np0005548788.localdomain sudo[41315]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pepjthuglgqwerpccgypzlfvqobhddlu ; /usr/bin/python3
Dec 06 08:08:50 np0005548788.localdomain sudo[41315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:50 np0005548788.localdomain python3[41317]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:50 np0005548788.localdomain sudo[41315]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:50 np0005548788.localdomain sudo[41363]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpjimtafzgjelrfxkbiyyqvnbgoznose ; /usr/bin/python3
Dec 06 08:08:50 np0005548788.localdomain sudo[41363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:50 np0005548788.localdomain python3[41365]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:50 np0005548788.localdomain sudo[41363]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:51 np0005548788.localdomain sudo[41381]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctzjhnfqmlauyieufseyctpdlbxvbrua ; /usr/bin/python3
Dec 06 08:08:51 np0005548788.localdomain sudo[41381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:51 np0005548788.localdomain python3[41383]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpagvo9_vm recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:51 np0005548788.localdomain sudo[41381]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:51 np0005548788.localdomain sudo[41411]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eowrkbotigkahjbcpjcdygdwroiohfwx ; /usr/bin/python3
Dec 06 08:08:51 np0005548788.localdomain sudo[41411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:51 np0005548788.localdomain python3[41413]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:51 np0005548788.localdomain sudo[41411]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:52 np0005548788.localdomain sudo[41459]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqjkazlntvzradpqzdpeeidfevplhgnq ; /usr/bin/python3
Dec 06 08:08:52 np0005548788.localdomain sudo[41459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:52 np0005548788.localdomain python3[41461]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:52 np0005548788.localdomain sudo[41459]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:52 np0005548788.localdomain sudo[41477]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yozavsnsoyhmmwnoejnpntscujspdhdq ; /usr/bin/python3
Dec 06 08:08:52 np0005548788.localdomain sudo[41477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:52 np0005548788.localdomain python3[41479]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:52 np0005548788.localdomain sudo[41477]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:53 np0005548788.localdomain sudo[41539]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtyqeafjuufqfgjztsddrzunyoyhjgcw ; /usr/bin/python3
Dec 06 08:08:53 np0005548788.localdomain sudo[41539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:53 np0005548788.localdomain python3[41541]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:53 np0005548788.localdomain sudo[41539]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:53 np0005548788.localdomain sudo[41557]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujdxzdunzatlhizaaadmfsgfxarqpjsy ; /usr/bin/python3
Dec 06 08:08:53 np0005548788.localdomain sudo[41557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:53 np0005548788.localdomain python3[41559]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:53 np0005548788.localdomain sudo[41557]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:53 np0005548788.localdomain sudo[41619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqvpkvnkyxnwpwhlcrhrzfllnikjyycn ; /usr/bin/python3
Dec 06 08:08:53 np0005548788.localdomain sudo[41619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:53 np0005548788.localdomain python3[41621]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:54 np0005548788.localdomain sudo[41619]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:54 np0005548788.localdomain sudo[41637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxoykktnwzbvqvohrksdplrufhgtoilj ; /usr/bin/python3
Dec 06 08:08:54 np0005548788.localdomain sudo[41637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:54 np0005548788.localdomain python3[41639]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:54 np0005548788.localdomain sudo[41637]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:54 np0005548788.localdomain sudo[41699]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhptoiwghixncapeytjtvkrjhfjgqpdn ; /usr/bin/python3
Dec 06 08:08:54 np0005548788.localdomain sudo[41699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:54 np0005548788.localdomain python3[41701]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:54 np0005548788.localdomain sudo[41699]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:54 np0005548788.localdomain sudo[41717]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqglzpvulkvgvegpwnvodyoicvfhszod ; /usr/bin/python3
Dec 06 08:08:54 np0005548788.localdomain sudo[41717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:55 np0005548788.localdomain python3[41719]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:55 np0005548788.localdomain sudo[41717]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:55 np0005548788.localdomain sudo[41779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsnjrxxxxantadawnhqfcrkyiqabvgna ; /usr/bin/python3
Dec 06 08:08:55 np0005548788.localdomain sudo[41779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:55 np0005548788.localdomain python3[41781]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:55 np0005548788.localdomain sudo[41779]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:55 np0005548788.localdomain sudo[41797]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-heoriahguvhnupaewkrreccwkquarrno ; /usr/bin/python3
Dec 06 08:08:55 np0005548788.localdomain sudo[41797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:55 np0005548788.localdomain python3[41799]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:55 np0005548788.localdomain sudo[41797]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:56 np0005548788.localdomain sudo[41859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqaggmdwzssmdxvzaljulkpzrhobcxol ; /usr/bin/python3
Dec 06 08:08:56 np0005548788.localdomain sudo[41859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:56 np0005548788.localdomain python3[41861]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:56 np0005548788.localdomain sudo[41859]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:56 np0005548788.localdomain sudo[41877]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayhgzinhkziwtbjjqxwlwswnsdpiditb ; /usr/bin/python3
Dec 06 08:08:56 np0005548788.localdomain sudo[41877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:56 np0005548788.localdomain python3[41879]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:56 np0005548788.localdomain sudo[41877]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:56 np0005548788.localdomain sudo[41939]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxnxcizsyxdrlozghjxsyuiolywcsygw ; /usr/bin/python3
Dec 06 08:08:56 np0005548788.localdomain sudo[41939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:57 np0005548788.localdomain python3[41941]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:57 np0005548788.localdomain sudo[41939]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:57 np0005548788.localdomain sudo[41957]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntppbmqmfotgqfigtianrtizvvzqjxsp ; /usr/bin/python3
Dec 06 08:08:57 np0005548788.localdomain sudo[41957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:57 np0005548788.localdomain python3[41959]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:57 np0005548788.localdomain sudo[41957]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:57 np0005548788.localdomain sudo[42019]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gokkdncjdcashvlqebtahqhgjpihjewz ; /usr/bin/python3
Dec 06 08:08:57 np0005548788.localdomain sudo[42019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:57 np0005548788.localdomain python3[42021]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:57 np0005548788.localdomain sudo[42019]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:57 np0005548788.localdomain sudo[42037]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgmbqkaciaibcwqjaruwbbmyhdlgqaqx ; /usr/bin/python3
Dec 06 08:08:57 np0005548788.localdomain sudo[42037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:58 np0005548788.localdomain python3[42039]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:58 np0005548788.localdomain sudo[42037]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:58 np0005548788.localdomain sudo[42099]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfslcnetgtuexjkerqwigpzoyxgnugxl ; /usr/bin/python3
Dec 06 08:08:58 np0005548788.localdomain sudo[42099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:58 np0005548788.localdomain python3[42101]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:58 np0005548788.localdomain sudo[42099]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:58 np0005548788.localdomain sudo[42117]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aydekmdjajerobstwvffusmuwlamfwzw ; /usr/bin/python3
Dec 06 08:08:58 np0005548788.localdomain sudo[42117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:58 np0005548788.localdomain python3[42119]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:58 np0005548788.localdomain sudo[42117]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:59 np0005548788.localdomain sudo[42179]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdndoqldwxkgjmuetrqctdvaicpwbjaj ; /usr/bin/python3
Dec 06 08:08:59 np0005548788.localdomain sudo[42179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:59 np0005548788.localdomain python3[42181]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:59 np0005548788.localdomain sudo[42179]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:59 np0005548788.localdomain sudo[42197]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtulebewtbjjfphsncwayicrzwmhgjqb ; /usr/bin/python3
Dec 06 08:08:59 np0005548788.localdomain sudo[42197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:59 np0005548788.localdomain python3[42199]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:59 np0005548788.localdomain sudo[42197]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:59 np0005548788.localdomain sudo[42259]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goqjmfekmpchzwjrxoaergebohkxpchc ; /usr/bin/python3
Dec 06 08:08:59 np0005548788.localdomain sudo[42259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:00 np0005548788.localdomain python3[42261]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:00 np0005548788.localdomain sudo[42259]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:00 np0005548788.localdomain sudo[42277]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlhdczpbwevzslmlwtpohmtedlmuimts ; /usr/bin/python3
Dec 06 08:09:00 np0005548788.localdomain sudo[42277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:00 np0005548788.localdomain python3[42279]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:00 np0005548788.localdomain sudo[42277]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:00 np0005548788.localdomain sudo[42307]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkzqqrmstthgljhzcglxzlklpngqxetp ; /usr/bin/python3
Dec 06 08:09:00 np0005548788.localdomain sudo[42307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:01 np0005548788.localdomain python3[42309]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:09:01 np0005548788.localdomain sudo[42307]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:01 np0005548788.localdomain sudo[42355]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkdhgxkxowqnzquilszejuvywemigpcn ; /usr/bin/python3
Dec 06 08:09:01 np0005548788.localdomain sudo[42355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:01 np0005548788.localdomain python3[42357]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:01 np0005548788.localdomain sudo[42355]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:02 np0005548788.localdomain sudo[42373]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sapctdsmauivcsouzmavfkewwcwunzbb ; /usr/bin/python3
Dec 06 08:09:02 np0005548788.localdomain sudo[42373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:02 np0005548788.localdomain python3[42375]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpz442gpd6 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:02 np0005548788.localdomain sudo[42373]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:04 np0005548788.localdomain sudo[42403]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvnlptbogucfiemnvcjrcjaqocgzabxo ; /usr/bin/python3
Dec 06 08:09:04 np0005548788.localdomain sudo[42403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:04 np0005548788.localdomain python3[42405]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:09:07 np0005548788.localdomain sudo[42403]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:09 np0005548788.localdomain sudo[42420]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncrvcuvlghcrgleohxnaxdaykwtugkah ; /usr/bin/python3
Dec 06 08:09:09 np0005548788.localdomain sudo[42420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:09 np0005548788.localdomain python3[42422]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:09:09 np0005548788.localdomain sudo[42420]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:09 np0005548788.localdomain sudo[42438]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lczoioamjtasmcppfcfryjsgjekejqha ; /usr/bin/python3
Dec 06 08:09:09 np0005548788.localdomain sudo[42438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:09 np0005548788.localdomain python3[42440]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:09:10 np0005548788.localdomain sudo[42438]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:10 np0005548788.localdomain sudo[42456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nupdbdwrpyzupasypfvapqmgdsfsecrz ; /usr/bin/python3
Dec 06 08:09:10 np0005548788.localdomain sudo[42456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:10 np0005548788.localdomain python3[42458]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:09:10 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:09:10 np0005548788.localdomain systemd-rc-local-generator[42484]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:09:10 np0005548788.localdomain systemd-sysv-generator[42489]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:09:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:09:10 np0005548788.localdomain systemd[1]: Starting Netfilter Tables...
Dec 06 08:09:10 np0005548788.localdomain systemd[1]: Finished Netfilter Tables.
Dec 06 08:09:10 np0005548788.localdomain sudo[42456]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:11 np0005548788.localdomain sudo[42546]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvhtendikzzibdliusfkcadqrrfyhkkp ; /usr/bin/python3
Dec 06 08:09:11 np0005548788.localdomain sudo[42546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:11 np0005548788.localdomain python3[42548]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:11 np0005548788.localdomain sudo[42546]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:11 np0005548788.localdomain sudo[42589]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfchcwurggejiodfbqxxpvckdxvmmjfz ; /usr/bin/python3
Dec 06 08:09:11 np0005548788.localdomain sudo[42589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:11 np0005548788.localdomain python3[42591]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008551.207546-74804-120518329381826/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:11 np0005548788.localdomain sudo[42589]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:12 np0005548788.localdomain sudo[42619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrqwbhfijimyaggxftmixxjozntrgbjo ; /usr/bin/python3
Dec 06 08:09:12 np0005548788.localdomain sudo[42619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:12 np0005548788.localdomain python3[42621]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:12 np0005548788.localdomain sudo[42619]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:12 np0005548788.localdomain sudo[42637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcioxorawpwkkfxurnnahtzvklngnmnp ; /usr/bin/python3
Dec 06 08:09:12 np0005548788.localdomain sudo[42637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:12 np0005548788.localdomain python3[42639]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:12 np0005548788.localdomain sudo[42637]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:13 np0005548788.localdomain sudo[42686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zibyimfbdsdmvdepqaeiazpgcvpgkczo ; /usr/bin/python3
Dec 06 08:09:13 np0005548788.localdomain sudo[42686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:13 np0005548788.localdomain python3[42688]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:13 np0005548788.localdomain sudo[42686]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:13 np0005548788.localdomain sudo[42729]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgzaohquxvghdwethloapwoklyjbmdfe ; /usr/bin/python3
Dec 06 08:09:13 np0005548788.localdomain sudo[42729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:13 np0005548788.localdomain python3[42731]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008553.105158-74937-211825104154996/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:13 np0005548788.localdomain sudo[42729]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:14 np0005548788.localdomain sudo[42791]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qawtmfgzoxhoowvewknikxtmuhzleowh ; /usr/bin/python3
Dec 06 08:09:14 np0005548788.localdomain sudo[42791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:14 np0005548788.localdomain python3[42793]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:14 np0005548788.localdomain sudo[42791]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:14 np0005548788.localdomain sudo[42834]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flbmpirkrwyaufkopniifgxjnbhwfyof ; /usr/bin/python3
Dec 06 08:09:14 np0005548788.localdomain sudo[42834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:14 np0005548788.localdomain python3[42836]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008554.0664792-74996-87031528466871/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:14 np0005548788.localdomain sudo[42834]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:15 np0005548788.localdomain sudo[42896]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nimrslldxcbyonwvulycrrdnzccpltne ; /usr/bin/python3
Dec 06 08:09:15 np0005548788.localdomain sudo[42896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:15 np0005548788.localdomain python3[42898]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:15 np0005548788.localdomain sudo[42896]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:15 np0005548788.localdomain sudo[42939]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otmdemcxzhesetihmvyjdjefiobpplyq ; /usr/bin/python3
Dec 06 08:09:15 np0005548788.localdomain sudo[42939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:15 np0005548788.localdomain python3[42941]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008555.1516433-75065-13637964437891/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:15 np0005548788.localdomain sudo[42939]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:16 np0005548788.localdomain sudo[43001]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zllcrtebzpzwpkjdsqxggfszbxehbvkz ; /usr/bin/python3
Dec 06 08:09:16 np0005548788.localdomain sudo[43001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:16 np0005548788.localdomain python3[43003]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:16 np0005548788.localdomain sudo[43001]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:16 np0005548788.localdomain sudo[43044]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnykgagvsgcswbhkkwxhjawscgpaoljc ; /usr/bin/python3
Dec 06 08:09:16 np0005548788.localdomain sudo[43044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:16 np0005548788.localdomain python3[43046]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008556.0935783-75128-111529700061444/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:16 np0005548788.localdomain sudo[43044]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:17 np0005548788.localdomain sudo[43106]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dihfcgvywgllhhnsuyjokdnschsszybb ; /usr/bin/python3
Dec 06 08:09:17 np0005548788.localdomain sudo[43106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:17 np0005548788.localdomain python3[43108]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:17 np0005548788.localdomain sudo[43106]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:18 np0005548788.localdomain sudo[43149]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndglwxlrqoyqbpiksoofuiuenshpekgh ; /usr/bin/python3
Dec 06 08:09:18 np0005548788.localdomain sudo[43149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:18 np0005548788.localdomain python3[43151]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008557.045775-75216-264227523398686/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:18 np0005548788.localdomain sudo[43149]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:18 np0005548788.localdomain sudo[43179]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-segxhrkemhgcpilfuziknucrbndieffm ; /usr/bin/python3
Dec 06 08:09:18 np0005548788.localdomain sudo[43179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:18 np0005548788.localdomain python3[43181]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:19 np0005548788.localdomain sudo[43179]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:19 np0005548788.localdomain sudo[43245]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqdxmexnnldzijcuwpqvjsfusvziekcp ; /usr/bin/python3
Dec 06 08:09:19 np0005548788.localdomain sudo[43245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:19 np0005548788.localdomain python3[43247]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:19 np0005548788.localdomain sudo[43245]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:19 np0005548788.localdomain sudo[43262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmipphnzviadpnxgbldthtnhfotmgrom ; /usr/bin/python3
Dec 06 08:09:19 np0005548788.localdomain sudo[43262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:19 np0005548788.localdomain python3[43264]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:19 np0005548788.localdomain sudo[43262]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:20 np0005548788.localdomain sudo[43279]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdvutlgibtkejtfdlcbcbzxrflbjrdmw ; /usr/bin/python3
Dec 06 08:09:20 np0005548788.localdomain sudo[43279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:20 np0005548788.localdomain python3[43281]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:20 np0005548788.localdomain sudo[43279]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:20 np0005548788.localdomain sudo[43298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpfpwxbfrrdoobkqseumycwrzgvsotme ; /usr/bin/python3
Dec 06 08:09:20 np0005548788.localdomain sudo[43298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:20 np0005548788.localdomain python3[43300]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:20 np0005548788.localdomain sudo[43298]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:20 np0005548788.localdomain sudo[43314]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swkhmhbjuroikhmvjgycscaqjeewbwvf ; /usr/bin/python3
Dec 06 08:09:20 np0005548788.localdomain sudo[43314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:21 np0005548788.localdomain python3[43316]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:21 np0005548788.localdomain sudo[43314]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:21 np0005548788.localdomain sudo[43330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fheqshwbshnbmnyvgycarcdvyykitrnl ; /usr/bin/python3
Dec 06 08:09:21 np0005548788.localdomain sudo[43330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:21 np0005548788.localdomain python3[43332]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:21 np0005548788.localdomain sudo[43330]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:21 np0005548788.localdomain sudo[43346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgrxiesdxujbawupdjamewpiqwrjcfus ; /usr/bin/python3
Dec 06 08:09:21 np0005548788.localdomain sudo[43346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:21 np0005548788.localdomain python3[43348]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 06 08:09:22 np0005548788.localdomain sudo[43346]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:22 np0005548788.localdomain sudo[43367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcsyqqbznvhosiaaqfwokvgqiwnmdxpk ; /usr/bin/python3
Dec 06 08:09:22 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Dec 06 08:09:22 np0005548788.localdomain sudo[43367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:23 np0005548788.localdomain python3[43369]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 06 08:09:23 np0005548788.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Dec 06 08:09:23 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:09:23 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:09:23 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:09:23 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:09:23 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:09:23 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:09:23 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:09:24 np0005548788.localdomain sudo[43367]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:24 np0005548788.localdomain sudo[43388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alanhxyjlnigeljhdkdwlyeorovnpyem ; /usr/bin/python3
Dec 06 08:09:24 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 06 08:09:24 np0005548788.localdomain sudo[43388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:24 np0005548788.localdomain python3[43390]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 06 08:09:25 np0005548788.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Dec 06 08:09:25 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:09:25 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:09:25 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:09:25 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:09:25 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:09:25 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:09:25 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:09:25 np0005548788.localdomain sudo[43388]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:25 np0005548788.localdomain sudo[43409]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-runymwhinfexmxscbmmqqhkfmzdagmsb ; /usr/bin/python3
Dec 06 08:09:25 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 06 08:09:25 np0005548788.localdomain sudo[43409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:25 np0005548788.localdomain python3[43411]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 06 08:09:26 np0005548788.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Dec 06 08:09:26 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:09:26 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:09:26 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:09:26 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:09:26 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:09:26 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:09:26 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:09:26 np0005548788.localdomain sudo[43409]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:26 np0005548788.localdomain sudo[43430]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukvruebozaxkpifllcbenxhaubvezlkc ; /usr/bin/python3
Dec 06 08:09:26 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 06 08:09:26 np0005548788.localdomain sudo[43430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:27 np0005548788.localdomain python3[43432]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:27 np0005548788.localdomain sudo[43430]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:27 np0005548788.localdomain sudo[43446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipetgbtufyvszxypcuqrviyhqmpvsyru ; /usr/bin/python3
Dec 06 08:09:27 np0005548788.localdomain sudo[43446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:27 np0005548788.localdomain python3[43448]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:27 np0005548788.localdomain sudo[43446]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:27 np0005548788.localdomain sudo[43462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dryjnudxjaxnygwgymolnrdffqruoyqf ; /usr/bin/python3
Dec 06 08:09:27 np0005548788.localdomain sudo[43462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:27 np0005548788.localdomain python3[43464]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:27 np0005548788.localdomain sudo[43462]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:28 np0005548788.localdomain sudo[43478]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fumszsjdwlhlxosatmymxoqkngljscgu ; /usr/bin/python3
Dec 06 08:09:28 np0005548788.localdomain sudo[43478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:28 np0005548788.localdomain python3[43480]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:09:28 np0005548788.localdomain sudo[43478]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:28 np0005548788.localdomain sudo[43494]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-torkjksepzfacspzemuqbpxfggrtvjnz ; /usr/bin/python3
Dec 06 08:09:28 np0005548788.localdomain sudo[43494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:28 np0005548788.localdomain python3[43496]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:28 np0005548788.localdomain sudo[43494]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:29 np0005548788.localdomain sudo[43511]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncikrtrxzkmophxuplvbapsotbcefsle ; /usr/bin/python3
Dec 06 08:09:29 np0005548788.localdomain sudo[43511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:29 np0005548788.localdomain python3[43513]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:09:32 np0005548788.localdomain sudo[43511]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:32 np0005548788.localdomain sudo[43528]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkfnobdzeowquerhygcqsqjfqzmopszo ; /usr/bin/python3
Dec 06 08:09:32 np0005548788.localdomain sudo[43528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:33 np0005548788.localdomain python3[43530]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:33 np0005548788.localdomain sudo[43528]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:33 np0005548788.localdomain sudo[43576]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwevyosbbsamniynmnlqrxtnfxiflkgv ; /usr/bin/python3
Dec 06 08:09:33 np0005548788.localdomain sudo[43576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:34 np0005548788.localdomain python3[43578]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:34 np0005548788.localdomain sudo[43576]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:34 np0005548788.localdomain sshd[43579]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:35 np0005548788.localdomain sudo[43621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nanvhhkcjovdlvcwwbuagpqndmnfnfmp ; /usr/bin/python3
Dec 06 08:09:35 np0005548788.localdomain sudo[43621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:35 np0005548788.localdomain python3[43623]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008573.2848263-75998-184156632601839/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:35 np0005548788.localdomain sudo[43621]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:35 np0005548788.localdomain sudo[43651]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nigfnxwxjdshatymnsqdawxhkviqftfx ; /usr/bin/python3
Dec 06 08:09:35 np0005548788.localdomain sudo[43651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:36 np0005548788.localdomain python3[43653]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 08:09:36 np0005548788.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 08:09:36 np0005548788.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 06 08:09:36 np0005548788.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 06 08:09:36 np0005548788.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 06 08:09:36 np0005548788.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 06 08:09:36 np0005548788.localdomain kernel: Bridge firewalling registered
Dec 06 08:09:36 np0005548788.localdomain systemd-modules-load[43656]: Inserted module 'br_netfilter'
Dec 06 08:09:36 np0005548788.localdomain systemd-modules-load[43656]: Module 'msr' is built in
Dec 06 08:09:36 np0005548788.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 06 08:09:36 np0005548788.localdomain sudo[43651]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:36 np0005548788.localdomain sudo[43705]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oahjkcdapcsdxbxrhfpokfxbgtkdjjut ; /usr/bin/python3
Dec 06 08:09:36 np0005548788.localdomain sudo[43705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:36 np0005548788.localdomain python3[43707]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:36 np0005548788.localdomain sudo[43705]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:36 np0005548788.localdomain sudo[43748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quplzyffvuqghofcccaacedrxyntfxax ; /usr/bin/python3
Dec 06 08:09:36 np0005548788.localdomain sudo[43748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:37 np0005548788.localdomain python3[43750]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008576.3305564-76169-101186694521355/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:37 np0005548788.localdomain sudo[43748]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:37 np0005548788.localdomain sudo[43778]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eunhxgyqppxjgtpaawbexnvizfrmpbye ; /usr/bin/python3
Dec 06 08:09:37 np0005548788.localdomain sudo[43778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:37 np0005548788.localdomain python3[43780]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:37 np0005548788.localdomain sudo[43778]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:37 np0005548788.localdomain sudo[43795]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oclmnmeulcrontbwivgftvnqkponmizr ; /usr/bin/python3
Dec 06 08:09:37 np0005548788.localdomain sudo[43795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:37 np0005548788.localdomain python3[43797]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:37 np0005548788.localdomain sudo[43795]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:37 np0005548788.localdomain sshd[43579]: Received disconnect from 102.140.97.134 port 39218:11: Bye Bye [preauth]
Dec 06 08:09:37 np0005548788.localdomain sshd[43579]: Disconnected from authenticating user root 102.140.97.134 port 39218 [preauth]
Dec 06 08:09:37 np0005548788.localdomain sudo[43813]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtgcyjflzoofsmdfmzrkhghizhznflpm ; /usr/bin/python3
Dec 06 08:09:37 np0005548788.localdomain sudo[43813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:38 np0005548788.localdomain python3[43815]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:38 np0005548788.localdomain sudo[43813]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:38 np0005548788.localdomain sudo[43831]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyfckfhsjsvzkxrscfvzadrzdurrnvzt ; /usr/bin/python3
Dec 06 08:09:38 np0005548788.localdomain sudo[43831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:38 np0005548788.localdomain python3[43833]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:38 np0005548788.localdomain sudo[43831]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:38 np0005548788.localdomain sudo[43848]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lejmpkjnwcyxdfpvdwalczfkozrvnlfk ; /usr/bin/python3
Dec 06 08:09:38 np0005548788.localdomain sudo[43848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:38 np0005548788.localdomain python3[43850]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:38 np0005548788.localdomain sudo[43848]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:38 np0005548788.localdomain sudo[43865]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itjwoquscwffykvqlznkgmymrnnqipux ; /usr/bin/python3
Dec 06 08:09:38 np0005548788.localdomain sudo[43865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:39 np0005548788.localdomain python3[43867]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:39 np0005548788.localdomain sudo[43865]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:39 np0005548788.localdomain sudo[43882]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkhvizphpzooeexwgxsxmsvtgszsdyui ; /usr/bin/python3
Dec 06 08:09:39 np0005548788.localdomain sudo[43882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:39 np0005548788.localdomain python3[43884]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:39 np0005548788.localdomain sudo[43882]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:39 np0005548788.localdomain sudo[43900]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idnljfhvfrdclipnaxpzhoguevepewrp ; /usr/bin/python3
Dec 06 08:09:39 np0005548788.localdomain sudo[43900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:39 np0005548788.localdomain python3[43902]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:39 np0005548788.localdomain sudo[43900]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:39 np0005548788.localdomain sudo[43918]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtbmyieujeieebbvxhhesqvcfrfoqxia ; /usr/bin/python3
Dec 06 08:09:39 np0005548788.localdomain sudo[43918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:39 np0005548788.localdomain python3[43920]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:39 np0005548788.localdomain sudo[43918]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:40 np0005548788.localdomain sudo[43936]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlujztwiuxwarrykpoqidznmmqosuibu ; /usr/bin/python3
Dec 06 08:09:40 np0005548788.localdomain sudo[43936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:40 np0005548788.localdomain python3[43938]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:40 np0005548788.localdomain sudo[43936]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:40 np0005548788.localdomain sudo[43954]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqvmobiewhkcluobrescxtisyoughvqy ; /usr/bin/python3
Dec 06 08:09:40 np0005548788.localdomain sudo[43954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:40 np0005548788.localdomain python3[43956]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:40 np0005548788.localdomain sudo[43954]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:40 np0005548788.localdomain sudo[43972]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfcnvywsxgibqebyopygqfyogvjvwnyb ; /usr/bin/python3
Dec 06 08:09:40 np0005548788.localdomain sudo[43972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:40 np0005548788.localdomain python3[43974]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:40 np0005548788.localdomain sudo[43972]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:41 np0005548788.localdomain sudo[43990]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxkgpfcuksqplxldcggcssiyaaudzzvp ; /usr/bin/python3
Dec 06 08:09:41 np0005548788.localdomain sudo[43990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:41 np0005548788.localdomain python3[43992]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:41 np0005548788.localdomain sudo[43990]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:41 np0005548788.localdomain sudo[44008]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmaddhaimuwqdfpadchsijccecncrcxy ; /usr/bin/python3
Dec 06 08:09:41 np0005548788.localdomain sudo[44008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:41 np0005548788.localdomain python3[44010]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:41 np0005548788.localdomain sudo[44008]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:41 np0005548788.localdomain sudo[44025]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvausqfoydamrbdwrduopowrgeuegmop ; /usr/bin/python3
Dec 06 08:09:41 np0005548788.localdomain sudo[44025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:41 np0005548788.localdomain python3[44027]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:41 np0005548788.localdomain sudo[44025]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:41 np0005548788.localdomain sudo[44042]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aomqbblhffbojzrmoxgznxoucooafunw ; /usr/bin/python3
Dec 06 08:09:41 np0005548788.localdomain sudo[44042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:42 np0005548788.localdomain python3[44044]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:42 np0005548788.localdomain sudo[44042]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:42 np0005548788.localdomain sudo[44059]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxjibebtnzlliceftookiqfnrvldohji ; /usr/bin/python3
Dec 06 08:09:42 np0005548788.localdomain sudo[44059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:42 np0005548788.localdomain python3[44061]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:42 np0005548788.localdomain sudo[44059]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:42 np0005548788.localdomain sudo[44076]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzyfoikbxtqglveydivpxrjesmlmzksi ; /usr/bin/python3
Dec 06 08:09:42 np0005548788.localdomain sudo[44076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:42 np0005548788.localdomain python3[44078]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:42 np0005548788.localdomain sudo[44076]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:42 np0005548788.localdomain sudo[44094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrhgurbcocwuilbvtoyyvddspmpuqujl ; /usr/bin/python3
Dec 06 08:09:42 np0005548788.localdomain sudo[44094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:43 np0005548788.localdomain python3[44096]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 08:09:43 np0005548788.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 06 08:09:43 np0005548788.localdomain systemd[1]: Stopped Apply Kernel Variables.
Dec 06 08:09:43 np0005548788.localdomain systemd[1]: Stopping Apply Kernel Variables...
Dec 06 08:09:43 np0005548788.localdomain systemd[1]: Starting Apply Kernel Variables...
Dec 06 08:09:43 np0005548788.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 06 08:09:43 np0005548788.localdomain systemd[1]: Finished Apply Kernel Variables.
Dec 06 08:09:43 np0005548788.localdomain sudo[44094]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:43 np0005548788.localdomain sudo[44114]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezpkectmttaemoljypdddvizxkuezesj ; /usr/bin/python3
Dec 06 08:09:43 np0005548788.localdomain sudo[44114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:43 np0005548788.localdomain python3[44116]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:43 np0005548788.localdomain sudo[44114]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:43 np0005548788.localdomain sudo[44130]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkzyfisdqwxhvrglrlqjsepsjotbdopd ; /usr/bin/python3
Dec 06 08:09:43 np0005548788.localdomain sudo[44130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:43 np0005548788.localdomain python3[44132]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:43 np0005548788.localdomain sudo[44130]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:43 np0005548788.localdomain sudo[44146]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neeerespcvwevuvsjbdqamqctahtfzzo ; /usr/bin/python3
Dec 06 08:09:43 np0005548788.localdomain sudo[44146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:44 np0005548788.localdomain python3[44148]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:44 np0005548788.localdomain sudo[44146]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548788.localdomain sudo[44162]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfkgvangyojewyxoczalgclffmruqfbl ; /usr/bin/python3
Dec 06 08:09:44 np0005548788.localdomain sudo[44162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:44 np0005548788.localdomain python3[44164]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:09:44 np0005548788.localdomain sudo[44162]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548788.localdomain sudo[44178]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arnwcznzedramxiifxvjedrzlireonae ; /usr/bin/python3
Dec 06 08:09:44 np0005548788.localdomain sudo[44178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:44 np0005548788.localdomain python3[44180]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:44 np0005548788.localdomain sudo[44178]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548788.localdomain sudo[44181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:09:44 np0005548788.localdomain sudo[44181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:44 np0005548788.localdomain sudo[44181]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548788.localdomain sudo[44208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjptwvbbqtrfdqgzennofmnsdbyxyvau ; /usr/bin/python3
Dec 06 08:09:44 np0005548788.localdomain sudo[44208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:44 np0005548788.localdomain sudo[44211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:09:44 np0005548788.localdomain sudo[44211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:45 np0005548788.localdomain python3[44224]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:45 np0005548788.localdomain sudo[44208]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:45 np0005548788.localdomain sudo[44242]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nznfveojnlfbeokkqgpvkclsfatrawwv ; /usr/bin/python3
Dec 06 08:09:45 np0005548788.localdomain sudo[44242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:45 np0005548788.localdomain sudo[44211]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:45 np0005548788.localdomain python3[44255]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:45 np0005548788.localdomain sudo[44242]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:45 np0005548788.localdomain sudo[44277]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vccnaehimvyywswuhaihklcufwiyodba ; /usr/bin/python3
Dec 06 08:09:45 np0005548788.localdomain sudo[44277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:45 np0005548788.localdomain python3[44279]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:45 np0005548788.localdomain sudo[44277]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:45 np0005548788.localdomain sudo[44293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmzmegpdcfonvievotgzuyxsnvgiugvc ; /usr/bin/python3
Dec 06 08:09:45 np0005548788.localdomain sudo[44293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:46 np0005548788.localdomain python3[44295]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:46 np0005548788.localdomain sudo[44293]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:46 np0005548788.localdomain sudo[44309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:09:46 np0005548788.localdomain sudo[44309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:46 np0005548788.localdomain sudo[44309]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:46 np0005548788.localdomain sudo[44367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osmothsfpdowktvzelylaugmqhlfzqjk ; /usr/bin/python3
Dec 06 08:09:46 np0005548788.localdomain sudo[44367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:46 np0005548788.localdomain sudo[44347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:09:46 np0005548788.localdomain sudo[44347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:46 np0005548788.localdomain python3[44372]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:46 np0005548788.localdomain sudo[44367]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:46 np0005548788.localdomain sudo[44429]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbyhfeciugeuezwibxsisrofejnsovad ; /usr/bin/python3
Dec 06 08:09:46 np0005548788.localdomain sudo[44429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:46 np0005548788.localdomain python3[44431]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008586.201568-76559-170586672893361/source _original_basename=tmpujfobuab follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:46 np0005548788.localdomain sudo[44429]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:46 np0005548788.localdomain sudo[44347]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:47 np0005548788.localdomain sudo[44476]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mugamlrzmidplyggavlyyuytfcltwhrp ; /usr/bin/python3
Dec 06 08:09:47 np0005548788.localdomain sudo[44476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:47 np0005548788.localdomain python3[44478]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:47 np0005548788.localdomain sudo[44476]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:47 np0005548788.localdomain sudo[44480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:09:47 np0005548788.localdomain sudo[44480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:47 np0005548788.localdomain sudo[44480]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:48 np0005548788.localdomain sudo[44508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajpjiixnnsyudqunbutjmmvvvtsjjstd ; /usr/bin/python3
Dec 06 08:09:48 np0005548788.localdomain sudo[44508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:48 np0005548788.localdomain python3[44510]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:48 np0005548788.localdomain sudo[44508]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:49 np0005548788.localdomain sudo[44556]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlxgjtmgclbpodjgyexakocbxpzqclbf ; /usr/bin/python3
Dec 06 08:09:49 np0005548788.localdomain sudo[44556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:49 np0005548788.localdomain python3[44558]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:49 np0005548788.localdomain sudo[44556]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:49 np0005548788.localdomain sudo[44599]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgsextmxalbhzhewjjvqnlcxnlyjmltw ; /usr/bin/python3
Dec 06 08:09:49 np0005548788.localdomain sudo[44599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:49 np0005548788.localdomain python3[44601]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008588.9163232-76770-130643472400846/source _original_basename=tmprz78zu5u follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:49 np0005548788.localdomain sudo[44599]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:49 np0005548788.localdomain sudo[44629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omhwiwqvayrlijgaqmytbbfczwapginm ; /usr/bin/python3
Dec 06 08:09:49 np0005548788.localdomain sudo[44629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:50 np0005548788.localdomain python3[44631]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:50 np0005548788.localdomain sudo[44629]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:50 np0005548788.localdomain sudo[44645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojqotdbmwbccfmeztqjarzpegfeajiko ; /usr/bin/python3
Dec 06 08:09:50 np0005548788.localdomain sudo[44645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:50 np0005548788.localdomain python3[44647]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:50 np0005548788.localdomain sudo[44645]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:50 np0005548788.localdomain sudo[44661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtdzvvqeumfyuqchogoimcwrhbzlmavh ; /usr/bin/python3
Dec 06 08:09:50 np0005548788.localdomain sudo[44661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:50 np0005548788.localdomain python3[44663]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:50 np0005548788.localdomain sudo[44661]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:50 np0005548788.localdomain sudo[44677]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxnfxyrgcgxjkfatbrfnnbooskczhbte ; /usr/bin/python3
Dec 06 08:09:50 np0005548788.localdomain sudo[44677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:51 np0005548788.localdomain python3[44679]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:51 np0005548788.localdomain sudo[44677]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:51 np0005548788.localdomain sudo[44693]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goczavetembnhvnvuzvivldmdyyltaby ; /usr/bin/python3
Dec 06 08:09:51 np0005548788.localdomain sudo[44693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:51 np0005548788.localdomain python3[44695]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:51 np0005548788.localdomain sudo[44693]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:51 np0005548788.localdomain sudo[44709]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxhrabydrpeyirwxjbanynfwzdwqzpoz ; /usr/bin/python3
Dec 06 08:09:51 np0005548788.localdomain sudo[44709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:51 np0005548788.localdomain python3[44711]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:51 np0005548788.localdomain sudo[44709]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:51 np0005548788.localdomain sudo[44725]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcntaijlkvnthqwnjweynkhqkbuvmnll ; /usr/bin/python3
Dec 06 08:09:51 np0005548788.localdomain sudo[44725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:52 np0005548788.localdomain python3[44727]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:52 np0005548788.localdomain sudo[44725]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:52 np0005548788.localdomain sudo[44741]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvrzhqvsipbgrepnsrydxlmkfdttplit ; /usr/bin/python3
Dec 06 08:09:52 np0005548788.localdomain sudo[44741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:52 np0005548788.localdomain python3[44743]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:52 np0005548788.localdomain sudo[44741]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:52 np0005548788.localdomain sudo[44757]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twvaazuhrrqwnvtkdzjvcqtjefvmpmwx ; /usr/bin/python3
Dec 06 08:09:52 np0005548788.localdomain sudo[44757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:52 np0005548788.localdomain python3[44759]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:52 np0005548788.localdomain sudo[44757]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:52 np0005548788.localdomain sudo[44773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhlackcurscjtomneaaqqhznwgiuglgd ; /usr/bin/python3
Dec 06 08:09:52 np0005548788.localdomain sudo[44773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:53 np0005548788.localdomain python3[44775]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Dec 06 08:09:53 np0005548788.localdomain groupadd[44776]: group added to /etc/group: name=qemu, GID=107
Dec 06 08:09:53 np0005548788.localdomain groupadd[44776]: group added to /etc/gshadow: name=qemu
Dec 06 08:09:53 np0005548788.localdomain groupadd[44776]: new group: name=qemu, GID=107
Dec 06 08:09:53 np0005548788.localdomain sudo[44773]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:53 np0005548788.localdomain sudo[44795]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aogpiwzqeeqvkldwjfblqiwflgommgzz ; /usr/bin/python3
Dec 06 08:09:53 np0005548788.localdomain sudo[44795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:53 np0005548788.localdomain python3[44797]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548788.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 06 08:09:53 np0005548788.localdomain useradd[44799]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Dec 06 08:09:53 np0005548788.localdomain sudo[44795]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:53 np0005548788.localdomain sudo[44819]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oaxnpmmdcdxvhfrfpvxutdohxscsokjg ; /usr/bin/python3
Dec 06 08:09:53 np0005548788.localdomain sudo[44819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:53 np0005548788.localdomain python3[44821]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Dec 06 08:09:53 np0005548788.localdomain sudo[44819]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:54 np0005548788.localdomain sudo[44835]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esbqfpwckeylasjzcmzbphdbhgykzwzi ; /usr/bin/python3
Dec 06 08:09:54 np0005548788.localdomain sudo[44835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:54 np0005548788.localdomain python3[44837]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:54 np0005548788.localdomain sudo[44835]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:54 np0005548788.localdomain sudo[44884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqzeymzhpaohwchrdumfkbjcbigvxahx ; /usr/bin/python3
Dec 06 08:09:54 np0005548788.localdomain sudo[44884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:54 np0005548788.localdomain python3[44886]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:54 np0005548788.localdomain sudo[44884]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:54 np0005548788.localdomain sudo[44927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odconnavgxdfedzipsxnmwqgoewkgwob ; /usr/bin/python3
Dec 06 08:09:54 np0005548788.localdomain sudo[44927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:55 np0005548788.localdomain python3[44929]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008594.4842017-77025-173346848240705/source _original_basename=tmp_hc4_p37 follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:55 np0005548788.localdomain sudo[44927]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:55 np0005548788.localdomain sudo[44957]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyetyyupnovqrutstobpiuyrylhpzlus ; /usr/bin/python3
Dec 06 08:09:55 np0005548788.localdomain sudo[44957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:55 np0005548788.localdomain python3[44959]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 06 08:09:56 np0005548788.localdomain sudo[44957]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:56 np0005548788.localdomain sudo[44979]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhbfdcgpwopcepxamyftpijwkkyrwncc ; /usr/bin/python3
Dec 06 08:09:56 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 06 08:09:56 np0005548788.localdomain sudo[44979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:56 np0005548788.localdomain python3[44981]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:56 np0005548788.localdomain sudo[44979]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:56 np0005548788.localdomain sudo[44995]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okaileyakpazozoscavakkccrolzgvdh ; /usr/bin/python3
Dec 06 08:09:56 np0005548788.localdomain sudo[44995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:56 np0005548788.localdomain python3[44997]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:56 np0005548788.localdomain sudo[44995]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:57 np0005548788.localdomain sudo[45011]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnwbrkpqiwxqwyqqjgslfdjkvwdvdemt ; /usr/bin/python3
Dec 06 08:09:57 np0005548788.localdomain sudo[45011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:57 np0005548788.localdomain python3[45013]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Dec 06 08:09:58 np0005548788.localdomain sudo[45011]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:58 np0005548788.localdomain sudo[45031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxudlulomfioyweyvowgcoqilooacedb ; /usr/bin/python3
Dec 06 08:09:58 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec 06 08:09:58 np0005548788.localdomain sudo[45031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:58 np0005548788.localdomain python3[45033]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:10:01 np0005548788.localdomain sudo[45031]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:01 np0005548788.localdomain sudo[45048]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unksdaegbqmkyqshccdluwtrzthtgwer ; /usr/bin/python3
Dec 06 08:10:01 np0005548788.localdomain sudo[45048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:01 np0005548788.localdomain python3[45050]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 08:10:02 np0005548788.localdomain sudo[45048]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:02 np0005548788.localdomain sudo[45109]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwukfuujltrbhqikscsnmdjpeohmggny ; /usr/bin/python3
Dec 06 08:10:02 np0005548788.localdomain sudo[45109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:02 np0005548788.localdomain python3[45111]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:02 np0005548788.localdomain sudo[45109]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:02 np0005548788.localdomain sudo[45125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwbrwfcscgctxtsfehppsjvbrolrnocc ; /usr/bin/python3
Dec 06 08:10:02 np0005548788.localdomain sudo[45125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:02 np0005548788.localdomain python3[45127]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:03 np0005548788.localdomain sudo[45125]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:03 np0005548788.localdomain sudo[45185]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuoobbkrhwiyysjanfxkppczbfnrkoei ; /usr/bin/python3
Dec 06 08:10:03 np0005548788.localdomain sudo[45185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:03 np0005548788.localdomain python3[45187]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:03 np0005548788.localdomain sudo[45185]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:03 np0005548788.localdomain sudo[45228]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncieexzlbfxvwqkoagzzzytfexpdsjlt ; /usr/bin/python3
Dec 06 08:10:03 np0005548788.localdomain sudo[45228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:03 np0005548788.localdomain python3[45230]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008603.229649-77385-29905890165780/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=c2b8964e582545a803ae226377b0c5420e9d143d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:04 np0005548788.localdomain sudo[45228]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:04 np0005548788.localdomain sudo[45290]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwwcvwfbnsxbuotdzzktzfrissbdyzqo ; /usr/bin/python3
Dec 06 08:10:04 np0005548788.localdomain sudo[45290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:04 np0005548788.localdomain python3[45292]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:04 np0005548788.localdomain sudo[45290]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:04 np0005548788.localdomain sudo[45335]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhkuizoeukkboacyjhdjgkskvlvfgann ; /usr/bin/python3
Dec 06 08:10:04 np0005548788.localdomain sudo[45335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:05 np0005548788.localdomain python3[45337]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008604.2013924-77433-187314846156052/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:05 np0005548788.localdomain sudo[45335]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:05 np0005548788.localdomain sudo[45365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rklxciaiaotmgymlhupbppyruagicyxb ; /usr/bin/python3
Dec 06 08:10:05 np0005548788.localdomain sudo[45365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:05 np0005548788.localdomain python3[45367]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:05 np0005548788.localdomain sudo[45365]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:05 np0005548788.localdomain sudo[45381]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvwnjosjqycvoawhkemjoloobipnhtuv ; /usr/bin/python3
Dec 06 08:10:05 np0005548788.localdomain sudo[45381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:05 np0005548788.localdomain python3[45383]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:05 np0005548788.localdomain sudo[45381]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:05 np0005548788.localdomain sudo[45397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrzlqgyhgeiunjtzkrmcizizbglxgaji ; /usr/bin/python3
Dec 06 08:10:05 np0005548788.localdomain sudo[45397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:05 np0005548788.localdomain python3[45399]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:05 np0005548788.localdomain sudo[45397]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:06 np0005548788.localdomain sudo[45413]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eymodgjllzmfrrxaynyrrszuriafahyp ; /usr/bin/python3
Dec 06 08:10:06 np0005548788.localdomain sudo[45413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:06 np0005548788.localdomain python3[45415]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:06 np0005548788.localdomain sudo[45413]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:06 np0005548788.localdomain sudo[45461]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trnaeigeqzpepdbprpyyxknirfrjiqtm ; /usr/bin/python3
Dec 06 08:10:06 np0005548788.localdomain sudo[45461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:06 np0005548788.localdomain python3[45463]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:06 np0005548788.localdomain sudo[45461]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:07 np0005548788.localdomain sudo[45504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nconwcnvyslmfyojxsdahxorcpdbglvw ; /usr/bin/python3
Dec 06 08:10:07 np0005548788.localdomain sudo[45504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:07 np0005548788.localdomain python3[45506]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008606.6819463-77654-82617123221464/source _original_basename=tmp1u6y0185 follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:07 np0005548788.localdomain sudo[45504]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:07 np0005548788.localdomain sudo[45534]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suubdttpoejfgmnndzrbztqllchwjexj ; /usr/bin/python3
Dec 06 08:10:07 np0005548788.localdomain sudo[45534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:07 np0005548788.localdomain python3[45536]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:07 np0005548788.localdomain sudo[45534]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:07 np0005548788.localdomain sudo[45550]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsufijeqzpiklonderybobdyfzebopth ; /usr/bin/python3
Dec 06 08:10:07 np0005548788.localdomain sudo[45550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:08 np0005548788.localdomain python3[45552]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:08 np0005548788.localdomain sudo[45550]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:08 np0005548788.localdomain sudo[45566]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azzpdncxdphthrfoeciautmvzxviqixn ; /usr/bin/python3
Dec 06 08:10:08 np0005548788.localdomain sudo[45566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:08 np0005548788.localdomain python3[45568]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:10:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:10:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3405 writes, 16K keys, 3405 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3405 writes, 206 syncs, 16.53 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3405 writes, 16K keys, 3405 commit groups, 1.0 writes per commit group, ingest: 15.32 MB, 0.03 MB/s
                                                          Interval WAL: 3405 writes, 206 syncs, 16.53 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:10:11 np0005548788.localdomain sudo[45566]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:12 np0005548788.localdomain sudo[45615]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpvwrsqdbfbyewvnvkpmnvuqbrrfbato ; /usr/bin/python3
Dec 06 08:10:12 np0005548788.localdomain sudo[45615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:12 np0005548788.localdomain python3[45617]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:12 np0005548788.localdomain sudo[45615]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:12 np0005548788.localdomain sudo[45660]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmxjrwsjzbtdatzcwegbxvtnmywahtly ; /usr/bin/python3
Dec 06 08:10:12 np0005548788.localdomain sudo[45660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:12 np0005548788.localdomain python3[45662]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008611.9657462-78013-114822037498560/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:12 np0005548788.localdomain sudo[45660]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:12 np0005548788.localdomain sudo[45691]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uccpahsckwtddyrmrgacwftvismepegh ; /usr/bin/python3
Dec 06 08:10:12 np0005548788.localdomain sudo[45691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:13 np0005548788.localdomain python3[45693]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 06 08:10:13 np0005548788.localdomain sshd[1130]: Received signal 15; terminating.
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: sshd.service: Consumed 5.930s CPU time, read 2.6M from disk, written 8.0K to disk.
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 06 08:10:13 np0005548788.localdomain sshd[45697]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:10:13 np0005548788.localdomain sshd[45697]: Server listening on 0.0.0.0 port 22.
Dec 06 08:10:13 np0005548788.localdomain sshd[45697]: Server listening on :: port 22.
Dec 06 08:10:13 np0005548788.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 06 08:10:13 np0005548788.localdomain sudo[45691]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:13 np0005548788.localdomain sudo[45711]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twrjfdhanirudckpcxdlfqmxxyewreom ; /usr/bin/python3
Dec 06 08:10:13 np0005548788.localdomain sudo[45711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:13 np0005548788.localdomain python3[45713]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:13 np0005548788.localdomain sudo[45711]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:10:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3247 writes, 16K keys, 3247 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3247 writes, 139 syncs, 23.36 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3247 writes, 16K keys, 3247 commit groups, 1.0 writes per commit group, ingest: 14.63 MB, 0.02 MB/s
                                                          Interval WAL: 3247 writes, 139 syncs, 23.36 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:10:14 np0005548788.localdomain sudo[45729]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kptpxwrrkhiklyzvvpsgwtxhbpwoovfw ; /usr/bin/python3
Dec 06 08:10:14 np0005548788.localdomain sudo[45729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:14 np0005548788.localdomain python3[45731]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:14 np0005548788.localdomain sudo[45729]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:14 np0005548788.localdomain sudo[45747]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyxomgqdqlemxlrzwgcotjrypapywqdo ; /usr/bin/python3
Dec 06 08:10:14 np0005548788.localdomain sudo[45747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:15 np0005548788.localdomain python3[45749]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:10:17 np0005548788.localdomain sudo[45747]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:18 np0005548788.localdomain sudo[45796]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbqmxjtlmedgnrjzpcvydgjldjgifsfm ; /usr/bin/python3
Dec 06 08:10:18 np0005548788.localdomain sudo[45796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:18 np0005548788.localdomain python3[45798]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:18 np0005548788.localdomain sudo[45796]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:18 np0005548788.localdomain sudo[45814]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptftnkqugvildnwlrtpvvgphbfegjqxx ; /usr/bin/python3
Dec 06 08:10:18 np0005548788.localdomain sudo[45814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:19 np0005548788.localdomain python3[45816]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:19 np0005548788.localdomain sudo[45814]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:19 np0005548788.localdomain sudo[45844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnhporcorkizvrcjwrexkgsrbjlmzczr ; /usr/bin/python3
Dec 06 08:10:19 np0005548788.localdomain sudo[45844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:19 np0005548788.localdomain python3[45846]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:10:20 np0005548788.localdomain sudo[45844]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:21 np0005548788.localdomain sudo[45894]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ieudimijzemmvfyqfvsystvgcxqzdfrw ; /usr/bin/python3
Dec 06 08:10:21 np0005548788.localdomain sudo[45894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:21 np0005548788.localdomain python3[45896]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:21 np0005548788.localdomain sudo[45894]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:21 np0005548788.localdomain sudo[45912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zssoqoatcemrnbcpdbfhzdvnpknpdkoq ; /usr/bin/python3
Dec 06 08:10:21 np0005548788.localdomain sudo[45912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:21 np0005548788.localdomain python3[45914]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:21 np0005548788.localdomain sudo[45912]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:21 np0005548788.localdomain sudo[45942]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvtryipvuaiebfcxygwjdfajkbexjskd ; /usr/bin/python3
Dec 06 08:10:21 np0005548788.localdomain sudo[45942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:22 np0005548788.localdomain python3[45944]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:10:22 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:10:22 np0005548788.localdomain systemd-rc-local-generator[45969]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:10:22 np0005548788.localdomain systemd-sysv-generator[45974]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:10:22 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:10:22 np0005548788.localdomain systemd[1]: Starting chronyd online sources service...
Dec 06 08:10:22 np0005548788.localdomain chronyc[45983]: 200 OK
Dec 06 08:10:22 np0005548788.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Dec 06 08:10:22 np0005548788.localdomain systemd[1]: Finished chronyd online sources service.
Dec 06 08:10:22 np0005548788.localdomain sudo[45942]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:22 np0005548788.localdomain sudo[45997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moiysrsjyppfrsbtyltwqwemgcwoerbo ; /usr/bin/python3
Dec 06 08:10:22 np0005548788.localdomain sudo[45997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:23 np0005548788.localdomain python3[45999]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:23 np0005548788.localdomain chronyd[25948]: System clock was stepped by -0.000154 seconds
Dec 06 08:10:23 np0005548788.localdomain sudo[45997]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:23 np0005548788.localdomain sudo[46014]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywghcndwbwswupjclrhoufwhoybfbbrs ; /usr/bin/python3
Dec 06 08:10:23 np0005548788.localdomain sudo[46014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:23 np0005548788.localdomain python3[46016]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:23 np0005548788.localdomain sudo[46014]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:23 np0005548788.localdomain sudo[46031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pribhrhwkhgxgxsxdfocxdzcsligpbis ; /usr/bin/python3
Dec 06 08:10:23 np0005548788.localdomain sudo[46031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:23 np0005548788.localdomain python3[46033]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:23 np0005548788.localdomain chronyd[25948]: System clock was stepped by -0.000000 seconds
Dec 06 08:10:23 np0005548788.localdomain sudo[46031]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:23 np0005548788.localdomain sudo[46048]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mefpnovbowikzsdyfxsnlpodxlydlazg ; /usr/bin/python3
Dec 06 08:10:23 np0005548788.localdomain sudo[46048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:24 np0005548788.localdomain python3[46050]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:24 np0005548788.localdomain sudo[46048]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:24 np0005548788.localdomain sudo[46065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycbnwbvkplljvipoyauaqlbbvqcwzksv ; /usr/bin/python3
Dec 06 08:10:24 np0005548788.localdomain sudo[46065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:24 np0005548788.localdomain python3[46067]: ansible-timezone Invoked with name=UTC hwclock=None
Dec 06 08:10:24 np0005548788.localdomain systemd[1]: Starting Time & Date Service...
Dec 06 08:10:24 np0005548788.localdomain systemd[1]: Started Time & Date Service.
Dec 06 08:10:24 np0005548788.localdomain sudo[46065]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:25 np0005548788.localdomain sudo[46085]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olmfkaivdanrzrvxidwlsnbaearzdrjy ; /usr/bin/python3
Dec 06 08:10:25 np0005548788.localdomain sudo[46085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:25 np0005548788.localdomain python3[46087]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:25 np0005548788.localdomain sudo[46085]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:25 np0005548788.localdomain sudo[46102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdipcqskmjdwdqgeomwsguzllewqpcxe ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 06 08:10:25 np0005548788.localdomain sudo[46102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:26 np0005548788.localdomain python3[46104]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:26 np0005548788.localdomain sudo[46102]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:26 np0005548788.localdomain sudo[46119]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlgbnaveqczkygqyvvnixgnjbtkmpvii ; /usr/bin/python3
Dec 06 08:10:26 np0005548788.localdomain sudo[46119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:26 np0005548788.localdomain python3[46121]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Dec 06 08:10:26 np0005548788.localdomain sudo[46119]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:26 np0005548788.localdomain sudo[46135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvchsqfkcxbmtcygqwnpntbnshknjxun ; /usr/bin/python3
Dec 06 08:10:26 np0005548788.localdomain sudo[46135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:27 np0005548788.localdomain python3[46137]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:10:27 np0005548788.localdomain sudo[46135]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:27 np0005548788.localdomain sudo[46151]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npwwxewoswwszauuazxhewqcwbycttwb ; /usr/bin/python3
Dec 06 08:10:27 np0005548788.localdomain sudo[46151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:27 np0005548788.localdomain python3[46153]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:27 np0005548788.localdomain sudo[46151]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:27 np0005548788.localdomain sudo[46167]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbqcvzilndxwtxmrfqwpfihhlrpgohrx ; /usr/bin/python3
Dec 06 08:10:27 np0005548788.localdomain sudo[46167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:27 np0005548788.localdomain python3[46169]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:27 np0005548788.localdomain sudo[46167]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:28 np0005548788.localdomain sudo[46215]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqtbsktpdctqjnpfnecmscpijafvyypc ; /usr/bin/python3
Dec 06 08:10:28 np0005548788.localdomain sudo[46215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:28 np0005548788.localdomain python3[46217]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:28 np0005548788.localdomain sudo[46215]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:28 np0005548788.localdomain sudo[46258]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rssfqdbpmfswzazvkwuxyjumrzjhzdau ; /usr/bin/python3
Dec 06 08:10:28 np0005548788.localdomain sudo[46258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:28 np0005548788.localdomain python3[46260]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008628.1807432-78886-18880145077767/source _original_basename=tmpgcqioow4 follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:28 np0005548788.localdomain sudo[46258]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:29 np0005548788.localdomain sudo[46320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qopctczohtrgsuvjrzzjykbzslcyoydh ; /usr/bin/python3
Dec 06 08:10:29 np0005548788.localdomain sudo[46320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:29 np0005548788.localdomain python3[46322]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:29 np0005548788.localdomain sudo[46320]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:29 np0005548788.localdomain sudo[46363]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egpeigeehdowufpfzhptoozwlyvzvazk ; /usr/bin/python3
Dec 06 08:10:29 np0005548788.localdomain sudo[46363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:29 np0005548788.localdomain python3[46365]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008629.0655787-78941-70518153412819/source _original_basename=tmp7oa9gro8 follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:29 np0005548788.localdomain sudo[46363]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:30 np0005548788.localdomain sudo[46393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwwpkwsgqynrtaloiwiakqytbryuzkog ; /usr/bin/python3
Dec 06 08:10:30 np0005548788.localdomain sudo[46393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:30 np0005548788.localdomain python3[46395]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 08:10:31 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:10:31 np0005548788.localdomain systemd-rc-local-generator[46423]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:10:31 np0005548788.localdomain systemd-sysv-generator[46426]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:10:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:10:31 np0005548788.localdomain sudo[46393]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:31 np0005548788.localdomain sudo[46448]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okcluslcftddtynninmcvdolnfaovrmw ; /usr/bin/python3
Dec 06 08:10:31 np0005548788.localdomain sudo[46448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:31 np0005548788.localdomain python3[46450]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:31 np0005548788.localdomain sudo[46448]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:32 np0005548788.localdomain sudo[46464]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvhhxsazrdmpmekrqonsoydyzrkenyhj ; /usr/bin/python3
Dec 06 08:10:32 np0005548788.localdomain sudo[46464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:32 np0005548788.localdomain python3[46466]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:32 np0005548788.localdomain systemd[36159]: Created slice User Background Tasks Slice.
Dec 06 08:10:32 np0005548788.localdomain systemd[36159]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 08:10:32 np0005548788.localdomain sudo[46464]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:32 np0005548788.localdomain systemd[36159]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 08:10:32 np0005548788.localdomain sudo[46482]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnvevlmxxhwuztstzovhbjfaxiiptasv ; /usr/bin/python3
Dec 06 08:10:32 np0005548788.localdomain sudo[46482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:32 np0005548788.localdomain python3[46484]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:32 np0005548788.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Dec 06 08:10:32 np0005548788.localdomain sudo[46482]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:32 np0005548788.localdomain sudo[46499]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdhvavbyvnzyndskwxgbyaukmctnjufz ; /usr/bin/python3
Dec 06 08:10:32 np0005548788.localdomain sudo[46499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:33 np0005548788.localdomain python3[46501]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:33 np0005548788.localdomain sudo[46499]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:33 np0005548788.localdomain sudo[46515]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tilxmkgdmqdjkaojbvvpoqtziyvdkial ; /usr/bin/python3
Dec 06 08:10:33 np0005548788.localdomain sudo[46515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:33 np0005548788.localdomain python3[46517]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:33 np0005548788.localdomain sudo[46515]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:33 np0005548788.localdomain sudo[46563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uorhvjbvprugsebbkzflkswryqbowatc ; /usr/bin/python3
Dec 06 08:10:33 np0005548788.localdomain sudo[46563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:34 np0005548788.localdomain python3[46565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:34 np0005548788.localdomain sudo[46563]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:34 np0005548788.localdomain sudo[46606]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnqlmqcgqgrzqqintbxlbglhxjdwumvg ; /usr/bin/python3
Dec 06 08:10:34 np0005548788.localdomain sudo[46606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:34 np0005548788.localdomain python3[46608]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008633.5937288-79216-103208554128722/source _original_basename=tmpmye3xq7y follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:34 np0005548788.localdomain sudo[46606]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:46 np0005548788.localdomain sshd[46623]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:10:47 np0005548788.localdomain sudo[46625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:10:47 np0005548788.localdomain sudo[46625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:10:47 np0005548788.localdomain sudo[46625]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:47 np0005548788.localdomain sudo[46640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:10:47 np0005548788.localdomain sudo[46640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:10:48 np0005548788.localdomain sudo[46640]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:49 np0005548788.localdomain sudo[46687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:10:49 np0005548788.localdomain sudo[46687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:10:49 np0005548788.localdomain sudo[46687]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:53 np0005548788.localdomain sshd[46623]: Received disconnect from 45.78.222.109 port 44536:11: Bye Bye [preauth]
Dec 06 08:10:53 np0005548788.localdomain sshd[46623]: Disconnected from authenticating user root 45.78.222.109 port 44536 [preauth]
Dec 06 08:10:54 np0005548788.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 08:10:57 np0005548788.localdomain sudo[46717]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqiznizgttqfothlelhaptvuhvoapidv ; /usr/bin/python3
Dec 06 08:10:57 np0005548788.localdomain sudo[46717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:57 np0005548788.localdomain python3[46719]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:10:57 np0005548788.localdomain sudo[46717]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:57 np0005548788.localdomain sudo[46733]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yryztlglhuwngtfzufnbykaoowmfllsm ; /usr/bin/python3
Dec 06 08:10:57 np0005548788.localdomain sudo[46733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:58 np0005548788.localdomain python3[46735]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Dec 06 08:10:58 np0005548788.localdomain sudo[46733]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:58 np0005548788.localdomain sudo[46749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgejiglilrakvsiiuhptjolxmwmppihf ; /usr/bin/python3
Dec 06 08:10:58 np0005548788.localdomain sudo[46749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:58 np0005548788.localdomain python3[46751]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:10:58 np0005548788.localdomain sudo[46749]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:58 np0005548788.localdomain sudo[46765]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cllggbrbreiefftvybhjsamhmvpxwzdk ; /usr/bin/python3
Dec 06 08:10:58 np0005548788.localdomain sudo[46765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:58 np0005548788.localdomain python3[46767]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:58 np0005548788.localdomain sudo[46765]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:58 np0005548788.localdomain sudo[46781]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzdfeliyxbvrxheevqzmluwbmdnkjwyv ; /usr/bin/python3
Dec 06 08:10:58 np0005548788.localdomain sudo[46781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:59 np0005548788.localdomain python3[46783]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:59 np0005548788.localdomain sudo[46781]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:59 np0005548788.localdomain sudo[46797]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cikmqhpkiacqpuuterdgdctenpiqhthx ; /usr/bin/python3
Dec 06 08:10:59 np0005548788.localdomain sudo[46797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:59 np0005548788.localdomain python3[46799]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 06 08:11:00 np0005548788.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Dec 06 08:11:00 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:11:00 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:11:00 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:11:00 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:11:00 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:11:00 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:11:00 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:11:00 np0005548788.localdomain sudo[46797]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:00 np0005548788.localdomain sudo[46819]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfngubopkbdhrvcrgiedbvoisfudgvol ; /usr/bin/python3
Dec 06 08:11:00 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 06 08:11:00 np0005548788.localdomain sudo[46819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:00 np0005548788.localdomain python3[46821]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:11:00 np0005548788.localdomain sudo[46819]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:01 np0005548788.localdomain sudo[46835]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkoqvjcdwdivwduwlvvtuvwwrsedlvvt ; /usr/bin/python3
Dec 06 08:11:01 np0005548788.localdomain sudo[46835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:01 np0005548788.localdomain sudo[46835]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:01 np0005548788.localdomain sudo[46883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kssnbxvluqorcbwdfwroqefojqtrrvit ; /usr/bin/python3
Dec 06 08:11:01 np0005548788.localdomain sudo[46883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:01 np0005548788.localdomain sudo[46883]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:02 np0005548788.localdomain sudo[46926]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rplnzrguyzbcgiemyzfbecorfdjukwew ; /usr/bin/python3
Dec 06 08:11:02 np0005548788.localdomain sudo[46926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:02 np0005548788.localdomain sudo[46926]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:02 np0005548788.localdomain sudo[46956]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjdhdpefpbxnmdkabhounmufyxotvqlc ; /usr/bin/python3
Dec 06 08:11:02 np0005548788.localdomain sudo[46956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:03 np0005548788.localdomain python3[46958]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Dec 06 08:11:03 np0005548788.localdomain sudo[46956]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:03 np0005548788.localdomain rsyslogd[760]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Dec 06 08:11:03 np0005548788.localdomain sudo[46972]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjzqfmeczlqulhgujissiqepnfsrymbz ; /usr/bin/python3
Dec 06 08:11:03 np0005548788.localdomain sudo[46972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:03 np0005548788.localdomain python3[46974]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:11:03 np0005548788.localdomain sudo[46972]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:03 np0005548788.localdomain sudo[46988]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgwhykczmgbdecrswkbnbvqvbbvthwbr ; /usr/bin/python3
Dec 06 08:11:03 np0005548788.localdomain sudo[46988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:03 np0005548788.localdomain python3[46990]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:11:03 np0005548788.localdomain sudo[46988]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:04 np0005548788.localdomain sudo[47004]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmakkshmdjwypxzvthbsuyjhlukxrevd ; /usr/bin/python3
Dec 06 08:11:04 np0005548788.localdomain sudo[47004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:04 np0005548788.localdomain python3[47006]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Dec 06 08:11:04 np0005548788.localdomain sudo[47004]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:09 np0005548788.localdomain sudo[47052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdztvlxpxeonvcjulyuutyngyxkvepto ; /usr/bin/python3
Dec 06 08:11:09 np0005548788.localdomain sudo[47052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:09 np0005548788.localdomain python3[47054]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:11:09 np0005548788.localdomain sudo[47052]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:10 np0005548788.localdomain sudo[47095]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obadmjqpgopjbkphxhihpkxwknfuatcw ; /usr/bin/python3
Dec 06 08:11:10 np0005548788.localdomain sudo[47095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:10 np0005548788.localdomain python3[47097]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008669.4450257-80756-49502875429266/source _original_basename=tmpwx2uyfqw follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:11:10 np0005548788.localdomain sudo[47095]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:10 np0005548788.localdomain sudo[47125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tveibwifsgehmtobqjjjfqpvghappioz ; /usr/bin/python3
Dec 06 08:11:10 np0005548788.localdomain sudo[47125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:10 np0005548788.localdomain python3[47127]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:11:10 np0005548788.localdomain sudo[47125]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:11 np0005548788.localdomain sudo[47175]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqcsxflxzxbynhnbfnjfgtjauxtfldsz ; /usr/bin/python3
Dec 06 08:11:11 np0005548788.localdomain sudo[47175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:11 np0005548788.localdomain sudo[47175]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:11 np0005548788.localdomain sudo[47218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irbspovoeudlwnvtqfvnhwbqlmyvjsdb ; /usr/bin/python3
Dec 06 08:11:11 np0005548788.localdomain sudo[47218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:12 np0005548788.localdomain sudo[47218]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:12 np0005548788.localdomain sudo[47248]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpuwmphyipsanvqvwqrguknjfgfwzmpk ; /usr/bin/python3
Dec 06 08:11:12 np0005548788.localdomain sudo[47248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:12 np0005548788.localdomain python3[47250]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:11:12 np0005548788.localdomain sudo[47248]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:13 np0005548788.localdomain sudo[47296]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrpmcoaihjiizfmjtxtlxgddmzvokrhw ; /usr/bin/python3
Dec 06 08:11:13 np0005548788.localdomain sudo[47296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:13 np0005548788.localdomain sudo[47296]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:13 np0005548788.localdomain sudo[47339]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwparsjqnnngbxtwwplzdzmvfeoxdwfj ; /usr/bin/python3
Dec 06 08:11:13 np0005548788.localdomain sudo[47339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:13 np0005548788.localdomain sudo[47339]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:14 np0005548788.localdomain sudo[47369]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-messdlafregpauafipbtfvpimidaqjqj ; /usr/bin/python3
Dec 06 08:11:14 np0005548788.localdomain sudo[47369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:14 np0005548788.localdomain python3[47371]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:11:14 np0005548788.localdomain sudo[47369]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:16 np0005548788.localdomain sudo[47385]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdovqquyrakebeetqvgagzfgsskvmios ; /usr/bin/python3
Dec 06 08:11:16 np0005548788.localdomain sudo[47385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:17 np0005548788.localdomain python3[47387]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:11:17 np0005548788.localdomain sudo[47385]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:17 np0005548788.localdomain sudo[47402]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grmzhxpyixrvzszlfqchlmiqodmxepvo ; /usr/bin/python3
Dec 06 08:11:17 np0005548788.localdomain sudo[47402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:18 np0005548788.localdomain python3[47404]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:11:21 np0005548788.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:11:21 np0005548788.localdomain dbus-broker-launch[18432]: Noticed file-system modification, trigger reload.
Dec 06 08:11:21 np0005548788.localdomain dbus-broker-launch[18432]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 06 08:11:21 np0005548788.localdomain dbus-broker-launch[18432]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 06 08:11:21 np0005548788.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:11:22 np0005548788.localdomain systemd[1]: Reexecuting.
Dec 06 08:11:22 np0005548788.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 06 08:11:22 np0005548788.localdomain systemd[1]: Detected virtualization kvm.
Dec 06 08:11:22 np0005548788.localdomain systemd[1]: Detected architecture x86-64.
Dec 06 08:11:22 np0005548788.localdomain systemd-sysv-generator[47461]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:11:22 np0005548788.localdomain systemd-rc-local-generator[47457]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:11:22 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:11:30 np0005548788.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Dec 06 08:11:31 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:11:31 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:11:31 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:11:31 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:11:31 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:11:31 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:11:31 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:11:31 np0005548788.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:11:31 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 06 08:11:31 np0005548788.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:11:32 np0005548788.localdomain systemd-rc-local-generator[47573]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:11:32 np0005548788.localdomain systemd-sysv-generator[47576]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Stopping Journal Service...
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 06 08:11:32 np0005548788.localdomain systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:11:32 np0005548788.localdomain systemd-journald[618]: Journal stopped
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Stopped Journal Service.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: systemd-journald.service: Consumed 1.959s CPU time.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Starting Journal Service...
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: systemd-udevd.service: Consumed 3.122s CPU time.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 06 08:11:32 np0005548788.localdomain systemd-journald[47853]: Journal started
Dec 06 08:11:32 np0005548788.localdomain systemd-journald[47853]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 12.3M, max 314.7M, 302.3M free.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Started Journal Service.
Dec 06 08:11:32 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Dec 06 08:11:32 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 08:11:32 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:11:32 np0005548788.localdomain systemd-udevd[47856]: Using default interface naming scheme 'rhel-9.0'.
Dec 06 08:11:32 np0005548788.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 06 08:11:32 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:11:33 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:11:33 np0005548788.localdomain systemd-rc-local-generator[48459]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:11:33 np0005548788.localdomain systemd-sysv-generator[48462]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:11:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:11:33 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:11:33 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:11:33 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:11:33 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.392s CPU time.
Dec 06 08:11:33 np0005548788.localdomain systemd[1]: run-r1317c39a02814423bac382faa8e45e84.service: Deactivated successfully.
Dec 06 08:11:33 np0005548788.localdomain systemd[1]: run-ra99d201e7d2947ff80577acf46ec2966.service: Deactivated successfully.
Dec 06 08:11:34 np0005548788.localdomain sudo[47402]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:35 np0005548788.localdomain sudo[48892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-saanfpcwcttcjiuuebrenbdyxjuhdrgd ; /usr/bin/python3
Dec 06 08:11:35 np0005548788.localdomain sudo[48892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:35 np0005548788.localdomain python3[48894]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Dec 06 08:11:35 np0005548788.localdomain sudo[48892]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:35 np0005548788.localdomain sudo[48911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sezuoqmdyokveqyskmgtqygeocctfnxw ; /usr/bin/python3
Dec 06 08:11:35 np0005548788.localdomain sudo[48911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:35 np0005548788.localdomain python3[48913]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:11:35 np0005548788.localdomain sudo[48911]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:36 np0005548788.localdomain sudo[48929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqeuurjonogtkkcrlzluouisibqxsxox ; /usr/bin/python3
Dec 06 08:11:36 np0005548788.localdomain sudo[48929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:36 np0005548788.localdomain python3[48931]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:11:36 np0005548788.localdomain python3[48931]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Dec 06 08:11:36 np0005548788.localdomain python3[48931]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Dec 06 08:11:43 np0005548788.localdomain podman[48943]: 2025-12-06 08:11:36.881758666 +0000 UTC m=+0.026505869 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:11:43 np0005548788.localdomain python3[48931]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Dec 06 08:11:43 np0005548788.localdomain sudo[48929]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:44 np0005548788.localdomain sudo[49042]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niuihxbvanizacyrsibxatuvixlfiuxj ; /usr/bin/python3
Dec 06 08:11:44 np0005548788.localdomain sudo[49042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:44 np0005548788.localdomain python3[49044]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:11:44 np0005548788.localdomain python3[49044]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Dec 06 08:11:44 np0005548788.localdomain python3[49044]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Dec 06 08:11:50 np0005548788.localdomain sudo[49119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:11:50 np0005548788.localdomain sudo[49119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:50 np0005548788.localdomain sudo[49119]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:50 np0005548788.localdomain sudo[49134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:11:50 np0005548788.localdomain sudo[49134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:51 np0005548788.localdomain podman[49056]: 2025-12-06 08:11:44.269092971 +0000 UTC m=+0.048087425 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:11:51 np0005548788.localdomain python3[49044]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Dec 06 08:11:51 np0005548788.localdomain sudo[49042]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:51 np0005548788.localdomain sudo[49254]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxmhrexpavrzcwxeunbpkurvbpqoeeuz ; /usr/bin/python3
Dec 06 08:11:51 np0005548788.localdomain sudo[49254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:51 np0005548788.localdomain podman[49256]: 2025-12-06 08:11:51.418263398 +0000 UTC m=+0.083492172 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, io.openshift.expose-services=)
Dec 06 08:11:51 np0005548788.localdomain python3[49258]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:11:51 np0005548788.localdomain python3[49258]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Dec 06 08:11:51 np0005548788.localdomain podman[49256]: 2025-12-06 08:11:51.523629444 +0000 UTC m=+0.188858278 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Dec 06 08:11:51 np0005548788.localdomain python3[49258]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Dec 06 08:11:51 np0005548788.localdomain sudo[49134]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:51 np0005548788.localdomain sudo[49348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:11:51 np0005548788.localdomain sudo[49348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:51 np0005548788.localdomain sudo[49348]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:51 np0005548788.localdomain sudo[49363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:11:51 np0005548788.localdomain sudo[49363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:52 np0005548788.localdomain sudo[49363]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:52 np0005548788.localdomain sshd[49435]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:11:55 np0005548788.localdomain sshd[49435]: Received disconnect from 102.140.97.134 port 33414:11: Bye Bye [preauth]
Dec 06 08:11:55 np0005548788.localdomain sshd[49435]: Disconnected from authenticating user root 102.140.97.134 port 33414 [preauth]
Dec 06 08:11:59 np0005548788.localdomain sudo[49706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:11:59 np0005548788.localdomain sudo[49706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:59 np0005548788.localdomain sudo[49706]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:08 np0005548788.localdomain podman[49300]: 2025-12-06 08:11:51.607485576 +0000 UTC m=+0.042352215 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:12:08 np0005548788.localdomain python3[49258]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Dec 06 08:12:08 np0005548788.localdomain sudo[49254]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:08 np0005548788.localdomain sudo[50016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrkwxyqqwcjoqafzsmbpxmppeckccikt ; /usr/bin/python3
Dec 06 08:12:08 np0005548788.localdomain sudo[50016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:08 np0005548788.localdomain python3[50018]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:08 np0005548788.localdomain python3[50018]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Dec 06 08:12:08 np0005548788.localdomain python3[50018]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Dec 06 08:12:22 np0005548788.localdomain podman[50030]: 2025-12-06 08:12:08.887407437 +0000 UTC m=+0.046188706 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:12:22 np0005548788.localdomain python3[50018]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Dec 06 08:12:22 np0005548788.localdomain sudo[50016]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:22 np0005548788.localdomain sudo[50110]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whcgyiddjwwskvlvoqxhhfgkdkkiryed ; /usr/bin/python3
Dec 06 08:12:22 np0005548788.localdomain sudo[50110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:22 np0005548788.localdomain python3[50112]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:22 np0005548788.localdomain python3[50112]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Dec 06 08:12:23 np0005548788.localdomain python3[50112]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Dec 06 08:12:31 np0005548788.localdomain podman[50125]: 2025-12-06 08:12:23.075357431 +0000 UTC m=+0.044067979 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 06 08:12:31 np0005548788.localdomain python3[50112]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Dec 06 08:12:32 np0005548788.localdomain sudo[50110]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:32 np0005548788.localdomain sudo[50361]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilnvfmhimgcwpvrdbwaegiappxayoazm ; /usr/bin/python3
Dec 06 08:12:32 np0005548788.localdomain sudo[50361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:32 np0005548788.localdomain python3[50363]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:32 np0005548788.localdomain python3[50363]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Dec 06 08:12:32 np0005548788.localdomain python3[50363]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Dec 06 08:12:36 np0005548788.localdomain podman[50376]: 2025-12-06 08:12:32.510498855 +0000 UTC m=+0.047403350 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:12:36 np0005548788.localdomain python3[50363]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Dec 06 08:12:36 np0005548788.localdomain sudo[50361]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:37 np0005548788.localdomain sudo[50452]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fchypxwwhnlobyhtdxowlijacualkgxf ; /usr/bin/python3
Dec 06 08:12:37 np0005548788.localdomain sudo[50452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:37 np0005548788.localdomain python3[50454]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:37 np0005548788.localdomain python3[50454]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Dec 06 08:12:37 np0005548788.localdomain python3[50454]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Dec 06 08:12:39 np0005548788.localdomain podman[50466]: 2025-12-06 08:12:37.317588099 +0000 UTC m=+0.045322316 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:12:39 np0005548788.localdomain python3[50454]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Dec 06 08:12:39 np0005548788.localdomain sudo[50452]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:39 np0005548788.localdomain sudo[50541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypouftnmpjtzvrcwanaplrieshkyotku ; /usr/bin/python3
Dec 06 08:12:39 np0005548788.localdomain sudo[50541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:39 np0005548788.localdomain python3[50543]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:39 np0005548788.localdomain python3[50543]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Dec 06 08:12:39 np0005548788.localdomain python3[50543]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Dec 06 08:12:41 np0005548788.localdomain podman[50556]: 2025-12-06 08:12:39.966676192 +0000 UTC m=+0.039639083 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:12:41 np0005548788.localdomain python3[50543]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json
Dec 06 08:12:42 np0005548788.localdomain sudo[50541]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:42 np0005548788.localdomain sudo[50631]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhrysgttdvdnoxcqcuqpcjsehtivejlq ; /usr/bin/python3
Dec 06 08:12:42 np0005548788.localdomain sudo[50631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:42 np0005548788.localdomain python3[50633]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:42 np0005548788.localdomain python3[50633]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Dec 06 08:12:42 np0005548788.localdomain python3[50633]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Dec 06 08:12:45 np0005548788.localdomain podman[50645]: 2025-12-06 08:12:42.491436854 +0000 UTC m=+0.043290101 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 06 08:12:45 np0005548788.localdomain python3[50633]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json
Dec 06 08:12:45 np0005548788.localdomain sudo[50631]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:45 np0005548788.localdomain sudo[50719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfqhkqobvrehuibjlfybwbrzcrhjbhll ; /usr/bin/python3
Dec 06 08:12:45 np0005548788.localdomain sudo[50719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:45 np0005548788.localdomain python3[50721]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:45 np0005548788.localdomain python3[50721]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Dec 06 08:12:45 np0005548788.localdomain python3[50721]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Dec 06 08:12:48 np0005548788.localdomain podman[50734]: 2025-12-06 08:12:45.552513394 +0000 UTC m=+0.045189583 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:12:48 np0005548788.localdomain python3[50721]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json
Dec 06 08:12:49 np0005548788.localdomain sudo[50719]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:49 np0005548788.localdomain sudo[50821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvovwsqkgkytksrgedpwzxoxylcxncuu ; /usr/bin/python3
Dec 06 08:12:49 np0005548788.localdomain sudo[50821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:49 np0005548788.localdomain python3[50823]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:49 np0005548788.localdomain python3[50823]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Dec 06 08:12:49 np0005548788.localdomain python3[50823]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Dec 06 08:12:51 np0005548788.localdomain podman[50835]: 2025-12-06 08:12:49.44764084 +0000 UTC m=+0.046232195 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:12:51 np0005548788.localdomain python3[50823]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json
Dec 06 08:12:51 np0005548788.localdomain sudo[50821]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:52 np0005548788.localdomain sudo[50910]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzzxojbnekwgbpksfzjuxejqlydhfwul ; /usr/bin/python3
Dec 06 08:12:52 np0005548788.localdomain sudo[50910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:52 np0005548788.localdomain python3[50912]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:12:52 np0005548788.localdomain sudo[50910]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:52 np0005548788.localdomain sudo[50960]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kanlfpoyzldfkhqvxpgmkexalapbycgz ; /usr/bin/python3
Dec 06 08:12:52 np0005548788.localdomain sudo[50960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:52 np0005548788.localdomain sudo[50960]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:52 np0005548788.localdomain sudo[50978]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmomjlodbkdgsjtckmupvjnrueopleas ; /usr/bin/python3
Dec 06 08:12:52 np0005548788.localdomain sudo[50978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:53 np0005548788.localdomain sudo[50978]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:53 np0005548788.localdomain sudo[51082]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-recnqtvduazaslkfqttbchqjpctxlube ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008773.2838285-83555-260612820270756/async_wrapper.py 857792411474 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008773.2838285-83555-260612820270756/AnsiballZ_command.py _
Dec 06 08:12:53 np0005548788.localdomain sudo[51082]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:12:53 np0005548788.localdomain ansible-async_wrapper.py[51084]: Invoked with 857792411474 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008773.2838285-83555-260612820270756/AnsiballZ_command.py _
Dec 06 08:12:53 np0005548788.localdomain ansible-async_wrapper.py[51087]: Starting module and watcher
Dec 06 08:12:53 np0005548788.localdomain ansible-async_wrapper.py[51087]: Start watching 51088 (3600)
Dec 06 08:12:53 np0005548788.localdomain ansible-async_wrapper.py[51088]: Start module (51088)
Dec 06 08:12:53 np0005548788.localdomain ansible-async_wrapper.py[51084]: Return async_wrapper task started.
Dec 06 08:12:53 np0005548788.localdomain sudo[51082]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:54 np0005548788.localdomain sudo[51103]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oevsicermberszleteptpmuecujajzwz ; /usr/bin/python3
Dec 06 08:12:54 np0005548788.localdomain sudo[51103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:54 np0005548788.localdomain python3[51108]: ansible-ansible.legacy.async_status Invoked with jid=857792411474.51084 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:12:54 np0005548788.localdomain sudo[51103]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:    (file & line not available)
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:    (file & line not available)
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.12 seconds
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Notice: Applied catalog in 0.05 seconds
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Application:
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:    Initial environment: production
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:    Converged environment: production
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:          Run mode: user
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Changes:
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:             Total: 3
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Events:
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:           Success: 3
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:             Total: 3
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Resources:
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:           Changed: 3
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:       Out of sync: 3
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:             Total: 10
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Time:
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:          Schedule: 0.00
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:              File: 0.00
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:              Exec: 0.01
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:            Augeas: 0.02
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:    Transaction evaluation: 0.05
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:    Catalog application: 0.05
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:    Config retrieval: 0.15
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:          Last run: 1765008777
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:        Filebucket: 0.00
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:             Total: 0.05
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]: Version:
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:            Config: 1765008777
Dec 06 08:12:57 np0005548788.localdomain puppet-user[51107]:            Puppet: 7.10.0
Dec 06 08:12:57 np0005548788.localdomain ansible-async_wrapper.py[51088]: Module complete (51088)
Dec 06 08:12:58 np0005548788.localdomain ansible-async_wrapper.py[51087]: Done in kid B.
Dec 06 08:13:00 np0005548788.localdomain sudo[51221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:13:00 np0005548788.localdomain sudo[51221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:13:00 np0005548788.localdomain sudo[51221]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:00 np0005548788.localdomain sudo[51236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:13:00 np0005548788.localdomain sudo[51236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:13:00 np0005548788.localdomain sudo[51236]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:01 np0005548788.localdomain sudo[51283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:13:01 np0005548788.localdomain sudo[51283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:13:01 np0005548788.localdomain sudo[51283]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:04 np0005548788.localdomain sudo[51311]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hofayspwhyzwdcerbvtrbzkbbewfjavi ; /usr/bin/python3
Dec 06 08:13:04 np0005548788.localdomain sudo[51311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:04 np0005548788.localdomain python3[51313]: ansible-ansible.legacy.async_status Invoked with jid=857792411474.51084 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:13:04 np0005548788.localdomain sudo[51311]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:04 np0005548788.localdomain sudo[51327]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdujrthaojcqtfaxixjzhvhwqrjjhaer ; /usr/bin/python3
Dec 06 08:13:04 np0005548788.localdomain sudo[51327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:05 np0005548788.localdomain python3[51329]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:13:05 np0005548788.localdomain sudo[51327]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:05 np0005548788.localdomain sudo[51343]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgcxbqfmqagqvoxpmaqapqkqizvojlwy ; /usr/bin/python3
Dec 06 08:13:05 np0005548788.localdomain sudo[51343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:05 np0005548788.localdomain python3[51345]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:13:05 np0005548788.localdomain sudo[51343]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:05 np0005548788.localdomain sudo[51391]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdvjbdpwzahkoaaoksjdqatotelyxryz ; /usr/bin/python3
Dec 06 08:13:05 np0005548788.localdomain sudo[51391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:05 np0005548788.localdomain python3[51393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:06 np0005548788.localdomain sudo[51391]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:06 np0005548788.localdomain sudo[51434]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oltwzdempxbzgikdcadngieuaahuzucc ; /usr/bin/python3
Dec 06 08:13:06 np0005548788.localdomain sudo[51434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:06 np0005548788.localdomain python3[51436]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008785.636183-83774-170510570940108/source _original_basename=tmpqu6_d4ko follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:13:06 np0005548788.localdomain sudo[51434]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:07 np0005548788.localdomain sudo[51464]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kprblbhbnofwqeewtqajoiqzywxqobmt ; /usr/bin/python3
Dec 06 08:13:07 np0005548788.localdomain sudo[51464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:07 np0005548788.localdomain python3[51466]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:07 np0005548788.localdomain sudo[51464]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:07 np0005548788.localdomain sudo[51480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wefpgxxoszyitlaidfrrkstjhvcvkgkv ; /usr/bin/python3
Dec 06 08:13:07 np0005548788.localdomain sudo[51480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:08 np0005548788.localdomain sudo[51480]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:08 np0005548788.localdomain sudo[51567]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmdvuvbmzuhnmtgnmhsjymenmkpkzilo ; /usr/bin/python3
Dec 06 08:13:08 np0005548788.localdomain sudo[51567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:08 np0005548788.localdomain python3[51569]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:13:08 np0005548788.localdomain sudo[51567]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:08 np0005548788.localdomain sudo[51586]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljgchjtnxyacuoteicdxdzvriajyzszc ; /usr/bin/python3
Dec 06 08:13:08 np0005548788.localdomain sudo[51586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:09 np0005548788.localdomain python3[51588]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:13:09 np0005548788.localdomain sudo[51586]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:09 np0005548788.localdomain sudo[51602]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnvjdhzmruphvqmfzezjtqcpelpzvmre ; /usr/bin/python3
Dec 06 08:13:09 np0005548788.localdomain sudo[51602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:09 np0005548788.localdomain python3[51604]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005548788 step=1 update_config_hash_only=False
Dec 06 08:13:09 np0005548788.localdomain sudo[51602]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:09 np0005548788.localdomain sudo[51618]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olivtzcclhsrxuwaxpnypabpgnivbpyb ; /usr/bin/python3
Dec 06 08:13:09 np0005548788.localdomain sudo[51618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:10 np0005548788.localdomain python3[51620]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:10 np0005548788.localdomain sudo[51618]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:10 np0005548788.localdomain sudo[51634]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyvqnxsughgequfetgmsjkddlectcqfq ; /usr/bin/python3
Dec 06 08:13:10 np0005548788.localdomain sudo[51634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:10 np0005548788.localdomain python3[51636]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:13:10 np0005548788.localdomain sudo[51634]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:10 np0005548788.localdomain sudo[51650]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aozoywpbujttpnpgzfjxntriipauuvii ; /usr/bin/python3
Dec 06 08:13:10 np0005548788.localdomain sudo[51650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:10 np0005548788.localdomain python3[51652]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 08:13:11 np0005548788.localdomain sudo[51650]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:11 np0005548788.localdomain sudo[51691]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmdnbfjcowyynyeqaclarbnuqwahvyey ; /usr/bin/python3
Dec 06 08:13:11 np0005548788.localdomain sudo[51691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:11 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:13:12 np0005548788.localdomain podman[51874]: 2025-12-06 08:13:12.260419251 +0000 UTC m=+0.034109804 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:13:12 np0005548788.localdomain podman[51875]: 2025-12-06 08:13:12.26409138 +0000 UTC m=+0.034129925 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:12 np0005548788.localdomain podman[51883]: 2025-12-06 08:13:12.264777025 +0000 UTC m=+0.030803753 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:13:12 np0005548788.localdomain podman[51875]: 2025-12-06 08:13:12.3664224 +0000 UTC m=+0.136460925 container create 7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:13:12 np0005548788.localdomain podman[51883]: 2025-12-06 08:13:12.396400264 +0000 UTC m=+0.162427012 container create 3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_puppet_step1, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team)
Dec 06 08:13:12 np0005548788.localdomain systemd[1]: Started libpod-conmon-7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2.scope.
Dec 06 08:13:12 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/433650af555efa19b8507439c2ca3ea6c59dd70b79e1a09f2868d9a23cd1c65b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:12 np0005548788.localdomain podman[51922]: 2025-12-06 08:13:12.42928453 +0000 UTC m=+0.164635328 container create c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:13:12 np0005548788.localdomain podman[51874]: 2025-12-06 08:13:12.43577346 +0000 UTC m=+0.209464003 container create fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, container_name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, tcib_managed=true, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:13:12 np0005548788.localdomain systemd[1]: Started libpod-conmon-3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6.scope.
Dec 06 08:13:12 np0005548788.localdomain podman[51889]: 2025-12-06 08:13:12.444734913 +0000 UTC m=+0.203330182 container create 0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=container-puppet-crond, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Dec 06 08:13:12 np0005548788.localdomain podman[51889]: 2025-12-06 08:13:12.358388147 +0000 UTC m=+0.116983426 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:13:12 np0005548788.localdomain podman[51922]: 2025-12-06 08:13:12.363421925 +0000 UTC m=+0.098772763 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:13:12 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:12 np0005548788.localdomain systemd[1]: Started libpod-conmon-c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c.scope.
Dec 06 08:13:12 np0005548788.localdomain systemd[1]: Started libpod-conmon-0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476.scope.
Dec 06 08:13:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d623bf221e6e39ec968f36fc3f06f79e6b1927337c95facabcab11b35de0560d/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:12 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:12 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b131d40d9f7e4c77f700140905b9f02aae8796a802130306ce053e341f06500e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:12 np0005548788.localdomain podman[51883]: 2025-12-06 08:13:12.486908069 +0000 UTC m=+0.252934827 container init 3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z)
Dec 06 08:13:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1fbcfcf1579b596f592db44996077b25388e39ffb03069d7f60e867593509e2/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:12 np0005548788.localdomain podman[51922]: 2025-12-06 08:13:12.49392911 +0000 UTC m=+0.229279908 container init c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:13:12 np0005548788.localdomain podman[51883]: 2025-12-06 08:13:12.497772943 +0000 UTC m=+0.263799701 container start 3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container)
Dec 06 08:13:12 np0005548788.localdomain podman[51883]: 2025-12-06 08:13:12.498041698 +0000 UTC m=+0.264068456 container attach 3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, container_name=container-puppet-nova_libvirt, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc.)
Dec 06 08:13:12 np0005548788.localdomain podman[51922]: 2025-12-06 08:13:12.501527643 +0000 UTC m=+0.236878481 container start c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.component=openstack-collectd-container, release=1761123044, container_name=container-puppet-collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:13:12 np0005548788.localdomain podman[51922]: 2025-12-06 08:13:12.501782378 +0000 UTC m=+0.237133216 container attach c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 
17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:51:28Z)
Dec 06 08:13:13 np0005548788.localdomain podman[51875]: 2025-12-06 08:13:13.782974975 +0000 UTC m=+1.553013540 container init 7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public)
Dec 06 08:13:13 np0005548788.localdomain podman[51875]: 2025-12-06 08:13:13.796732079 +0000 UTC m=+1.566770634 container start 7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
config_id=tripleo_puppet_step1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1)
Dec 06 08:13:13 np0005548788.localdomain podman[51875]: 2025-12-06 08:13:13.797070318 +0000 UTC m=+1.567108883 container attach 7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=container-puppet-metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 
'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, release=1761123044, version=17.1.12)
Dec 06 08:13:13 np0005548788.localdomain podman[51889]: 2025-12-06 08:13:13.810369003 +0000 UTC m=+1.568964272 container init 0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4)
Dec 06 08:13:13 np0005548788.localdomain podman[51889]: 2025-12-06 08:13:13.82280497 +0000 UTC m=+1.581400259 container start 0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
distribution-scope=public, build-date=2025-11-18T22:49:32Z, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:13:13 np0005548788.localdomain podman[51889]: 2025-12-06 08:13:13.823099587 +0000 UTC m=+1.581694886 container attach 0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=container-puppet-crond, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:13:13 np0005548788.localdomain systemd[1]: Started libpod-conmon-fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3.scope.
Dec 06 08:13:13 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:13 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75558001a48f3327b2401ddbe77f60b90335541a0f5b86a4118ca1d021773542/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:13 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75558001a48f3327b2401ddbe77f60b90335541a0f5b86a4118ca1d021773542/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:13 np0005548788.localdomain podman[51874]: 2025-12-06 08:13:13.933306285 +0000 UTC m=+1.706996828 container init fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, version=17.1.12, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=container-puppet-iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:13:13 np0005548788.localdomain podman[51874]: 2025-12-06 08:13:13.945019506 +0000 UTC m=+1.718710009 container start fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, config_id=tripleo_puppet_step1, version=17.1.12)
Dec 06 08:13:13 np0005548788.localdomain podman[51874]: 2025-12-06 08:13:13.94795533 +0000 UTC m=+1.721645863 container attach fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=container-puppet-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Dec 06 08:13:14 np0005548788.localdomain podman[51776]: 2025-12-06 08:13:12.156507338 +0000 UTC m=+0.038279954 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 06 08:13:14 np0005548788.localdomain podman[52095]: 2025-12-06 08:13:14.779799418 +0000 UTC m=+0.065377856 container create d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:59Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-central, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, tcib_managed=true, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:13:14 np0005548788.localdomain systemd[1]: Started libpod-conmon-d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7.scope.
Dec 06 08:13:14 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:14 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f8522de440bc74dcb16bf54418e2bbfa2c88457c079b686ca75647399637b8a/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:14 np0005548788.localdomain podman[52095]: 2025-12-06 08:13:14.746125694 +0000 UTC m=+0.031704132 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 06 08:13:14 np0005548788.localdomain podman[52095]: 2025-12-06 08:13:14.846068742 +0000 UTC m=+0.131647170 container init d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.12, container_name=container-puppet-ceilometer, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:11:59Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-central, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, vcs-type=git)
Dec 06 08:13:14 np0005548788.localdomain podman[52095]: 2025-12-06 08:13:14.854904612 +0000 UTC m=+0.140483040 container start d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, container_name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-central-container, name=rhosp17/openstack-ceilometer-central, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_id=tripleo_puppet_step1, batch=17.1_20251118.1)
Dec 06 08:13:14 np0005548788.localdomain podman[52095]: 2025-12-06 08:13:14.855570136 +0000 UTC m=+0.141148604 container attach d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, name=rhosp17/openstack-ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 06 08:13:15 np0005548788.localdomain systemd[1]: tmp-crun.JVkdhF.mount: Deactivated successfully.
Dec 06 08:13:15 np0005548788.localdomain ovs-vsctl[52269]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52022]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52022]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52022]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52022]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52022]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52022]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: Accepting previously invalid value for target type 'Integer'
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.12 seconds
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}7eb0cfcdcbb76a8af5279c9066596f561d84055999d408c5a0b29801bec07a61'
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Notice: Applied catalog in 0.03 seconds
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Application:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:    Initial environment: production
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:    Converged environment: production
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:          Run mode: user
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Changes:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:             Total: 7
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Events:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:           Success: 7
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:             Total: 7
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Resources:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:           Skipped: 13
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:           Changed: 5
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:       Out of sync: 5
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:             Total: 20
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Time:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:              File: 0.02
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:    Transaction evaluation: 0.03
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:    Catalog application: 0.03
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:    Config retrieval: 0.15
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:          Last run: 1765008795
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:             Total: 0.03
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]: Version:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:            Config: 1765008795
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52037]:            Puppet: 7.10.0
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:    (file & line not available)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.10 seconds
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.07 seconds
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Dec 06 08:13:15 np0005548788.localdomain crontab[52456]: (root) LIST (root)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52064]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Dec 06 08:13:15 np0005548788.localdomain crontab[52459]: (root) REPLACE (root)
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Notice: Applied catalog in 0.08 seconds
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Application:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:    Initial environment: production
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:    Converged environment: production
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:          Run mode: user
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Changes:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:             Total: 2
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Events:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:           Success: 2
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:             Total: 2
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Resources:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:           Changed: 2
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:       Out of sync: 2
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:           Skipped: 7
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:             Total: 9
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Time:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:              File: 0.01
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:              Cron: 0.05
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:    Transaction evaluation: 0.07
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:    Catalog application: 0.08
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:    Config retrieval: 0.11
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:          Last run: 1765008795
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:             Total: 0.08
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]: Version:
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:            Config: 1765008795
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52047]:            Puppet: 7.10.0
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]: in a future release. Use nova::cinder::os_region_name instead
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52024]: in a future release. Use nova::cinder::catalog_info instead
Dec 06 08:13:15 np0005548788.localdomain puppet-user[52022]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.36 seconds
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2.scope: Consumed 2.134s CPU time.
Dec 06 08:13:16 np0005548788.localdomain podman[51875]: 2025-12-06 08:13:16.058336546 +0000 UTC m=+3.828375101 container died 7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-metrics_qdr)
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: tmp-crun.B8D88Z.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain podman[52494]: 2025-12-06 08:13:16.169154108 +0000 UTC m=+0.102616126 container cleanup 7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, container_name=container-puppet-metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4)
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-conmon-7747e4b72ebecc04191507c00df7562915ea8570f3228eb6c4bb965fdd785cb2.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Dec 06 08:13:16 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}dee3f10cb1ff461ac3f1e743a5ef3f06993398c6c829895de1dae7f242a64b39'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476.scope: Consumed 2.211s CPU time.
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Notice: Applied catalog in 0.25 seconds
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Application:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:    Initial environment: production
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:    Converged environment: production
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:          Run mode: user
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Changes:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:             Total: 43
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Events:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:           Success: 43
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:             Total: 43
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Resources:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:           Skipped: 14
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:           Changed: 38
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:       Out of sync: 38
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:             Total: 82
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Time:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:       Concat file: 0.00
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:              File: 0.09
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:    Transaction evaluation: 0.24
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:    Catalog application: 0.25
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:    Config retrieval: 0.43
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:          Last run: 1765008796
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:    Concat fragment: 0.00
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:             Total: 0.25
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]: Version:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:            Config: 1765008795
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52022]:            Puppet: 7.10.0
Dec 06 08:13:16 np0005548788.localdomain podman[51889]: 2025-12-06 08:13:16.333423018 +0000 UTC m=+4.092018287 container died 0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1)
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]: Notice: Applied catalog in 0.48 seconds
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]: Application:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:    Initial environment: production
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:    Converged environment: production
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:          Run mode: user
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]: Changes:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:             Total: 4
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]: Events:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:           Success: 4
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:             Total: 4
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]: Resources:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:           Changed: 4
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:       Out of sync: 4
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:           Skipped: 8
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:             Total: 13
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]: Time:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:              File: 0.00
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:              Exec: 0.05
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:    Config retrieval: 0.13
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:            Augeas: 0.40
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:    Transaction evaluation: 0.46
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:    Catalog application: 0.48
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:          Last run: 1765008796
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:             Total: 0.48
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]: Version:
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:            Config: 1765008795
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52064]:            Puppet: 7.10.0
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-433650af555efa19b8507439c2ca3ea6c59dd70b79e1a09f2868d9a23cd1c65b-merged.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a1fbcfcf1579b596f592db44996077b25388e39ffb03069d7f60e867593509e2-merged.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain podman[52558]: 2025-12-06 08:13:16.404620458 +0000 UTC m=+0.113353516 container cleanup 0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, 
vcs-type=git, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, container_name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-conmon-0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Dec 06 08:13:16 np0005548788.localdomain podman[52639]: 2025-12-06 08:13:16.627168162 +0000 UTC m=+0.068102445 container create 6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6.scope.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c.scope: Consumed 2.670s CPU time.
Dec 06 08:13:16 np0005548788.localdomain podman[51922]: 2025-12-06 08:13:16.676703707 +0000 UTC m=+4.412054515 container died c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, io.openshift.expose-services=, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e09d5332b76f4ff47d3f47a68c6210f077613857ce41c59afde7fed1d48f940c/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:16 np0005548788.localdomain podman[52639]: 2025-12-06 08:13:16.588980741 +0000 UTC m=+0.029915104 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:13:16 np0005548788.localdomain podman[52639]: 2025-12-06 08:13:16.703068293 +0000 UTC m=+0.144002576 container init 6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-rsyslog, architecture=x86_64, release=1761123044, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, container_name=container-puppet-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Dec 06 08:13:16 np0005548788.localdomain podman[52639]: 2025-12-06 08:13:16.716707706 +0000 UTC m=+0.157641989 container start 6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-rsyslog, tcib_managed=true, config_id=tripleo_puppet_step1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z)
Dec 06 08:13:16 np0005548788.localdomain podman[52639]: 2025-12-06 08:13:16.71784462 +0000 UTC m=+0.158778903 container attach 6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, batch=17.1_20251118.1, container_name=container-puppet-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3.scope: Consumed 2.540s CPU time.
Dec 06 08:13:16 np0005548788.localdomain podman[51874]: 2025-12-06 08:13:16.784457512 +0000 UTC m=+4.558148045 container died fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, name=rhosp17/openstack-iscsid, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=container-puppet-iscsid)
Dec 06 08:13:16 np0005548788.localdomain podman[52720]: 2025-12-06 08:13:16.801259373 +0000 UTC m=+0.113203083 container cleanup c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=container-puppet-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-conmon-c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:13:16 np0005548788.localdomain podman[52781]: 2025-12-06 08:13:16.845820811 +0000 UTC m=+0.057120909 container cleanup fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, release=1761123044, name=rhosp17/openstack-iscsid, tcib_managed=true, container_name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_puppet_step1)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: libpod-conmon-fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:13:16 np0005548788.localdomain podman[52753]: 2025-12-06 08:13:16.868268053 +0000 UTC m=+0.129516904 container create 16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=container-puppet-ovn_controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4)
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52138]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52138]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52138]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52138]:    (file & line not available)
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56.scope.
Dec 06 08:13:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a1b47ffdfc6345204e58005adbebe0d6f3492126b8c0115e7ce0b40d2b42062/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a1b47ffdfc6345204e58005adbebe0d6f3492126b8c0115e7ce0b40d2b42062/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52138]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52138]:    (file & line not available)
Dec 06 08:13:16 np0005548788.localdomain podman[52753]: 2025-12-06 08:13:16.910309637 +0000 UTC m=+0.171558488 container init 16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-ovn_controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller)
Dec 06 08:13:16 np0005548788.localdomain podman[52753]: 2025-12-06 08:13:16.82250588 +0000 UTC m=+0.083754751 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:13:16 np0005548788.localdomain puppet-user[52024]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 1.22 seconds
Dec 06 08:13:16 np0005548788.localdomain podman[52753]: 2025-12-06 08:13:16.968769693 +0000 UTC m=+0.230018534 container start 16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-type=git)
Dec 06 08:13:16 np0005548788.localdomain podman[52753]: 2025-12-06 08:13:16.969048279 +0000 UTC m=+0.230297120 container attach 16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_puppet_step1, container_name=container-puppet-ovn_controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.39 seconds
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}b5992c61c5e6c0fa60ac7720677a0efdfb73ceba695978e2f56794a0d035436f'
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Warning: Empty environment setting 'TLS_PASSWORD'
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}8f9f91b7bc846aa12da1e2df7356fc45f862596082e133d7976104ee8d1893c1'
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-75558001a48f3327b2401ddbe77f60b90335541a0f5b86a4118ca1d021773542-merged.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fae32f24db3d230b41d5a4a7c9e1c3fbb4eaf1d6f866f454d629616e234c5bf3-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b131d40d9f7e4c77f700140905b9f02aae8796a802130306ce053e341f06500e-merged.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c575f685ef762309e2a79c8b05c6749b4167388b82825f407ae593653ceaac0c-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Notice: Applied catalog in 0.44 seconds
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Application:
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:    Initial environment: production
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:    Converged environment: production
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:          Run mode: user
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Changes:
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:             Total: 31
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Events:
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:           Success: 31
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:             Total: 31
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Resources:
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:           Skipped: 22
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:           Changed: 31
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:       Out of sync: 31
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:             Total: 151
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Time:
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:           Package: 0.01
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:    Ceilometer config: 0.36
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:    Transaction evaluation: 0.42
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:    Catalog application: 0.44
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:    Config retrieval: 0.45
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:          Last run: 1765008797
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:         Resources: 0.00
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:             Total: 0.44
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]: Version:
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:            Config: 1765008796
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52138]:            Puppet: 7.10.0
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Dec 06 08:13:17 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain systemd[1]: libpod-d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7.scope: Deactivated successfully.
Dec 06 08:13:18 np0005548788.localdomain systemd[1]: libpod-d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7.scope: Consumed 2.947s CPU time.
Dec 06 08:13:18 np0005548788.localdomain podman[52095]: 2025-12-06 08:13:18.312605354 +0000 UTC m=+3.598183812 container died d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:59Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:13:18 np0005548788.localdomain systemd[1]: tmp-crun.F6O9Xx.mount: Deactivated successfully.
Dec 06 08:13:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9f8522de440bc74dcb16bf54418e2bbfa2c88457c079b686ca75647399637b8a-merged.mount: Deactivated successfully.
Dec 06 08:13:18 np0005548788.localdomain podman[52987]: 2025-12-06 08:13:18.439472322 +0000 UTC m=+0.114951192 container cleanup d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-central, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_puppet_step1, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., container_name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:13:18 np0005548788.localdomain systemd[1]: libpod-conmon-d752e30167ca8b7c1f5c838fb61ead2d9f57dc51f999b72016f649d8933343b7.scope: Deactivated successfully.
Dec 06 08:13:18 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52788]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52788]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52788]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52788]:    (file & line not available)
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52788]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52788]:    (file & line not available)
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52906]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52906]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52906]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52906]:    (file & line not available)
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52906]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52906]:    (file & line not available)
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52788]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.26 seconds
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Dec 06 08:13:18 np0005548788.localdomain puppet-user[52906]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.28 seconds
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53163]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53165]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53167]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.106
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}bedfffb4c1dbb89b8df495816ea22af56baa02d6ef80861fcf7dc6b324f7e3b1'
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Notice: Applied catalog in 0.26 seconds
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Application:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:    Initial environment: production
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:    Converged environment: production
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:          Run mode: user
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Changes:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:             Total: 3
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Events:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:           Success: 3
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:             Total: 3
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Resources:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:           Skipped: 11
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:           Changed: 3
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:       Out of sync: 3
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:             Total: 25
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Time:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:       Concat file: 0.00
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:    Concat fragment: 0.00
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:              File: 0.14
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:    Transaction evaluation: 0.25
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:    Catalog application: 0.26
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:    Config retrieval: 0.31
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:          Last run: 1765008799
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:             Total: 0.26
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]: Version:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:            Config: 1765008798
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52788]:            Puppet: 7.10.0
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53170]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005548788.localdomain
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005548788.novalocal' to 'np0005548788.localdomain'
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53172]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}9f797f9d49cf12085061840a6e15e35ef08aaf3c80bbe03bcf23d28dd55767ae'
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53179]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53182]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53187]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53189]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53191]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53193]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:24:cc:0d
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53218]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53222]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain ovs-vsctl[53229]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain systemd[1]: libpod-6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6.scope: Deactivated successfully.
Dec 06 08:13:19 np0005548788.localdomain systemd[1]: libpod-6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6.scope: Consumed 2.441s CPU time.
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain podman[52639]: 2025-12-06 08:13:19.467568528 +0000 UTC m=+2.908502801 container died 6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-rsyslog-container, version=17.1.12, name=rhosp17/openstack-rsyslog, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Notice: Applied catalog in 0.46 seconds
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Application:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:    Initial environment: production
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:    Converged environment: production
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:          Run mode: user
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Changes:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:             Total: 14
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Events:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:           Success: 14
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:             Total: 14
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Resources:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:           Skipped: 12
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:           Changed: 14
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:       Out of sync: 14
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:             Total: 29
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Time:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:              Exec: 0.02
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:    Config retrieval: 0.31
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:         Vs config: 0.39
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:    Transaction evaluation: 0.45
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:    Catalog application: 0.46
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:          Last run: 1765008799
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:             Total: 0.46
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]: Version:
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:            Config: 1765008798
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52906]:            Puppet: 7.10.0
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548788.localdomain systemd[1]: libpod-16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56.scope: Deactivated successfully.
Dec 06 08:13:19 np0005548788.localdomain systemd[1]: libpod-16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56.scope: Consumed 2.782s CPU time.
Dec 06 08:13:19 np0005548788.localdomain podman[52753]: 2025-12-06 08:13:19.847895082 +0000 UTC m=+3.109143933 container died 16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-ovn_controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Dec 06 08:13:20 np0005548788.localdomain podman[52925]: 2025-12-06 08:13:17.082709902 +0000 UTC m=+0.034011512 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 06 08:13:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5a1b47ffdfc6345204e58005adbebe0d6f3492126b8c0115e7ce0b40d2b42062-merged.mount: Deactivated successfully.
Dec 06 08:13:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:20 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Dec 06 08:13:20 np0005548788.localdomain podman[53286]: 2025-12-06 08:13:20.474854416 +0000 UTC m=+0.618319900 container cleanup 16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:13:20 np0005548788.localdomain podman[53241]: 2025-12-06 08:13:20.489184554 +0000 UTC m=+1.015177909 container cleanup 6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, container_name=container-puppet-rsyslog, tcib_managed=true, config_id=tripleo_puppet_step1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:13:20 np0005548788.localdomain systemd[1]: libpod-conmon-16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56.scope: Deactivated successfully.
Dec 06 08:13:20 np0005548788.localdomain systemd[1]: libpod-conmon-6843b1666c16f25391c238bc56a2f476e5c17d20de4dcf44cab5a21eeccdbed6.scope: Deactivated successfully.
Dec 06 08:13:20 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:13:20 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:13:20 np0005548788.localdomain podman[53348]: 2025-12-06 08:13:20.644218416 +0000 UTC m=+0.074867430 container create 7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-19T00:23:27Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-neutron, summary=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-server, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1)
Dec 06 08:13:20 np0005548788.localdomain systemd[1]: Started libpod-conmon-7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07.scope.
Dec 06 08:13:20 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:20 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b72aa0b501df4a0ecf9e2c06f7d460b982575e1ef57771c6fbb48ab40682d902/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:20 np0005548788.localdomain podman[53348]: 2025-12-06 08:13:20.60860541 +0000 UTC m=+0.039254404 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 06 08:13:20 np0005548788.localdomain podman[53348]: 2025-12-06 08:13:20.708960687 +0000 UTC m=+0.139609672 container init 7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, container_name=container-puppet-neutron, vendor=Red Hat, Inc., build-date=2025-11-19T00:23:27Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-server-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, version=17.1.12, vcs-type=git, architecture=x86_64, config_id=tripleo_puppet_step1, name=rhosp17/openstack-neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Dec 06 08:13:20 np0005548788.localdomain podman[53348]: 2025-12-06 08:13:20.717281656 +0000 UTC m=+0.147930620 container start 7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, build-date=2025-11-19T00:23:27Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, managed_by=tripleo_ansible, maintainer=OpenStack 
TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-neutron, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:13:20 np0005548788.localdomain podman[53348]: 2025-12-06 08:13:20.717852338 +0000 UTC m=+0.148501332 container attach 7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_id=tripleo_puppet_step1, container_name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4, build-date=2025-11-19T00:23:27Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-server-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-server, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, vcs-type=git)
Dec 06 08:13:20 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Dec 06 08:13:20 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Dec 06 08:13:20 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e09d5332b76f4ff47d3f47a68c6210f077613857ce41c59afde7fed1d48f940c-merged.mount: Deactivated successfully.
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3ccd56cc76ec60fa08fd698d282c9c89b1e8c485a00f47d57569ed8f6f8a16e4'
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Notice: Applied catalog in 4.78 seconds
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Application:
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Initial environment: production
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Converged environment: production
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:          Run mode: user
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Changes:
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:             Total: 183
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Events:
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:           Success: 183
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:             Total: 183
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Resources:
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:           Changed: 183
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:       Out of sync: 183
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:           Skipped: 57
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:             Total: 487
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Time:
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:       Concat file: 0.00
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Concat fragment: 0.00
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:            Anchor: 0.00
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:         File line: 0.00
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Virtlogd config: 0.00
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Virtstoraged config: 0.01
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Virtsecretd config: 0.02
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:              Exec: 0.02
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Virtqemud config: 0.02
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:           Package: 0.02
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:              File: 0.03
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Virtproxyd config: 0.03
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Virtnodedevd config: 0.05
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:            Augeas: 1.10
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Config retrieval: 1.47
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:          Last run: 1765008801
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:       Nova config: 3.27
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Transaction evaluation: 4.76
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:    Catalog application: 4.78
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:         Resources: 0.00
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:             Total: 4.78
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]: Version:
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:            Config: 1765008795
Dec 06 08:13:21 np0005548788.localdomain puppet-user[52024]:            Puppet: 7.10.0
Dec 06 08:13:22 np0005548788.localdomain puppet-user[53403]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Dec 06 08:13:22 np0005548788.localdomain puppet-user[53403]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:22 np0005548788.localdomain puppet-user[53403]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:22 np0005548788.localdomain puppet-user[53403]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:22 np0005548788.localdomain puppet-user[53403]:    (file & line not available)
Dec 06 08:13:22 np0005548788.localdomain puppet-user[53403]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:22 np0005548788.localdomain puppet-user[53403]:    (file & line not available)
Dec 06 08:13:22 np0005548788.localdomain puppet-user[53403]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Dec 06 08:13:22 np0005548788.localdomain systemd[1]: libpod-3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6.scope: Deactivated successfully.
Dec 06 08:13:22 np0005548788.localdomain systemd[1]: libpod-3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6.scope: Consumed 8.736s CPU time.
Dec 06 08:13:22 np0005548788.localdomain podman[51883]: 2025-12-06 08:13:22.982872709 +0000 UTC m=+10.748899487 container died 3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, container_name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just 
include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 08:13:23 np0005548788.localdomain systemd[1]: tmp-crun.xpGBlR.mount: Deactivated successfully.
Dec 06 08:13:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d623bf221e6e39ec968f36fc3f06f79e6b1927337c95facabcab11b35de0560d-merged.mount: Deactivated successfully.
Dec 06 08:13:23 np0005548788.localdomain podman[53515]: 2025-12-06 08:13:23.150798808 +0000 UTC m=+0.158583100 container cleanup 3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, container_name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., release=1761123044)
Dec 06 08:13:23 np0005548788.localdomain systemd[1]: libpod-conmon-3af232aad55edd33e14f05858ec267c1ff5fc15cddab19f95847e021ff3d2fa6.scope: Deactivated successfully.
Dec 06 08:13:23 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.62 seconds
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Notice: Applied catalog in 0.60 seconds
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Application:
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:    Initial environment: production
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:    Converged environment: production
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:          Run mode: user
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Changes:
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:             Total: 33
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Events:
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:           Success: 33
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:             Total: 33
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Resources:
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:           Skipped: 21
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:           Changed: 33
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:       Out of sync: 33
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:             Total: 155
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Time:
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:         Resources: 0.00
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:    Ovn metadata agent config: 0.05
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:    Neutron config: 0.48
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:    Transaction evaluation: 0.59
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:    Catalog application: 0.60
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:    Config retrieval: 0.69
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:          Last run: 1765008803
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:             Total: 0.60
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]: Version:
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:            Config: 1765008802
Dec 06 08:13:23 np0005548788.localdomain puppet-user[53403]:            Puppet: 7.10.0
Dec 06 08:13:24 np0005548788.localdomain systemd[1]: libpod-7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07.scope: Deactivated successfully.
Dec 06 08:13:24 np0005548788.localdomain systemd[1]: libpod-7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07.scope: Consumed 3.613s CPU time.
Dec 06 08:13:24 np0005548788.localdomain podman[53586]: 2025-12-06 08:13:24.557683954 +0000 UTC m=+0.036464394 container died 7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=container-puppet-neutron, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, release=1761123044, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-neutron-server-container, summary=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-server)
Dec 06 08:13:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b72aa0b501df4a0ecf9e2c06f7d460b982575e1ef57771c6fbb48ab40682d902-merged.mount: Deactivated successfully.
Dec 06 08:13:24 np0005548788.localdomain podman[53586]: 2025-12-06 08:13:24.630913288 +0000 UTC m=+0.109693708 container cleanup 7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, container_name=container-puppet-neutron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, build-date=2025-11-19T00:23:27Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-server-container, architecture=x86_64, release=1761123044, version=17.1.12, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 
'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team)
Dec 06 08:13:24 np0005548788.localdomain systemd[1]: libpod-conmon-7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07.scope: Deactivated successfully.
Dec 06 08:13:24 np0005548788.localdomain python3[51693]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548788 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548788', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro 
--volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 06 08:13:24 np0005548788.localdomain sudo[51691]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:25 np0005548788.localdomain sudo[53638]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyyapchagwhicunllbzfmnefmpwyfiko ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:25 np0005548788.localdomain sudo[53638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:25 np0005548788.localdomain python3[53640]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:25 np0005548788.localdomain sudo[53638]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:25 np0005548788.localdomain sudo[53654]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vukefqkftaqtwaiqnhmivbkxaedfnjli ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:25 np0005548788.localdomain sudo[53654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:26 np0005548788.localdomain sudo[53654]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:26 np0005548788.localdomain sudo[53670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhfqtjborwijodbkeqcacokbmqcgezdx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:26 np0005548788.localdomain sudo[53670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:26 np0005548788.localdomain python3[53672]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:13:26 np0005548788.localdomain sudo[53670]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:26 np0005548788.localdomain sudo[53720]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzcffaqylfmgbrqjvfpzxktvrweygetx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:26 np0005548788.localdomain sudo[53720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:27 np0005548788.localdomain python3[53722]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:27 np0005548788.localdomain sudo[53720]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:27 np0005548788.localdomain sudo[53763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzmhkwwdoegpdwafhqmuqhprjdkypvfi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:27 np0005548788.localdomain sudo[53763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:27 np0005548788.localdomain python3[53765]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008806.8135343-84342-241879445773116/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:27 np0005548788.localdomain sudo[53763]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:27 np0005548788.localdomain sudo[53825]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgmefgmochjctuubdfngjmogklpweqww ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:27 np0005548788.localdomain sudo[53825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:27 np0005548788.localdomain python3[53827]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:28 np0005548788.localdomain sudo[53825]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:28 np0005548788.localdomain sudo[53868]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwypkymfpbuarkeqchbvnqfpalqlotup ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:28 np0005548788.localdomain sudo[53868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:28 np0005548788.localdomain python3[53870]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008807.6525095-84342-50558010115972/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:28 np0005548788.localdomain sudo[53868]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:28 np0005548788.localdomain sudo[53930]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsvjipippgyoubhgvalpzryuqnummyul ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:28 np0005548788.localdomain sudo[53930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:28 np0005548788.localdomain python3[53932]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:28 np0005548788.localdomain sudo[53930]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:29 np0005548788.localdomain sudo[53973]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbjdfdzfvyjpfvfsljxdvhxtwnlfdske ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:29 np0005548788.localdomain sudo[53973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:29 np0005548788.localdomain python3[53975]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008808.6268942-84412-96965139062303/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:29 np0005548788.localdomain sudo[53973]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:29 np0005548788.localdomain sudo[54035]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jofecakuvdtfvbtaukiwylolvgjavlvy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:29 np0005548788.localdomain sudo[54035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:29 np0005548788.localdomain python3[54037]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:29 np0005548788.localdomain sudo[54035]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:30 np0005548788.localdomain sudo[54078]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvhiolcztsbwtbaxnjtjzzlwquxhounq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:30 np0005548788.localdomain sudo[54078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:30 np0005548788.localdomain python3[54080]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008809.5456593-84478-84607990682979/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:30 np0005548788.localdomain sudo[54078]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:30 np0005548788.localdomain sudo[54108]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wirxenqfpruhujwujiwcvhoxlxkarnbw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:30 np0005548788.localdomain sudo[54108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:30 np0005548788.localdomain python3[54110]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:13:30 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:13:30 np0005548788.localdomain systemd-sysv-generator[54138]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:30 np0005548788.localdomain systemd-rc-local-generator[54133]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:30 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:31 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:13:31 np0005548788.localdomain systemd-rc-local-generator[54171]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:31 np0005548788.localdomain systemd-sysv-generator[54177]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:31 np0005548788.localdomain systemd[1]: Starting TripleO Container Shutdown...
Dec 06 08:13:31 np0005548788.localdomain systemd[1]: Finished TripleO Container Shutdown.
Dec 06 08:13:31 np0005548788.localdomain sudo[54108]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:31 np0005548788.localdomain sudo[54232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfwjlflehkrpsniprxfbpujxylcclmhr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:31 np0005548788.localdomain sudo[54232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:31 np0005548788.localdomain python3[54234]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:31 np0005548788.localdomain sudo[54232]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:32 np0005548788.localdomain sudo[54275]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjlorfrtbbtojepgpcxdhecuvmaoevqt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:32 np0005548788.localdomain sudo[54275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:32 np0005548788.localdomain python3[54277]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008811.4878993-84525-226146399720460/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:32 np0005548788.localdomain sudo[54275]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:32 np0005548788.localdomain sudo[54337]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vomhucgsienparkuhepffdokysatfhim ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:32 np0005548788.localdomain sudo[54337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:32 np0005548788.localdomain python3[54339]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:32 np0005548788.localdomain sudo[54337]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:33 np0005548788.localdomain sudo[54380]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqxiepkebjaezwghmvkobnkpqhuyjvfc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:33 np0005548788.localdomain sudo[54380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:33 np0005548788.localdomain python3[54382]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008812.4750502-84550-63715529706594/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:33 np0005548788.localdomain sudo[54380]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:33 np0005548788.localdomain sudo[54410]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgkfcnfmjijokqiqpteqooptxfboplph ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:33 np0005548788.localdomain sudo[54410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:33 np0005548788.localdomain python3[54412]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:13:33 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:13:33 np0005548788.localdomain systemd-rc-local-generator[54438]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:33 np0005548788.localdomain systemd-sysv-generator[54441]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:34 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:34 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:13:34 np0005548788.localdomain systemd-sysv-generator[54475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:34 np0005548788.localdomain systemd-rc-local-generator[54471]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:34 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:34 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:13:34 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:13:34 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:13:34 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:13:34 np0005548788.localdomain sudo[54410]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:34 np0005548788.localdomain sudo[54503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srlyjjmybwvtebpfyjgdcbyucjowuuxf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:34 np0005548788.localdomain sudo[54503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 81961d6936cf88d92c0300cf23428c94
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: da9a0dc7b40588672419e3ce10063e21
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: c6dd5c7aeba6260998a0bbde3ab20933
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: d3b0a004e533211bab6cc44495102b19
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 1d18e9db1b81af61c21222485fd9085f
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 1d18e9db1b81af61c21222485fd9085f
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: dc659970751309b021f4b1201ffad0ee
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain python3[54505]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 558ed7a6d0c1bb3d92c212dc57d9717b
Dec 06 08:13:35 np0005548788.localdomain sudo[54503]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:35 np0005548788.localdomain sudo[54519]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnvfvsbzlrevbjjbtyalbrsooclpwnea ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:35 np0005548788.localdomain sudo[54519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:35 np0005548788.localdomain sudo[54519]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:36 np0005548788.localdomain sudo[54561]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdpebwrqbolduwsphzumxqzszfrnrjgb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:36 np0005548788.localdomain sudo[54561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:36 np0005548788.localdomain python3[54563]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:13:36 np0005548788.localdomain podman[54601]: 2025-12-06 08:13:36.824626801 +0000 UTC m=+0.093328922 container create 7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr_init_logs, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:13:36 np0005548788.localdomain systemd[1]: Started libpod-conmon-7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486.scope.
Dec 06 08:13:36 np0005548788.localdomain podman[54601]: 2025-12-06 08:13:36.781414661 +0000 UTC m=+0.050116812 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:36 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:36 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6697f986b9c629bb25e894d1903f9f626e3727e4d7efcca4b461d4a3e304ed0e/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:36 np0005548788.localdomain podman[54601]: 2025-12-06 08:13:36.909507572 +0000 UTC m=+0.178209703 container init 7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step1, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 08:13:36 np0005548788.localdomain systemd[1]: tmp-crun.XkxwEr.mount: Deactivated successfully.
Dec 06 08:13:36 np0005548788.localdomain podman[54601]: 2025-12-06 08:13:36.920753808 +0000 UTC m=+0.189455929 container start 7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=metrics_qdr_init_logs)
Dec 06 08:13:36 np0005548788.localdomain podman[54601]: 2025-12-06 08:13:36.921318375 +0000 UTC m=+0.190020546 container attach 7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, release=1761123044, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=metrics_qdr_init_logs, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:13:36 np0005548788.localdomain systemd[1]: libpod-7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486.scope: Deactivated successfully.
Dec 06 08:13:36 np0005548788.localdomain podman[54601]: 2025-12-06 08:13:36.930801187 +0000 UTC m=+0.199503338 container died 7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr_init_logs, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.expose-services=, tcib_managed=true, vcs-type=git)
Dec 06 08:13:37 np0005548788.localdomain podman[54621]: 2025-12-06 08:13:37.012794739 +0000 UTC m=+0.072342516 container cleanup 7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, container_name=metrics_qdr_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git)
Dec 06 08:13:37 np0005548788.localdomain systemd[1]: libpod-conmon-7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486.scope: Deactivated successfully.
Dec 06 08:13:37 np0005548788.localdomain python3[54563]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Dec 06 08:13:37 np0005548788.localdomain podman[54698]: 2025-12-06 08:13:37.541450701 +0000 UTC m=+0.113148002 container create 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:13:37 np0005548788.localdomain podman[54698]: 2025-12-06 08:13:37.474389328 +0000 UTC m=+0.046086659 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:37 np0005548788.localdomain systemd[1]: Started libpod-conmon-3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.scope.
Dec 06 08:13:37 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:37 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33d4ba4a6e0259b5150b68a23f46c9e702457315d900e4a8419ae01ffeed1203/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:37 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33d4ba4a6e0259b5150b68a23f46c9e702457315d900e4a8419ae01ffeed1203/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:13:37 np0005548788.localdomain podman[54698]: 2025-12-06 08:13:37.642290773 +0000 UTC m=+0.213988074 container init 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, config_id=tripleo_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team)
Dec 06 08:13:37 np0005548788.localdomain sudo[54718]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:13:37 np0005548788.localdomain sudo[54718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Dec 06 08:13:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:13:37 np0005548788.localdomain podman[54698]: 2025-12-06 08:13:37.684554013 +0000 UTC m=+0.256251284 container start 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, version=17.1.12, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:13:37 np0005548788.localdomain python3[54563]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=81961d6936cf88d92c0300cf23428c94 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:37 np0005548788.localdomain sudo[54718]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6697f986b9c629bb25e894d1903f9f626e3727e4d7efcca4b461d4a3e304ed0e-merged.mount: Deactivated successfully.
Dec 06 08:13:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:37 np0005548788.localdomain podman[54720]: 2025-12-06 08:13:37.834604288 +0000 UTC m=+0.135746526 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, tcib_managed=true, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Dec 06 08:13:37 np0005548788.localdomain sudo[54561]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:38 np0005548788.localdomain podman[54720]: 2025-12-06 08:13:38.053601934 +0000 UTC m=+0.354744172 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Dec 06 08:13:38 np0005548788.localdomain sudo[54790]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiwmfctztgkxtaqlyvilxoyqauisaxjw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:38 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:13:38 np0005548788.localdomain sudo[54790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:38 np0005548788.localdomain python3[54792]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:38 np0005548788.localdomain sudo[54790]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:38 np0005548788.localdomain sudo[54806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okpbqksxklagyiusynbjsguaxpgrftpk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:38 np0005548788.localdomain sudo[54806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:38 np0005548788.localdomain python3[54808]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:13:38 np0005548788.localdomain sudo[54806]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:38 np0005548788.localdomain sudo[54867]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sscsjlmtuhfmdfupnjktrchyoavuxxwr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:38 np0005548788.localdomain sudo[54867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:39 np0005548788.localdomain python3[54869]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008818.5319493-84727-249327246223503/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:39 np0005548788.localdomain sudo[54867]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:39 np0005548788.localdomain sudo[54883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkhaawfgxvwpblizldlyfxcpeoiboqih ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:39 np0005548788.localdomain sudo[54883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:39 np0005548788.localdomain python3[54885]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 08:13:39 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:13:39 np0005548788.localdomain systemd-sysv-generator[54915]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:39 np0005548788.localdomain systemd-rc-local-generator[54909]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:39 np0005548788.localdomain sudo[54883]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:40 np0005548788.localdomain sudo[54935]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojbuupazihxbblyopjucqzhlkvziftsd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:40 np0005548788.localdomain sudo[54935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:40 np0005548788.localdomain python3[54937]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:13:41 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:13:41 np0005548788.localdomain systemd-sysv-generator[54967]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:41 np0005548788.localdomain systemd-rc-local-generator[54964]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:41 np0005548788.localdomain systemd[1]: Starting metrics_qdr container...
Dec 06 08:13:41 np0005548788.localdomain systemd[1]: Started metrics_qdr container.
Dec 06 08:13:41 np0005548788.localdomain sudo[54935]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:42 np0005548788.localdomain sudo[55017]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huweobbsidsrnvrhxxlmkukhkcbjcagt ; /usr/bin/python3
Dec 06 08:13:42 np0005548788.localdomain sudo[55017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:42 np0005548788.localdomain python3[55019]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:42 np0005548788.localdomain sudo[55017]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:42 np0005548788.localdomain sudo[55065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkatfypbcpqstqhcwtfdjoaoykvwvgfw ; /usr/bin/python3
Dec 06 08:13:42 np0005548788.localdomain sudo[55065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:42 np0005548788.localdomain sudo[55065]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:43 np0005548788.localdomain sudo[55108]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpuzhdulxbggafxuvtcdklrmixhzdcde ; /usr/bin/python3
Dec 06 08:13:43 np0005548788.localdomain sudo[55108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:43 np0005548788.localdomain sudo[55108]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:43 np0005548788.localdomain sudo[55138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqmsxrcaoccobhqvnbpddbavpxsizehp ; /usr/bin/python3
Dec 06 08:13:43 np0005548788.localdomain sudo[55138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:43 np0005548788.localdomain python3[55140]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005548788 step=1 update_config_hash_only=False
Dec 06 08:13:43 np0005548788.localdomain sudo[55138]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:44 np0005548788.localdomain sudo[55154]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xraidmbflrocgyxxqpofhvcpxouvcwdb ; /usr/bin/python3
Dec 06 08:13:44 np0005548788.localdomain sudo[55154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:44 np0005548788.localdomain python3[55156]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:44 np0005548788.localdomain sudo[55154]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:44 np0005548788.localdomain sudo[55170]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxcgvrofaindjsfneltvbxsmfwlcpedt ; /usr/bin/python3
Dec 06 08:13:44 np0005548788.localdomain sudo[55170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:44 np0005548788.localdomain python3[55172]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:13:44 np0005548788.localdomain sudo[55170]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:57 np0005548788.localdomain sshd[55173]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:13:59 np0005548788.localdomain sshd[55173]: Received disconnect from 152.32.172.117 port 35124:11: Bye Bye [preauth]
Dec 06 08:13:59 np0005548788.localdomain sshd[55173]: Disconnected from authenticating user root 152.32.172.117 port 35124 [preauth]
Dec 06 08:14:01 np0005548788.localdomain sudo[55175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:14:01 np0005548788.localdomain sudo[55175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:14:01 np0005548788.localdomain sudo[55175]: pam_unix(sudo:session): session closed for user root
Dec 06 08:14:01 np0005548788.localdomain sudo[55190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:14:01 np0005548788.localdomain sudo[55190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:14:02 np0005548788.localdomain sudo[55190]: pam_unix(sudo:session): session closed for user root
Dec 06 08:14:03 np0005548788.localdomain sudo[55237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:14:03 np0005548788.localdomain sudo[55237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:14:03 np0005548788.localdomain sudo[55237]: pam_unix(sudo:session): session closed for user root
Dec 06 08:14:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:14:08 np0005548788.localdomain podman[55252]: 2025-12-06 08:14:08.257817662 +0000 UTC m=+0.086754679 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:14:08 np0005548788.localdomain podman[55252]: 2025-12-06 08:14:08.442742781 +0000 UTC m=+0.271679778 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, 
config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:14:08 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:14:08 np0005548788.localdomain sshd[55282]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:14:13 np0005548788.localdomain sshd[55282]: Received disconnect from 102.140.97.134 port 34652:11: Bye Bye [preauth]
Dec 06 08:14:13 np0005548788.localdomain sshd[55282]: Disconnected from authenticating user root 102.140.97.134 port 34652 [preauth]
Dec 06 08:14:29 np0005548788.localdomain sshd[55284]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:14:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:14:39 np0005548788.localdomain podman[55285]: 2025-12-06 08:14:39.253598168 +0000 UTC m=+0.083968663 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, 
io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Dec 06 08:14:39 np0005548788.localdomain podman[55285]: 2025-12-06 08:14:39.440323381 +0000 UTC m=+0.270693816 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, version=17.1.12, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:14:39 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:14:39 np0005548788.localdomain sshd[55284]: error: kex_exchange_identification: read: Connection timed out
Dec 06 08:14:39 np0005548788.localdomain sshd[55284]: banner exchange: Connection from 14.103.142.227 port 46174: Connection timed out
Dec 06 08:15:03 np0005548788.localdomain sudo[55313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:15:03 np0005548788.localdomain sudo[55313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:15:03 np0005548788.localdomain sudo[55313]: pam_unix(sudo:session): session closed for user root
Dec 06 08:15:03 np0005548788.localdomain sudo[55328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:15:03 np0005548788.localdomain sudo[55328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:15:03 np0005548788.localdomain sudo[55328]: pam_unix(sudo:session): session closed for user root
Dec 06 08:15:04 np0005548788.localdomain sudo[55374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:15:04 np0005548788.localdomain sudo[55374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:15:04 np0005548788.localdomain sudo[55374]: pam_unix(sudo:session): session closed for user root
Dec 06 08:15:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:15:10 np0005548788.localdomain systemd[1]: tmp-crun.krdRXv.mount: Deactivated successfully.
Dec 06 08:15:10 np0005548788.localdomain podman[55389]: 2025-12-06 08:15:10.220472365 +0000 UTC m=+0.056486858 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step1)
Dec 06 08:15:10 np0005548788.localdomain podman[55389]: 2025-12-06 08:15:10.402369411 +0000 UTC m=+0.238383844 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, managed_by=tripleo_ansible)
Dec 06 08:15:10 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:15:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:15:41 np0005548788.localdomain podman[55417]: 2025-12-06 08:15:41.257618945 +0000 UTC m=+0.082345844 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 06 08:15:41 np0005548788.localdomain podman[55417]: 2025-12-06 08:15:41.4908811 +0000 UTC m=+0.315608049 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:15:41 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:16:02 np0005548788.localdomain sshd[55447]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:16:04 np0005548788.localdomain sudo[55449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:16:04 np0005548788.localdomain sudo[55449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:16:04 np0005548788.localdomain sudo[55449]: pam_unix(sudo:session): session closed for user root
Dec 06 08:16:04 np0005548788.localdomain sudo[55464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:16:04 np0005548788.localdomain sudo[55464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:16:05 np0005548788.localdomain sudo[55464]: pam_unix(sudo:session): session closed for user root
Dec 06 08:16:06 np0005548788.localdomain sudo[55510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:16:06 np0005548788.localdomain sudo[55510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:16:06 np0005548788.localdomain sudo[55510]: pam_unix(sudo:session): session closed for user root
Dec 06 08:16:09 np0005548788.localdomain sshd[55447]: Received disconnect from 45.78.222.109 port 39814:11: Bye Bye [preauth]
Dec 06 08:16:09 np0005548788.localdomain sshd[55447]: Disconnected from authenticating user root 45.78.222.109 port 39814 [preauth]
Dec 06 08:16:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:16:12 np0005548788.localdomain podman[55525]: 2025-12-06 08:16:12.259836011 +0000 UTC m=+0.084958541 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Dec 06 08:16:12 np0005548788.localdomain podman[55525]: 2025-12-06 08:16:12.448255394 +0000 UTC m=+0.273377924 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, 
version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:16:12 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:16:19 np0005548788.localdomain sshd[55554]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:16:28 np0005548788.localdomain sshd[55555]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:16:29 np0005548788.localdomain sshd[55554]: error: kex_exchange_identification: read: Connection timed out
Dec 06 08:16:29 np0005548788.localdomain sshd[55554]: banner exchange: Connection from 183.15.121.139 port 42642: Connection timed out
Dec 06 08:16:33 np0005548788.localdomain sshd[55555]: Received disconnect from 102.140.97.134 port 47810:11: Bye Bye [preauth]
Dec 06 08:16:33 np0005548788.localdomain sshd[55555]: Disconnected from authenticating user root 102.140.97.134 port 47810 [preauth]
Dec 06 08:16:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:16:43 np0005548788.localdomain podman[55557]: 2025-12-06 08:16:43.256612958 +0000 UTC m=+0.088339154 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 06 08:16:43 np0005548788.localdomain podman[55557]: 2025-12-06 08:16:43.48071138 +0000 UTC m=+0.312437576 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1)
Dec 06 08:16:43 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:16:46 np0005548788.localdomain sshd[55586]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:16:48 np0005548788.localdomain sshd[55586]: Received disconnect from 152.32.172.117 port 51618:11: Bye Bye [preauth]
Dec 06 08:16:48 np0005548788.localdomain sshd[55586]: Disconnected from authenticating user root 152.32.172.117 port 51618 [preauth]
Dec 06 08:17:06 np0005548788.localdomain sudo[55588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:17:06 np0005548788.localdomain sudo[55588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:17:06 np0005548788.localdomain sudo[55588]: pam_unix(sudo:session): session closed for user root
Dec 06 08:17:06 np0005548788.localdomain sudo[55603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:17:06 np0005548788.localdomain sudo[55603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:17:07 np0005548788.localdomain sudo[55603]: pam_unix(sudo:session): session closed for user root
Dec 06 08:17:07 np0005548788.localdomain sudo[55649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:17:07 np0005548788.localdomain sudo[55649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:17:07 np0005548788.localdomain sudo[55649]: pam_unix(sudo:session): session closed for user root
Dec 06 08:17:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:17:14 np0005548788.localdomain systemd[1]: tmp-crun.Gq406V.mount: Deactivated successfully.
Dec 06 08:17:14 np0005548788.localdomain podman[55665]: 2025-12-06 08:17:14.247340718 +0000 UTC m=+0.076386214 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, vendor=Red Hat, Inc., container_name=metrics_qdr, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:17:14 np0005548788.localdomain podman[55665]: 2025-12-06 08:17:14.434525443 +0000 UTC m=+0.263570929 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:17:14 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:17:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:17:45 np0005548788.localdomain systemd[1]: tmp-crun.AURhMG.mount: Deactivated successfully.
Dec 06 08:17:45 np0005548788.localdomain podman[55695]: 2025-12-06 08:17:45.261590598 +0000 UTC m=+0.089209272 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, managed_by=tripleo_ansible)
Dec 06 08:17:45 np0005548788.localdomain podman[55695]: 2025-12-06 08:17:45.452069995 +0000 UTC m=+0.279688709 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=metrics_qdr, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:17:45 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:18:07 np0005548788.localdomain sudo[55725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:18:07 np0005548788.localdomain sudo[55725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:07 np0005548788.localdomain sudo[55725]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:07 np0005548788.localdomain sudo[55740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:18:07 np0005548788.localdomain sudo[55740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:08 np0005548788.localdomain sudo[55740]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:08 np0005548788.localdomain sudo[55786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:18:08 np0005548788.localdomain sudo[55786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:08 np0005548788.localdomain sudo[55786]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:12 np0005548788.localdomain sshd[55801]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:18:14 np0005548788.localdomain sshd[55801]: Received disconnect from 152.32.172.117 port 37818:11: Bye Bye [preauth]
Dec 06 08:18:14 np0005548788.localdomain sshd[55801]: Disconnected from authenticating user root 152.32.172.117 port 37818 [preauth]
Dec 06 08:18:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:18:16 np0005548788.localdomain podman[55803]: 2025-12-06 08:18:16.268131689 +0000 UTC m=+0.097026470 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 06 08:18:16 np0005548788.localdomain podman[55803]: 2025-12-06 08:18:16.485866416 +0000 UTC m=+0.314761237 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 08:18:16 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:18:29 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 20 pg[2.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [3,1,5] r=2 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:31 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 22 pg[3.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [1,3,2] r=2 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:32 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 24 pg[4.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [5,0,1] r=0 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:33 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 25 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [5,0,1] r=0 lpr=24 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:34 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 25 pg[5.0( empty local-lis/les=0/0 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [4,5,3] r=1 lpr=25 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:37 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 29 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=29 pruub=15.216296196s) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 active pruub 1119.253540039s@ mbc={}] start_peering_interval up [3,1,5] -> [3,1,5], acting [3,1,5] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:37 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 29 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=29 pruub=15.213653564s) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1119.253540039s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.1f( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.1e( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.1b( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.1c( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.1d( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.9( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.a( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.8( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.4( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.5( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.3( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.1( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.7( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.6( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.b( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.c( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.d( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.f( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.11( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.10( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.13( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.14( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.15( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.e( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.17( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.16( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.19( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.18( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.12( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.1a( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 30 pg[2.2( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=2 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:39 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 31 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=31 pruub=10.208538055s) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active pruub 1115.679931641s@ mbc={}] start_peering_interval up [5,0,1] -> [5,0,1], acting [5,0,1] -> [5,0,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:39 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 31 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=31 pruub=10.208538055s) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1115.679931641s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:39 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 31 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=31 pruub=15.674138069s) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active pruub 1125.421508789s@ mbc={}] start_peering_interval up [1,3,2] -> [1,3,2], acting [1,3,2] -> [1,3,2], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:39 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 31 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=31 pruub=15.671662331s) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.421508789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1c( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1e( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.11( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.12( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.13( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.10( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.15( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.17( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.14( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.9( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.16( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.8( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.b( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.d( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.a( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.6( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1f( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.7( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.4( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.5( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.3( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.2( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.f( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.e( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.c( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1d( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1a( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.18( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.19( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1b( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.19( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.18( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.1a( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.17( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.15( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.16( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.14( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.13( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.12( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.10( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.11( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.f( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.1b( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.e( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.c( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.d( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.2( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.1( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.3( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.4( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.5( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.7( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.6( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.8( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.9( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.a( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.b( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.1c( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.1d( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.1e( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.0( empty local-lis/les=31/32 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 32 pg[3.1f( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=2 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.4( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.7( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.16( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.10( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.12( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.17( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.11( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 32 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=0 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:41 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 33 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=8.837900162s) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 1116.382202148s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:41 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 33 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=8.835554123s) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1116.382202148s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.1d( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.1e( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.1f( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.10( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.11( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.12( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.13( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.14( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.15( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.16( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.17( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.8( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.9( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.a( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.b( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.7( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.1( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.5( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.c( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.4( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.3( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.6( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.2( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.f( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.d( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.1c( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.1b( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.1a( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.19( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.18( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 34 pg[5.e( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:42 np0005548788.localdomain sshd[55834]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:18:45 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 35 pg[6.0( empty local-lis/les=0/0 n=0 ec=35/35 lis/c=0/0 les/c/f=0/0/0 sis=35) [5,0,1] r=0 lpr=35 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:46 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Dec 06 08:18:46 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Dec 06 08:18:46 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 36 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=0/0 les/c/f=0/0/0 sis=35) [5,0,1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:18:47 np0005548788.localdomain podman[55835]: 2025-12-06 08:18:47.242251346 +0000 UTC m=+0.071963821 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible)
Dec 06 08:18:47 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec 06 08:18:47 np0005548788.localdomain podman[55835]: 2025-12-06 08:18:47.423671174 +0000 UTC m=+0.253383649 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Dec 06 08:18:47 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec 06 08:18:47 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:18:48 np0005548788.localdomain sshd[55863]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:18:48 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 37 pg[7.0( empty local-lis/les=0/0 n=0 ec=37/37 lis/c=0/0 les/c/f=0/0/0 sis=37) [0,1,5] r=2 lpr=37 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:49 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec 06 08:18:49 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec 06 08:18:50 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec 06 08:18:50 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec 06 08:18:52 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.c scrub starts
Dec 06 08:18:52 np0005548788.localdomain sudo[55865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:18:52 np0005548788.localdomain sudo[55865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:52 np0005548788.localdomain sudo[55865]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:52 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.c scrub ok
Dec 06 08:18:52 np0005548788.localdomain sshd[55863]: Received disconnect from 102.140.97.134 port 42112:11: Bye Bye [preauth]
Dec 06 08:18:52 np0005548788.localdomain sshd[55863]: Disconnected from authenticating user root 102.140.97.134 port 42112 [preauth]
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.301354408s) [2,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.784912109s@ mbc={}] start_peering_interval up [1,3,2] -> [2,3,4], acting [1,3,2] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.301080704s) [1,0,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.784790039s@ mbc={}] start_peering_interval up [1,3,2] -> [1,0,2], acting [1,3,2] -> [1,0,2], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.301186562s) [0,5,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.784912109s@ mbc={}] start_peering_interval up [1,3,2] -> [0,5,1], acting [1,3,2] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.301354408s) [2,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1134.784912109s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.300980568s) [1,0,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.784790039s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.301116943s) [0,5,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.784912109s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.300901413s) [5,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.784912109s@ mbc={}] start_peering_interval up [1,3,2] -> [5,4,3], acting [1,3,2] -> [5,4,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.301057816s) [2,0,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.785278320s@ mbc={}] start_peering_interval up [1,3,2] -> [2,0,1], acting [1,3,2] -> [2,0,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.301057816s) [2,0,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1134.785278320s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.300449371s) [3,5,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.784790039s@ mbc={}] start_peering_interval up [1,3,2] -> [3,5,1], acting [1,3,2] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.300187111s) [3,5,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.784790039s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.299889565s) [4,3,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.784545898s@ mbc={}] start_peering_interval up [1,3,2] -> [4,3,2], acting [1,3,2] -> [4,3,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.297385216s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.782226562s@ mbc={}] start_peering_interval up [1,3,2] -> [4,5,0], acting [1,3,2] -> [4,5,0], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.299781799s) [4,3,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.784545898s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.297132492s) [2,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.782104492s@ mbc={}] start_peering_interval up [1,3,2] -> [2,1,3], acting [1,3,2] -> [2,1,3], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.297132492s) [2,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1134.782104492s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.297275543s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.782226562s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296927452s) [5,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.782104492s@ mbc={}] start_peering_interval up [1,3,2] -> [5,0,4], acting [1,3,2] -> [5,0,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296778679s) [3,5,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781982422s@ mbc={}] start_peering_interval up [1,3,2] -> [3,5,4], acting [1,3,2] -> [3,5,4], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296830177s) [5,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.782104492s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296675682s) [3,5,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.781982422s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296619415s) [2,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.782104492s@ mbc={}] start_peering_interval up [1,3,2] -> [2,1,3], acting [1,3,2] -> [2,1,3], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296415329s) [0,2,4] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781738281s@ mbc={}] start_peering_interval up [1,3,2] -> [0,2,4], acting [1,3,2] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296619415s) [2,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1134.782104492s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296368599s) [0,2,4] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.781738281s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.300225258s) [5,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.784912109s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295962334s) [2,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781616211s@ mbc={}] start_peering_interval up [1,3,2] -> [2,3,4], acting [1,3,2] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296040535s) [0,5,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781982422s@ mbc={}] start_peering_interval up [1,3,2] -> [0,5,1], acting [1,3,2] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.296003342s) [0,5,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.781982422s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295962334s) [2,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1134.781616211s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295612335s) [5,4,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781494141s@ mbc={}] start_peering_interval up [1,3,2] -> [5,4,0], acting [1,3,2] -> [5,4,0], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295581818s) [5,4,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.781494141s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295733452s) [3,1,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781616211s@ mbc={}] start_peering_interval up [1,3,2] -> [3,1,2], acting [1,3,2] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295584679s) [3,1,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.781616211s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295334816s) [0,1,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781494141s@ mbc={}] start_peering_interval up [1,3,2] -> [0,1,2], acting [1,3,2] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295229912s) [4,0,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781372070s@ mbc={}] start_peering_interval up [1,3,2] -> [4,0,2], acting [1,3,2] -> [4,0,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295290947s) [0,1,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.781494141s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295205116s) [4,0,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.781372070s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295529366s) [3,1,5] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.781738281s@ mbc={}] start_peering_interval up [1,3,2] -> [3,1,5], acting [1,3,2] -> [3,1,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295481682s) [3,1,5] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.781738281s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.293224335s) [3,2,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779663086s@ mbc={}] start_peering_interval up [1,3,2] -> [3,2,1], acting [1,3,2] -> [3,2,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.293202400s) [3,2,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.779663086s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.293354034s) [3,4,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779785156s@ mbc={}] start_peering_interval up [1,3,2] -> [3,4,2], acting [1,3,2] -> [3,4,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.293271065s) [3,4,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.779785156s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.293018341s) [2,3,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779541016s@ mbc={}] start_peering_interval up [1,3,2] -> [2,3,1], acting [1,3,2] -> [2,3,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.292901993s) [1,3,5] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779418945s@ mbc={}] start_peering_interval up [1,3,2] -> [1,3,5], acting [1,3,2] -> [1,3,5], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.291307449s) [3,2,4] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779418945s@ mbc={}] start_peering_interval up [1,3,2] -> [3,2,4], acting [1,3,2] -> [3,2,4], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.291636467s) [5,3,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779785156s@ mbc={}] start_peering_interval up [1,3,2] -> [5,3,1], acting [1,3,2] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.291878700s) [1,3,5] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.779418945s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.291243553s) [3,2,4] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.779418945s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.291582108s) [5,3,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.779785156s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.290897369s) [5,1,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779418945s@ mbc={}] start_peering_interval up [1,3,2] -> [5,1,3], acting [1,3,2] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.290733337s) [5,0,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779296875s@ mbc={}] start_peering_interval up [1,3,2] -> [5,0,1], acting [1,3,2] -> [5,0,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.290689468s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.779296875s@ mbc={}] start_peering_interval up [1,3,2] -> [4,5,0], acting [1,3,2] -> [4,5,0], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.290634155s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.779296875s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.290616035s) [5,0,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.779296875s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.290736198s) [5,1,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.779418945s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295738220s) [4,0,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1134.784912109s@ mbc={}] start_peering_interval up [1,3,2] -> [4,0,2], acting [1,3,2] -> [4,0,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.295690536s) [4,0,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1134.784912109s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.293018341s) [2,3,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1134.779541016s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,4,0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,0,4] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.a( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,1,3] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.e( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,4,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.6( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,0,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.1( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,4,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334831238s) [0,1,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568847656s@ mbc={}] start_peering_interval up [4,5,3] -> [0,1,5], acting [4,5,3] -> [0,1,5], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,4,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,3,1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334668159s) [0,1,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568847656s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334175110s) [0,5,4] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568847656s@ mbc={}] start_peering_interval up [4,5,3] -> [0,5,4], acting [4,5,3] -> [0,5,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.332857132s) [4,0,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567749023s@ mbc={}] start_peering_interval up [4,5,3] -> [4,0,2], acting [4,5,3] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.333668709s) [0,5,1] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568603516s@ mbc={}] start_peering_interval up [4,5,3] -> [0,5,1], acting [4,5,3] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.333612442s) [0,5,1] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568603516s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.333977699s) [0,5,4] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568847656s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.333230019s) [0,1,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568359375s@ mbc={}] start_peering_interval up [4,5,3] -> [0,1,2], acting [4,5,3] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.332760811s) [4,0,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.567749023s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.333208084s) [0,1,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568359375s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.8( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.246846199s) [0,1,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482055664s@ mbc={}] start_peering_interval up [3,1,5] -> [0,1,2], acting [3,1,5] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.8( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.246791840s) [0,1,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482055664s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.2( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,4,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.332561493s) [5,3,1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568481445s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.332561493s) [5,3,1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1132.568481445s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,0,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,3,1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.330751419s) [5,0,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567749023s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.330751419s) [5,0,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1132.567749023s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.330706596s) [5,1,0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567749023s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,0], acting [4,5,3] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.330706596s) [5,1,0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1132.567749023s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.11( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238343239s) [0,2,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.475952148s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.12( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238075256s) [4,2,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.475708008s@ mbc={}] start_peering_interval up [3,1,5] -> [4,2,3], acting [3,1,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.11( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238303185s) [0,2,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.475952148s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.12( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238034248s) [4,2,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.475708008s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.330038071s) [3,2,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567871094s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.330002785s) [3,2,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.567871094s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.14( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.237791061s) [3,4,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.475708008s@ mbc={}] start_peering_interval up [3,1,5] -> [3,4,2], acting [3,1,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.14( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.237761497s) [3,4,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.475708008s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.16( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.244288445s) [0,1,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482299805s@ mbc={}] start_peering_interval up [3,1,5] -> [0,1,2], acting [3,1,5] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.16( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.244262695s) [0,1,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482299805s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.329728127s) [3,4,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567871094s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,5], acting [4,5,3] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.329699516s) [3,4,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.567871094s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,3,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,1,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.328671455s) [0,4,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567993164s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,5], acting [4,5,3] -> [0,4,5], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.329221725s) [3,4,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568481445s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.18( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.243102074s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482421875s@ mbc={}] start_peering_interval up [3,1,5] -> [4,3,2], acting [3,1,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.328587532s) [0,4,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.567993164s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.329167366s) [3,4,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568481445s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.18( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.243017197s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482421875s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.19( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,0,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.328026772s) [5,3,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568359375s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,4], acting [4,5,3] -> [5,3,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.328026772s) [5,3,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1132.568359375s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.1e( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.327738762s) [5,0,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.569091797s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.17( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240991592s) [3,1,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482177734s@ mbc={}] start_peering_interval up [3,1,5] -> [3,1,2], acting [3,1,5] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.326543808s) [3,5,1] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567871094s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,1], acting [4,5,3] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.17( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240945816s) [3,1,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482177734s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.326490402s) [3,5,1] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.567871094s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.327738762s) [5,0,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1132.569091797s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1a( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240779877s) [3,4,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482421875s@ mbc={}] start_peering_interval up [3,1,5] -> [3,4,2], acting [3,1,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272908211s) [1,2,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.514648438s@ mbc={}] start_peering_interval up [5,0,1] -> [1,2,3], acting [5,0,1] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1a( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240687370s) [3,4,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482421875s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.19( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240411758s) [2,0,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482421875s@ mbc={}] start_peering_interval up [3,1,5] -> [2,0,4], acting [3,1,5] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272827148s) [1,2,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.514648438s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.19( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240350723s) [2,0,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482421875s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.3( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.234210014s) [3,2,1] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476440430s@ mbc={}] start_peering_interval up [3,1,5] -> [3,2,1], acting [3,1,5] -> [3,2,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.3( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.234152794s) [3,2,1] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476440430s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272211075s) [2,0,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.514648438s@ mbc={}] start_peering_interval up [5,0,1] -> [2,0,1], acting [5,0,1] -> [2,0,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272062302s) [4,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.514038086s@ mbc={}] start_peering_interval up [5,0,1] -> [4,3,2], acting [5,0,1] -> [4,3,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.7( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.233971596s) [3,1,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476440430s@ mbc={}] start_peering_interval up [3,1,5] -> [3,1,2], acting [3,1,5] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.7( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.233916283s) [3,1,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476440430s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272160530s) [2,0,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.514648438s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.11( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271885872s) [2,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.514648438s@ mbc={}] start_peering_interval up [5,0,1] -> [2,0,4], acting [5,0,1] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.7( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.11( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271786690s) [2,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.514648438s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.b( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.10( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271209717s) [5,4,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.514038086s@ mbc={}] start_peering_interval up [5,0,1] -> [5,4,3], acting [5,0,1] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271591187s) [4,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.514038086s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.10( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271209717s) [5,4,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1130.514038086s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.15( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.231940269s) [1,2,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.475830078s@ mbc={}] start_peering_interval up [3,1,5] -> [1,2,3], acting [3,1,5] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.324471474s) [1,3,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568237305s@ mbc={}] start_peering_interval up [4,5,3] -> [1,3,5], acting [4,5,3] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.324370384s) [1,3,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568237305s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.274140358s) [0,1,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.518188477s@ mbc={}] start_peering_interval up [5,0,1] -> [0,1,5], acting [5,0,1] -> [0,1,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.12( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270014763s) [5,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.514038086s@ mbc={}] start_peering_interval up [5,0,1] -> [5,1,3], acting [5,0,1] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.12( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270014763s) [5,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1130.514038086s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.274003029s) [0,1,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.518188477s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.323934555s) [2,4,0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568237305s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.323894501s) [2,4,0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568237305s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.13( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.232738495s) [1,0,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.477172852s@ mbc={}] start_peering_interval up [3,1,5] -> [1,0,2], acting [3,1,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.13( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.232697487s) [1,0,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.477172852s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.268421173s) [4,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512939453s@ mbc={}] start_peering_interval up [5,0,1] -> [4,2,0], acting [5,0,1] -> [4,2,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.7( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.325380325s) [3,5,4] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568359375s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.268368721s) [4,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512939453s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.7( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.323727608s) [3,5,4] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568359375s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.15( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.231884003s) [1,2,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.475830078s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.267865181s) [4,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512939453s@ mbc={}] start_peering_interval up [5,0,1] -> [4,2,0], acting [5,0,1] -> [4,2,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.17( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.269606590s) [5,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.514648438s@ mbc={}] start_peering_interval up [5,0,1] -> [5,3,4], acting [5,0,1] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.17( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.269606590s) [5,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1130.514648438s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.267746925s) [4,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512939453s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.10( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230931282s) [4,5,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476196289s@ mbc={}] start_peering_interval up [3,1,5] -> [4,5,3], acting [3,1,5] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.10( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230871201s) [4,5,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476196289s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.323095322s) [1,2,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568481445s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,3], acting [4,5,3] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.f( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,4,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.323056221s) [1,2,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568481445s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.f( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230310440s) [4,0,5] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.475952148s@ mbc={}] start_peering_interval up [3,1,5] -> [4,0,5], acting [3,1,5] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.322584152s) [0,4,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568237305s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.f( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230264664s) [4,0,5] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.475952148s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.322444916s) [0,4,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568237305s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.4( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230897903s) [5,1,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476806641s@ mbc={}] start_peering_interval up [3,1,5] -> [5,1,3], acting [3,1,5] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.267156601s) [1,5,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512939453s@ mbc={}] start_peering_interval up [5,0,1] -> [1,5,0], acting [5,0,1] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.4( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230897903s) [5,1,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.476806641s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.e( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229878426s) [2,4,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.475952148s@ mbc={}] start_peering_interval up [3,1,5] -> [2,4,3], acting [3,1,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.266829491s) [1,5,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512939453s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.322672844s) [2,1,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568847656s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,3], acting [4,5,3] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.266205788s) [1,3,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512573242s@ mbc={}] start_peering_interval up [5,0,1] -> [1,3,5], acting [5,0,1] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.322636604s) [2,1,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568847656s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.266139030s) [1,3,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512573242s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.322407722s) [3,5,1] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568725586s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,1], acting [4,5,3] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.322353363s) [3,5,1] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568725586s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.d( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229597092s) [1,0,5] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476196289s@ mbc={}] start_peering_interval up [3,1,5] -> [1,0,5], acting [3,1,5] -> [1,0,5], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.d( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229560852s) [1,0,5] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476196289s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271414757s) [2,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.518066406s@ mbc={}] start_peering_interval up [5,0,1] -> [2,0,4], acting [5,0,1] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.2( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.235354424s) [3,5,4] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482177734s@ mbc={}] start_peering_interval up [3,1,5] -> [3,5,4], acting [3,1,5] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.2( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.235304832s) [3,5,4] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482177734s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271357536s) [2,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.518066406s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.265398026s) [0,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512329102s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,1], acting [5,0,1] -> [0,5,1], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.265315056s) [0,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512329102s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.e( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229838371s) [2,4,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.475952148s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.b( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229169846s) [1,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476440430s@ mbc={}] start_peering_interval up [3,1,5] -> [1,3,2], acting [3,1,5] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264986038s) [3,1,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512207031s@ mbc={}] start_peering_interval up [5,0,1] -> [3,1,5], acting [5,0,1] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.b( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229111671s) [1,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476440430s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.265299797s) [4,5,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512695312s@ mbc={}] start_peering_interval up [5,0,1] -> [4,5,0], acting [5,0,1] -> [4,5,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264914513s) [3,1,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512207031s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.265254974s) [4,5,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512695312s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.6( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229081154s) [2,1,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476684570s@ mbc={}] start_peering_interval up [3,1,5] -> [2,1,3], acting [3,1,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.6( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229049683s) [2,1,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476684570s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.321266174s) [2,0,4] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568847656s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,4], acting [4,5,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.321172714s) [2,0,4] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568847656s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.c( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228097916s) [1,5,0] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476074219s@ mbc={}] start_peering_interval up [3,1,5] -> [1,5,0], acting [3,1,5] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.7( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264368057s) [2,1,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512329102s@ mbc={}] start_peering_interval up [5,0,1] -> [2,1,3], acting [5,0,1] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.c( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228000641s) [1,5,0] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476074219s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.7( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264307022s) [2,1,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512329102s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.4( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263778687s) [5,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511962891s@ mbc={}] start_peering_interval up [5,0,1] -> [5,1,3], acting [5,0,1] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.4( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263778687s) [5,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1130.511962891s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263220787s) [4,0,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511474609s@ mbc={}] start_peering_interval up [5,0,1] -> [4,0,2], acting [5,0,1] -> [4,0,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.9( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228552818s) [5,0,4] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.477050781s@ mbc={}] start_peering_interval up [3,1,5] -> [5,0,4], acting [3,1,5] -> [5,0,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228689194s) [2,4,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.477050781s@ mbc={}] start_peering_interval up [3,1,5] -> [2,4,3], acting [3,1,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264203072s) [3,1,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512695312s@ mbc={}] start_peering_interval up [5,0,1] -> [3,1,2], acting [5,0,1] -> [3,1,2], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.319213867s) [1,2,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567749023s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,3], acting [4,5,3] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.9( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228552818s) [5,0,4] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.477050781s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263113022s) [4,0,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.511474609s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264139175s) [3,1,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512695312s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228513718s) [2,4,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.477050781s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.319149017s) [1,2,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.567749023s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1e( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.233473778s) [5,1,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482177734s@ mbc={}] start_peering_interval up [3,1,5] -> [5,1,3], acting [3,1,5] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1e( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.233473778s) [5,1,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.482177734s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.319754601s) [2,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568603516s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263085365s) [1,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511962891s@ mbc={}] start_peering_interval up [5,0,1] -> [1,3,2], acting [5,0,1] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.5( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227906227s) [1,2,0] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476806641s@ mbc={}] start_peering_interval up [3,1,5] -> [1,2,0], acting [3,1,5] -> [1,2,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.319692612s) [2,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568603516s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.5( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227868080s) [1,2,0] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476806641s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263104439s) [1,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512084961s@ mbc={}] start_peering_interval up [5,0,1] -> [1,3,2], acting [5,0,1] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263012886s) [1,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512084961s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263379097s) [3,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512695312s@ mbc={}] start_peering_interval up [5,0,1] -> [3,5,1], acting [5,0,1] -> [3,5,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,3,4] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263330460s) [3,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512695312s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.262600899s) [1,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.511962891s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.319359779s) [1,0,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568847656s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,5], acting [4,5,3] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.319282532s) [1,0,5] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568847656s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.a( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227091789s) [1,5,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.476928711s@ mbc={}] start_peering_interval up [3,1,5] -> [1,5,3], acting [3,1,5] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1b( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227510452s) [1,0,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.477416992s@ mbc={}] start_peering_interval up [3,1,5] -> [1,0,2], acting [3,1,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.a( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.226990700s) [1,5,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.476928711s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1b( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227369308s) [1,0,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.477416992s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1d( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.232427597s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482421875s@ mbc={}] start_peering_interval up [3,1,5] -> [4,3,2], acting [3,1,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1d( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.232397079s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482421875s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1c( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.232213974s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482299805s@ mbc={}] start_peering_interval up [3,1,5] -> [4,3,2], acting [3,1,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1c( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.232153893s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482299805s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.318252563s) [0,4,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.568603516s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260832787s) [0,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511352539s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,1], acting [5,0,1] -> [0,5,1], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.261752129s) [2,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.512207031s@ mbc={}] start_peering_interval up [5,0,1] -> [2,4,3], acting [5,0,1] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.318101883s) [0,4,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.568603516s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260647774s) [0,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.511352539s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.261548996s) [2,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.512207031s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.316751480s) [2,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1132.567749023s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260300636s) [3,2,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511352539s@ mbc={}] start_peering_interval up [5,0,1] -> [3,2,4], acting [5,0,1] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.316720009s) [2,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1132.567749023s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260234833s) [3,2,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.511352539s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260169983s) [4,3,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511474609s@ mbc={}] start_peering_interval up [5,0,1] -> [4,3,5], acting [5,0,1] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260125160s) [4,3,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.511474609s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259753227s) [3,2,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511230469s@ mbc={}] start_peering_interval up [5,0,1] -> [3,2,1], acting [5,0,1] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1f( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230656624s) [2,3,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1128.482177734s@ mbc={}] start_peering_interval up [3,1,5] -> [2,3,4], acting [3,1,5] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259681702s) [3,2,1] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.511230469s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259551048s) [1,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511230469s@ mbc={}] start_peering_interval up [5,0,1] -> [1,2,0], acting [5,0,1] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[2.1f( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230626106s) [2,3,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.482177734s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259629250s) [3,4,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1130.511352539s@ mbc={}] start_peering_interval up [5,0,1] -> [3,4,2], acting [5,0,1] -> [3,4,2], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259535789s) [3,4,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.511352539s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.258985519s) [1,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1130.511230469s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec 06 08:18:53 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [3,2,1] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.10( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,1,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.14( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [3,4,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [3,1,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.1f( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,3,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.1f( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [3,4,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.1a( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [3,4,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,0,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.15( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,2,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.12( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,2,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.14( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,2,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,5,4] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [0,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [3,2,1] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [3,1,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.5( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,1,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.1c( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [0,4,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [0,4,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,3,5] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,4,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,2,4] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,2,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [0,1,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [0,1,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,5,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [0,1,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.1a( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,5,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [5,3,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [0,2,4] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 41 pg[3.1c( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [0,5,1] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.19( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,2,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,4,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.3( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,3,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[4.17( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,2,0] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.2( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,3,2] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,0,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[2.4( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [5,1,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,2,3] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,3,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,2,3] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[2.1e( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [5,1,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,4,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,0,2] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,2,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,2,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,0,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[4.b( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[2.19( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,0,4] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,4,0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[2.6( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,1,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,3,1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,4,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,3,1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[2.1( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,4,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,1,3] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[2.1f( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,3,4] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[2.e( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,4,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[3.2( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,4,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,0,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [5,0,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [5,0,4] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[2.9( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [5,0,4] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,3,4] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[4.1e( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[3.4( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,3,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 42 pg[3.b( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [5,3,1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,3,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,1,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[3.19( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,0,1] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 42 pg[5.17( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [5,1,0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548788.localdomain sudo[55880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:18:54 np0005548788.localdomain sudo[55880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:54 np0005548788.localdomain sudo[55880]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:55 np0005548788.localdomain sudo[55895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:18:55 np0005548788.localdomain sudo[55895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:55 np0005548788.localdomain sudo[55895]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:55 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec 06 08:18:58 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec 06 08:18:59 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec 06 08:18:59 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec 06 08:19:02 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec 06 08:19:02 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec 06 08:19:03 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec 06 08:19:03 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec 06 08:19:03 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec 06 08:19:03 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec 06 08:19:04 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 06 08:19:04 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 06 08:19:05 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec 06 08:19:05 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec 06 08:19:05 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 06 08:19:05 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 06 08:19:06 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec 06 08:19:06 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec 06 08:19:06 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec 06 08:19:06 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec 06 08:19:08 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.12 deep-scrub starts
Dec 06 08:19:08 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 3.12 deep-scrub ok
Dec 06 08:19:09 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Dec 06 08:19:10 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Dec 06 08:19:10 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Dec 06 08:19:10 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Dec 06 08:19:11 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.4 deep-scrub starts
Dec 06 08:19:11 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 4.4 deep-scrub ok
Dec 06 08:19:11 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 06 08:19:12 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 06 08:19:12 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec 06 08:19:12 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec 06 08:19:15 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec 06 08:19:15 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec 06 08:19:16 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec 06 08:19:16 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec 06 08:19:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:19:18 np0005548788.localdomain podman[55910]: 2025-12-06 08:19:18.238414291 +0000 UTC m=+0.072067054 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:19:18 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Dec 06 08:19:18 np0005548788.localdomain podman[55910]: 2025-12-06 08:19:18.442436853 +0000 UTC m=+0.276089596 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:19:18 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:19:18 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Dec 06 08:19:20 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.14 deep-scrub starts
Dec 06 08:19:20 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.14 deep-scrub ok
Dec 06 08:19:22 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Dec 06 08:19:22 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Dec 06 08:19:22 np0005548788.localdomain sudo[55952]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oznwobdujbkqkhgxigsjlwpyikmpejrn ; /usr/bin/python3
Dec 06 08:19:22 np0005548788.localdomain sudo[55952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:22 np0005548788.localdomain python3[55954]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:22 np0005548788.localdomain sudo[55952]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:23 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec 06 08:19:23 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec 06 08:19:24 np0005548788.localdomain sudo[55968]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiidoreectodfyhczniukeotiijtemtx ; /usr/bin/python3
Dec 06 08:19:24 np0005548788.localdomain sudo[55968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:24 np0005548788.localdomain python3[55970]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:24 np0005548788.localdomain sudo[55968]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:25 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 06 08:19:25 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec 06 08:19:25 np0005548788.localdomain sudo[55984]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nctqeomkvymgaetccmhdfdqrqqqlwpwx ; /usr/bin/python3
Dec 06 08:19:25 np0005548788.localdomain sudo[55984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:26 np0005548788.localdomain python3[55986]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:26 np0005548788.localdomain sudo[55984]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:26 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 06 08:19:26 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 06 08:19:26 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec 06 08:19:26 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec 06 08:19:27 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 06 08:19:27 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 06 08:19:28 np0005548788.localdomain sudo[56032]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rywahlffdwxkcbtbbhumfxbgitrvvcip ; /usr/bin/python3
Dec 06 08:19:28 np0005548788.localdomain sudo[56032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:28 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec 06 08:19:28 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec 06 08:19:28 np0005548788.localdomain python3[56034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:28 np0005548788.localdomain sudo[56032]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:28 np0005548788.localdomain sudo[56075]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxxuljvxwkicqxjectnzzwcemzduxguu ; /usr/bin/python3
Dec 06 08:19:28 np0005548788.localdomain sudo[56075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:28 np0005548788.localdomain python3[56077]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009167.9104047-92087-166284313837831/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=9d631b6552ddeaa0e75a39b18f2bdb583e0e85e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:28 np0005548788.localdomain sudo[56075]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:30 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.e deep-scrub starts
Dec 06 08:19:30 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.e deep-scrub ok
Dec 06 08:19:31 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Dec 06 08:19:31 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Dec 06 08:19:31 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec 06 08:19:31 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec 06 08:19:32 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Dec 06 08:19:32 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Dec 06 08:19:33 np0005548788.localdomain sudo[56137]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywaryafddjkclafopswunearrqowucsy ; /usr/bin/python3
Dec 06 08:19:33 np0005548788.localdomain sudo[56137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:33 np0005548788.localdomain python3[56139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:33 np0005548788.localdomain sudo[56137]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:34 np0005548788.localdomain sudo[56180]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhcoevttkhymqubblfgexwckoyguiaub ; /usr/bin/python3
Dec 06 08:19:34 np0005548788.localdomain sudo[56180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:34 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 06 08:19:34 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec 06 08:19:34 np0005548788.localdomain python3[56182]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009173.434891-92087-4755061017757/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=04fcaa63c42fa3b2b702e4421ebc774041538ebd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:34 np0005548788.localdomain sudo[56180]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:35 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec 06 08:19:35 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[31731]: osd.2 43 crush map has features 432629239337189376, adjusting msgr requires for clients
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[31731]: osd.2 43 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[31731]: osd.2 43 crush map has features 3314933000854323200, adjusting msgr requires for osds
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 43 pg[2.1f( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=13.512114525s) [5,3,4] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1180.524658203s@ mbc={}] start_peering_interval up [2,3,4] -> [5,3,4], acting [2,3,4] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 43 pg[2.1f( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=13.512019157s) [5,3,4] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1180.524658203s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[32690]: osd.5 43 crush map has features 432629239337189376, adjusting msgr requires for clients
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[32690]: osd.5 43 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[32690]: osd.5 43 crush map has features 3314933000854323200, adjusting msgr requires for osds
Dec 06 08:19:36 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 43 pg[2.1f( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,3,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:37 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 44 pg[2.1f( empty local-lis/les=43/44 n=0 ec=29/20 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,3,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:38 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 06 08:19:38 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 06 08:19:39 np0005548788.localdomain sudo[56242]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsuzdudydmmnpisfjamlfhklusbgutkb ; /usr/bin/python3
Dec 06 08:19:39 np0005548788.localdomain sudo[56242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:39 np0005548788.localdomain python3[56244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:39 np0005548788.localdomain sudo[56242]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:39 np0005548788.localdomain sudo[56285]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imwqiwuejvrnsssqecdmsluaqrafdsri ; /usr/bin/python3
Dec 06 08:19:39 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Dec 06 08:19:39 np0005548788.localdomain sudo[56285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:39 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Dec 06 08:19:39 np0005548788.localdomain python3[56287]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009178.9093993-92087-160504690405131/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=0cb3e740065655621c29366f25db5e0ef0002cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:39 np0005548788.localdomain sudo[56285]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:41 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 46 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=46 pruub=9.469113350s) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active pruub 1176.900878906s@ mbc={}] start_peering_interval up [5,0,1] -> [5,0,1], acting [5,0,1] -> [5,0,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:41 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 46 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=46 pruub=9.469113350s) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1176.900878906s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.16( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.5( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.17( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.c( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1d( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1e( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.12( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1c( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.13( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.11( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.10( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.15( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.14( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.b( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.9( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.a( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.8( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.4( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.f( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.3( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.2( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.6( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.7( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.d( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.e( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1f( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.18( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.19( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1b( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1a( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.0( empty local-lis/les=46/47 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.19( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.7( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.4( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.9( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.18( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.6( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.3( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.5( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.12( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.11( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.14( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.1a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.16( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.13( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:42 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 47 pg[6.10( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=0 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:43 np0005548788.localdomain sshd[56302]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:19:43 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 48 pg[7.0( v 40'39 (0'0,40'39] local-lis/les=37/38 n=22 ec=37/37 lis/c=37/37 les/c/f=38/38/0 sis=48 pruub=9.495718956s) [0,1,5] r=2 lpr=48 pi=[37,48)/1 luod=0'0 lua=40'37 crt=40'39 lcod 40'38 mlcod 0'0 active pruub 1178.959716797s@ mbc={}] start_peering_interval up [0,1,5] -> [0,1,5], acting [0,1,5] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:43 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 48 pg[7.0( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=37/37 lis/c=37/37 les/c/f=38/38/0 sis=48 pruub=9.493415833s) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 lcod 40'38 mlcod 0'0 unknown NOTIFY pruub 1178.959716797s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.4( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.d( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.b( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.8( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.9( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.e( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.a( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.5( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.3( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.2( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.c( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.f( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.7( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 49 pg[7.1( v 40'39 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=2 lpr=48 pi=[37,48)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548788.localdomain sshd[56302]: Received disconnect from 152.32.172.117 port 41522:11: Bye Bye [preauth]
Dec 06 08:19:44 np0005548788.localdomain sshd[56302]: Disconnected from authenticating user root 152.32.172.117 port 41522 [preauth]
Dec 06 08:19:45 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.6 deep-scrub starts
Dec 06 08:19:45 np0005548788.localdomain sudo[56349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqailqnzyyvdodhcsxhxcguvpyyofove ; /usr/bin/python3
Dec 06 08:19:45 np0005548788.localdomain sudo[56349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:45 np0005548788.localdomain python3[56351]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:45 np0005548788.localdomain sudo[56349]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:46 np0005548788.localdomain sudo[56394]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhhkltojfvuobuapeqakugjidcrfyosv ; /usr/bin/python3
Dec 06 08:19:46 np0005548788.localdomain sudo[56394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:46 np0005548788.localdomain python3[56396]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009185.5291116-92410-228909177088674/source _original_basename=tmpaqhr4mqv follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:46 np0005548788.localdomain sudo[56394]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:46 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec 06 08:19:46 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec 06 08:19:47 np0005548788.localdomain sudo[56456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqnoqduilaknnrwgairkwrxjiydpwhou ; /usr/bin/python3
Dec 06 08:19:47 np0005548788.localdomain sudo[56456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:47 np0005548788.localdomain python3[56458]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:47 np0005548788.localdomain sudo[56456]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:47 np0005548788.localdomain sudo[56499]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsurxwhhobtzuegpjslischjkbznuvua ; /usr/bin/python3
Dec 06 08:19:47 np0005548788.localdomain sudo[56499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:47 np0005548788.localdomain python3[56501]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009187.1429195-92496-37516194696630/source _original_basename=tmpltco37or follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:47 np0005548788.localdomain sudo[56499]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:48 np0005548788.localdomain sudo[56529]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fimnjbhtpszkjhzsebyrqwhzwblnwwqh ; /usr/bin/python3
Dec 06 08:19:48 np0005548788.localdomain sudo[56529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:48 np0005548788.localdomain python3[56531]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Dec 06 08:19:48 np0005548788.localdomain crontab[56532]: (root) LIST (root)
Dec 06 08:19:48 np0005548788.localdomain crontab[56533]: (root) REPLACE (root)
Dec 06 08:19:48 np0005548788.localdomain sudo[56529]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:48 np0005548788.localdomain sudo[56547]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wripfysiauzksjglwkveggecpleymoag ; /usr/bin/python3
Dec 06 08:19:48 np0005548788.localdomain sudo[56547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:19:48 np0005548788.localdomain systemd[1]: tmp-crun.2KNrLP.mount: Deactivated successfully.
Dec 06 08:19:48 np0005548788.localdomain podman[56550]: 2025-12-06 08:19:48.6683705 +0000 UTC m=+0.080975385 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:19:48 np0005548788.localdomain python3[56549]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:19:48 np0005548788.localdomain sudo[56547]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:48 np0005548788.localdomain podman[56550]: 2025-12-06 08:19:48.889707136 +0000 UTC m=+0.302311991 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Dec 06 08:19:48 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:19:49 np0005548788.localdomain sudo[56627]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcwvnkiesphivdzdqnrjfrdxqxoqxkla ; /usr/bin/python3
Dec 06 08:19:49 np0005548788.localdomain sudo[56627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:49 np0005548788.localdomain sudo[56627]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.13( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,4,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.10( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,1,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.b( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,3,1] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.6( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,3,1] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.c( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,0,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.18( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013180733s) [5,3,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463867188s@ mbc={}] start_peering_interval up [5,0,1] -> [5,3,4], acting [5,0,1] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.18( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013180733s) [5,3,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.463867188s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011224747s) [5,4,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463012695s@ mbc={}] start_peering_interval up [5,0,1] -> [5,4,0], acting [5,0,1] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.017636299s) [3,4,5] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468505859s@ mbc={}] start_peering_interval up [5,0,1] -> [3,4,5], acting [5,0,1] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.19( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010860443s) [0,5,1] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463012695s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,1], acting [5,0,1] -> [0,5,1], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010766029s) [3,2,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463134766s@ mbc={}] start_peering_interval up [5,0,1] -> [3,2,4], acting [5,0,1] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010684013s) [3,2,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.463134766s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.015999794s) [3,4,5] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.468505859s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.19( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010604858s) [0,5,1] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.463012695s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.015019417s) [1,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.467773438s@ mbc={}] start_peering_interval up [5,0,1] -> [1,0,2], acting [5,0,1] -> [1,0,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011572838s) [1,3,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463867188s@ mbc={}] start_peering_interval up [5,0,1] -> [1,3,2], acting [5,0,1] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.014948845s) [1,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.467773438s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.7( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010258675s) [3,5,4] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463134766s@ mbc={}] start_peering_interval up [5,0,1] -> [3,5,4], acting [5,0,1] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.6( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010487556s) [2,3,1] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463989258s@ mbc={}] start_peering_interval up [5,0,1] -> [2,3,1], acting [5,0,1] -> [2,3,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010821342s) [1,3,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.463867188s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.2( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013737679s) [3,5,4] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.467285156s@ mbc={}] start_peering_interval up [5,0,1] -> [3,5,4], acting [5,0,1] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.7( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.009961128s) [3,5,4] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.463134766s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.2( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013685226s) [3,5,4] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.467285156s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.6( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010320663s) [2,3,1] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.463989258s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.8( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.009988785s) [3,1,5] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.464111328s@ mbc={}] start_peering_interval up [5,0,1] -> [3,1,5], acting [5,0,1] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011224747s) [5,4,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.463012695s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.8( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.009846687s) [3,1,5] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.464111328s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013271332s) [5,4,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.467407227s@ mbc={}] start_peering_interval up [5,0,1] -> [5,4,0], acting [5,0,1] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.9( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.008986473s) [5,1,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463500977s@ mbc={}] start_peering_interval up [5,0,1] -> [5,1,0], acting [5,0,1] -> [5,1,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013782501s) [0,5,4] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468383789s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,4], acting [5,0,1] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.9( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.008986473s) [5,1,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.463500977s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013739586s) [0,5,4] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.468383789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.008440971s) [2,3,1] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463378906s@ mbc={}] start_peering_interval up [5,0,1] -> [2,3,1], acting [5,0,1] -> [2,3,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.14( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013347626s) [5,3,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468261719s@ mbc={}] start_peering_interval up [5,0,1] -> [5,3,4], acting [5,0,1] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.14( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013347626s) [5,3,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.468261719s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.15( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013157845s) [3,4,5] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468261719s@ mbc={}] start_peering_interval up [5,0,1] -> [3,4,5], acting [5,0,1] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.15( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013128281s) [3,4,5] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.468261719s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.008297920s) [2,3,1] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.463378906s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.10( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013226509s) [2,1,3] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468383789s@ mbc={}] start_peering_interval up [5,0,1] -> [2,1,3], acting [5,0,1] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.11( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.012829781s) [5,1,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468139648s@ mbc={}] start_peering_interval up [5,0,1] -> [5,1,3], acting [5,0,1] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013271332s) [5,4,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.467407227s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.12( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011894226s) [4,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.467407227s@ mbc={}] start_peering_interval up [5,0,1] -> [4,0,2], acting [5,0,1] -> [4,0,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.12( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011810303s) [4,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.467407227s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.10( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.012916565s) [2,1,3] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.468383789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.13( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.013003349s) [2,4,3] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468750000s@ mbc={}] start_peering_interval up [5,0,1] -> [2,4,3], acting [5,0,1] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.13( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.012869835s) [2,4,3] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.468750000s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011691093s) [1,5,3] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.467773438s@ mbc={}] start_peering_interval up [5,0,1] -> [1,5,3], acting [5,0,1] -> [1,5,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011518478s) [1,5,3] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.467773438s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.11( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.012829781s) [5,1,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.468139648s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011713982s) [4,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468261719s@ mbc={}] start_peering_interval up [5,0,1] -> [4,0,2], acting [5,0,1] -> [4,0,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011647224s) [4,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.468261719s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.16( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011459351s) [5,0,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468505859s@ mbc={}] start_peering_interval up [5,0,1] -> [5,0,4], acting [5,0,1] -> [5,0,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.16( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.011459351s) [5,0,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.468505859s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.5( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.010006905s) [0,1,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.467285156s@ mbc={}] start_peering_interval up [5,0,1] -> [0,1,2], acting [5,0,1] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.006406784s) [5,4,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.463745117s@ mbc={}] start_peering_interval up [5,0,1] -> [5,4,3], acting [5,0,1] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.5( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.009955406s) [0,1,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.467285156s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.009305954s) [2,0,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.467285156s@ mbc={}] start_peering_interval up [5,0,1] -> [2,0,4], acting [5,0,1] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.009227753s) [2,0,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.467285156s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.17( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.009984970s) [1,2,0] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468139648s@ mbc={}] start_peering_interval up [5,0,1] -> [1,2,0], acting [5,0,1] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.1d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.006406784s) [5,4,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.463745117s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.17( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.009462357s) [1,2,0] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.468139648s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.3( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.007202148s) [0,4,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.467285156s@ mbc={}] start_peering_interval up [5,0,1] -> [0,4,2], acting [5,0,1] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.3( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.007036209s) [0,4,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.467285156s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.007799149s) [0,5,1] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1184.468750000s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,1], acting [5,0,1] -> [0,5,1], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 50 pg[6.d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.007684708s) [0,5,1] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.468750000s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548788.localdomain sudo[56645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqulgktnbxxhxoyfjobxoysbvlzghruq ; /usr/bin/python3
Dec 06 08:19:49 np0005548788.localdomain sudo[56645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:49 np0005548788.localdomain sudo[56645]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:50 np0005548788.localdomain sudo[56749]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eanurlyoadzfnsjiehdavrzvrzxgqudp ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009189.832854-92648-22641515946581/async_wrapper.py 103866932199 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009189.832854-92648-22641515946581/AnsiballZ_command.py _
Dec 06 08:19:50 np0005548788.localdomain sudo[56749]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.e( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [3,2,4] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.1( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [1,0,2] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.17( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [1,2,0] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.1b( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [1,3,2] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.1e( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [4,0,2] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.12( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [4,0,2] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.5( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [0,1,2] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 50 pg[6.3( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [0,4,2] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 51 pg[6.1d( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,4,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 51 pg[6.18( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,3,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 51 pg[6.14( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,3,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 51 pg[6.11( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,1,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 51 pg[6.10( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,1,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 51 pg[6.b( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,3,1] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 51 pg[6.6( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,3,1] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 51 pg[6.9( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,1,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 51 pg[6.f( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,4,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 51 pg[6.13( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,4,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 51 pg[6.1f( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,4,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 51 pg[6.c( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,0,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 51 pg[6.16( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,0,4] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548788.localdomain ansible-async_wrapper.py[56751]: Invoked with 103866932199 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009189.832854-92648-22641515946581/AnsiballZ_command.py _
Dec 06 08:19:50 np0005548788.localdomain ansible-async_wrapper.py[56755]: Starting module and watcher
Dec 06 08:19:50 np0005548788.localdomain ansible-async_wrapper.py[56755]: Start watching 56756 (3600)
Dec 06 08:19:50 np0005548788.localdomain ansible-async_wrapper.py[56756]: Start module (56756)
Dec 06 08:19:50 np0005548788.localdomain ansible-async_wrapper.py[56751]: Return async_wrapper task started.
Dec 06 08:19:50 np0005548788.localdomain sudo[56749]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:50 np0005548788.localdomain sudo[56771]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjbabbuylbokwsekvcyzmyhrurtlubrg ; /usr/bin/python3
Dec 06 08:19:50 np0005548788.localdomain sudo[56771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:50 np0005548788.localdomain python3[56776]: ansible-ansible.legacy.async_status Invoked with jid=103866932199.56751 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:19:50 np0005548788.localdomain sudo[56771]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:51 np0005548788.localdomain sshd[55834]: error: kex_exchange_identification: read: Connection timed out
Dec 06 08:19:51 np0005548788.localdomain sshd[55834]: banner exchange: Connection from 45.78.222.109 port 35934: Connection timed out
Dec 06 08:19:52 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 06 08:19:52 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 06 08:19:53 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 06 08:19:53 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 06 08:19:54 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec 06 08:19:54 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec 06 08:19:54 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Dec 06 08:19:54 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:    (file & line not available)
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:    (file & line not available)
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.11 seconds
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Notice: Applied catalog in 0.04 seconds
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Application:
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:    Initial environment: production
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:    Converged environment: production
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:          Run mode: user
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Changes:
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Events:
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Resources:
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:             Total: 10
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Time:
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:          Schedule: 0.00
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:              File: 0.00
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:              Exec: 0.01
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:            Augeas: 0.01
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:    Transaction evaluation: 0.03
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:    Catalog application: 0.04
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:    Config retrieval: 0.14
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:          Last run: 1765009194
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:        Filebucket: 0.00
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:             Total: 0.05
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]: Version:
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:            Config: 1765009194
Dec 06 08:19:54 np0005548788.localdomain puppet-user[56775]:            Puppet: 7.10.0
Dec 06 08:19:54 np0005548788.localdomain ansible-async_wrapper.py[56756]: Module complete (56756)
Dec 06 08:19:55 np0005548788.localdomain sudo[56887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:19:55 np0005548788.localdomain sudo[56887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:55 np0005548788.localdomain sudo[56887]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:55 np0005548788.localdomain sudo[56902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:19:55 np0005548788.localdomain sudo[56902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:55 np0005548788.localdomain ansible-async_wrapper.py[56755]: Done in kid B.
Dec 06 08:19:55 np0005548788.localdomain sudo[56902]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:55 np0005548788.localdomain sudo[56938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:19:55 np0005548788.localdomain sudo[56938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:55 np0005548788.localdomain sudo[56938]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:55 np0005548788.localdomain sudo[56953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:19:55 np0005548788.localdomain sudo[56953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:56 np0005548788.localdomain sudo[56953]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 52 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.562097549s) [2,1,0] r=-1 lpr=52 pi=[48,52)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1194.505371094s@ mbc={}] start_peering_interval up [0,1,5] -> [2,1,0], acting [0,1,5] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 52 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.562007904s) [2,1,0] r=-1 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1194.505371094s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 52 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.561987877s) [2,1,0] r=-1 lpr=52 pi=[48,52)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1194.505615234s@ mbc={}] start_peering_interval up [0,1,5] -> [2,1,0], acting [0,1,5] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 52 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.561906815s) [2,1,0] r=-1 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1194.505615234s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 52 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.562243462s) [2,1,0] r=-1 lpr=52 pi=[48,52)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1194.506103516s@ mbc={}] start_peering_interval up [0,1,5] -> [2,1,0], acting [0,1,5] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 52 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.561652184s) [2,1,0] r=-1 lpr=52 pi=[48,52)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1194.505493164s@ mbc={}] start_peering_interval up [0,1,5] -> [2,1,0], acting [0,1,5] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 52 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.562163353s) [2,1,0] r=-1 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1194.506103516s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 52 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.561559677s) [2,1,0] r=-1 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1194.505493164s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 52 pg[7.e( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52) [2,1,0] r=0 lpr=52 pi=[48,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 52 pg[7.2( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52) [2,1,0] r=0 lpr=52 pi=[48,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 52 pg[7.6( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52) [2,1,0] r=0 lpr=52 pi=[48,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:56 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 52 pg[7.a( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52) [2,1,0] r=0 lpr=52 pi=[48,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:57 np0005548788.localdomain sudo[57000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:19:57 np0005548788.localdomain sudo[57000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:57 np0005548788.localdomain sudo[57000]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:57 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 53 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52) [2,1,0] r=0 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:57 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 53 pg[7.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=52/53 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52) [2,1,0] r=0 lpr=52 pi=[48,52)/1 crt=40'39 mlcod 0'0 active+degraded m=1 mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:57 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 53 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=52/53 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52) [2,1,0] r=0 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:57 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 53 pg[7.e( v 40'39 lc 40'11 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52) [2,1,0] r=0 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 54 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.102653503s) [5,3,4] r=0 lpr=54 pi=[48,54)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1194.505493164s@ mbc={}] start_peering_interval up [0,1,5] -> [5,3,4], acting [0,1,5] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 54 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.102653503s) [5,3,4] r=0 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1194.505493164s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 54 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.102437019s) [5,3,4] r=0 lpr=54 pi=[48,54)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1194.505493164s@ mbc={}] start_peering_interval up [0,1,5] -> [5,3,4], acting [0,1,5] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 54 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.102437019s) [5,3,4] r=0 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1194.505493164s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 54 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.102712631s) [5,3,4] r=0 lpr=54 pi=[48,54)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1194.505981445s@ mbc={}] start_peering_interval up [0,1,5] -> [5,3,4], acting [0,1,5] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 54 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.102712631s) [5,3,4] r=0 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1194.505981445s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 54 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.101785660s) [5,3,4] r=0 lpr=54 pi=[48,54)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1194.505249023s@ mbc={}] start_peering_interval up [0,1,5] -> [5,3,4], acting [0,1,5] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 54 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.101785660s) [5,3,4] r=0 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1194.505249023s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:59 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Dec 06 08:20:00 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 55 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=54/55 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54) [5,3,4] r=0 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(1+2)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:00 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 55 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54) [5,3,4] r=0 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:00 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 55 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54) [5,3,4] r=0 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(1+2)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:00 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 55 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54) [5,3,4] r=0 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:00 np0005548788.localdomain sudo[57028]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgxetmvjwcwvvdzmabfmyuhdyfgggcqc ; /usr/bin/python3
Dec 06 08:20:00 np0005548788.localdomain sudo[57028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:01 np0005548788.localdomain python3[57030]: ansible-ansible.legacy.async_status Invoked with jid=103866932199.56751 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:20:01 np0005548788.localdomain sudo[57028]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:01 np0005548788.localdomain sudo[57044]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmekgcmtljjgmndqlshazzhzowssqjcr ; /usr/bin/python3
Dec 06 08:20:01 np0005548788.localdomain sudo[57044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:01 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 56 pg[7.4( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=14.620527267s) [2,3,4] r=-1 lpr=56 pi=[48,56)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1202.504882812s@ mbc={}] start_peering_interval up [0,1,5] -> [2,3,4], acting [0,1,5] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:01 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 56 pg[7.4( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=14.620438576s) [2,3,4] r=-1 lpr=56 pi=[48,56)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1202.504882812s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:01 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 56 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=14.620127678s) [2,3,4] r=-1 lpr=56 pi=[48,56)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1202.505615234s@ mbc={}] start_peering_interval up [0,1,5] -> [2,3,4], acting [0,1,5] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:01 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 56 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=14.620075226s) [2,3,4] r=-1 lpr=56 pi=[48,56)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1202.505615234s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:01 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 56 pg[7.4( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56) [2,3,4] r=0 lpr=56 pi=[48,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:01 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 56 pg[7.c( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56) [2,3,4] r=0 lpr=56 pi=[48,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:01 np0005548788.localdomain python3[57046]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:20:01 np0005548788.localdomain sudo[57044]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:02 np0005548788.localdomain sudo[57060]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiyeylhjavtbnbntpyikqshxgrwjjghk ; /usr/bin/python3
Dec 06 08:20:02 np0005548788.localdomain sudo[57060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:02 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 06 08:20:02 np0005548788.localdomain python3[57062]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:20:02 np0005548788.localdomain sudo[57060]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:02 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 06 08:20:02 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 06 08:20:02 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 57 pg[7.4( v 40'39 lc 40'9 (0'0,40'39] local-lis/les=56/57 n=4 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56) [2,3,4] r=0 lpr=56 pi=[48,56)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(0+3)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:02 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 57 pg[7.c( v 40'39 lc 40'10 (0'0,40'39] local-lis/les=56/57 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56) [2,3,4] r=0 lpr=56 pi=[48,56)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:02 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 06 08:20:02 np0005548788.localdomain sudo[57110]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifhipkfafepvutnrodesikxzfksilzsl ; /usr/bin/python3
Dec 06 08:20:02 np0005548788.localdomain sudo[57110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:02 np0005548788.localdomain python3[57112]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:02 np0005548788.localdomain sudo[57110]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:02 np0005548788.localdomain sudo[57128]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eottlujpnlysxcralhwyabvtwrrumlhp ; /usr/bin/python3
Dec 06 08:20:02 np0005548788.localdomain sudo[57128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:03 np0005548788.localdomain python3[57130]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpt81rry22 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:20:03 np0005548788.localdomain sudo[57128]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:03 np0005548788.localdomain sudo[57158]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjlcaowjmimvkpwuueeutdmopeklpfen ; /usr/bin/python3
Dec 06 08:20:03 np0005548788.localdomain sudo[57158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:03 np0005548788.localdomain python3[57160]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:03 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 58 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58 pruub=12.851862907s) [4,5,0] r=1 lpr=58 pi=[48,58)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1202.505737305s@ mbc={}] start_peering_interval up [0,1,5] -> [4,5,0], acting [0,1,5] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:03 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 58 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58 pruub=12.851797104s) [4,5,0] r=1 lpr=58 pi=[48,58)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1202.505737305s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:03 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 58 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58 pruub=12.851645470s) [4,5,0] r=1 lpr=58 pi=[48,58)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1202.505859375s@ mbc={}] start_peering_interval up [0,1,5] -> [4,5,0], acting [0,1,5] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:03 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 58 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58 pruub=12.851586342s) [4,5,0] r=1 lpr=58 pi=[48,58)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1202.505859375s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:03 np0005548788.localdomain sudo[57158]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:03 np0005548788.localdomain sudo[57174]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzyeimatwpemzhsjjclsteswquulkkku ; /usr/bin/python3
Dec 06 08:20:03 np0005548788.localdomain sudo[57174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:04 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 06 08:20:04 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 06 08:20:04 np0005548788.localdomain sudo[57174]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:04 np0005548788.localdomain sudo[57261]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sryozztzifznfyeybceskdscxyypjvhx ; /usr/bin/python3
Dec 06 08:20:04 np0005548788.localdomain sudo[57261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:04 np0005548788.localdomain python3[57263]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:20:04 np0005548788.localdomain sudo[57261]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.2 deep-scrub starts
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.2 deep-scrub ok
Dec 06 08:20:05 np0005548788.localdomain sudo[57280]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylwrakteuybzzfyeueshubgovjasfujj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:05 np0005548788.localdomain sudo[57280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec 06 08:20:05 np0005548788.localdomain python3[57282]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:05 np0005548788.localdomain sudo[57280]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 60 pg[7.6( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60) [5,0,4] r=0 lpr=60 pi=[52,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 60 pg[7.e( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60) [5,0,4] r=0 lpr=60 pi=[52,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 60 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60 pruub=8.279632568s) [5,0,4] r=-1 lpr=60 pi=[52,60)/1 crt=40'39 mlcod 0'0 active pruub 1204.247802734s@ mbc={255={}}] start_peering_interval up [2,1,0] -> [5,0,4], acting [2,1,0] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 60 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60 pruub=8.276186943s) [5,0,4] r=-1 lpr=60 pi=[52,60)/1 crt=40'39 mlcod 0'0 active pruub 1204.244506836s@ mbc={255={}}] start_peering_interval up [2,1,0] -> [5,0,4], acting [2,1,0] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 60 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60 pruub=8.279472351s) [5,0,4] r=-1 lpr=60 pi=[52,60)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1204.247802734s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 60 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60 pruub=8.276033401s) [5,0,4] r=-1 lpr=60 pi=[52,60)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1204.244506836s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:05 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec 06 08:20:05 np0005548788.localdomain sudo[57296]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfrdsqxhtzcmlahmopguklxvvkfioqdk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:05 np0005548788.localdomain sudo[57296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:05 np0005548788.localdomain sudo[57296]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:06 np0005548788.localdomain sudo[57312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfktmyufdjcbqdqjuiiplrcekyaafnes ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:06 np0005548788.localdomain sudo[57312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:06 np0005548788.localdomain python3[57314]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:20:06 np0005548788.localdomain sudo[57312]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:06 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Dec 06 08:20:06 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 61 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61 pruub=10.121302605s) [3,1,5] r=2 lpr=61 pi=[54,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1202.844116211s@ mbc={255={}}] start_peering_interval up [5,3,4] -> [3,1,5], acting [5,3,4] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:06 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 61 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61 pruub=10.121312141s) [3,1,5] r=2 lpr=61 pi=[54,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1202.843994141s@ mbc={255={}}] start_peering_interval up [5,3,4] -> [3,1,5], acting [5,3,4] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:06 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 61 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61 pruub=10.121141434s) [3,1,5] r=2 lpr=61 pi=[54,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1202.844116211s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:06 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 61 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61 pruub=10.121042252s) [3,1,5] r=2 lpr=61 pi=[54,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1202.843994141s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:06 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 61 pg[7.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=60/61 n=2 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60) [5,0,4] r=0 lpr=60 pi=[52,60)/1 crt=40'39 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:06 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 61 pg[7.e( v 40'39 lc 40'11 (0'0,40'39] local-lis/les=60/61 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60) [5,0,4] r=0 lpr=60 pi=[52,60)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:06 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Dec 06 08:20:06 np0005548788.localdomain sudo[57362]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbhmbdgcqdhpddkimvetmilmlkvbveht ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:06 np0005548788.localdomain sudo[57362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:07 np0005548788.localdomain python3[57364]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:07 np0005548788.localdomain sudo[57362]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:07 np0005548788.localdomain sudo[57380]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpsafgxfvacrmjoanxkakrcxjmcwpjyx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:07 np0005548788.localdomain sudo[57380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:07 np0005548788.localdomain python3[57382]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:07 np0005548788.localdomain sudo[57380]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:07 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec 06 08:20:07 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec 06 08:20:07 np0005548788.localdomain sudo[57442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjzraddaldvseptmphelsckqbswlmkmh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:07 np0005548788.localdomain sudo[57442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:07 np0005548788.localdomain python3[57444]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:07 np0005548788.localdomain sudo[57442]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:07 np0005548788.localdomain sudo[57460]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbungzfjytksryjxplmtavsaatnikksd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:07 np0005548788.localdomain sudo[57460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:08 np0005548788.localdomain python3[57462]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:08 np0005548788.localdomain sudo[57460]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:08 np0005548788.localdomain sudo[57522]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elvziiqlwpspyezxpqmvxkqdkccxlauo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:08 np0005548788.localdomain sudo[57522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:08 np0005548788.localdomain python3[57524]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:08 np0005548788.localdomain sudo[57522]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:08 np0005548788.localdomain sudo[57540]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icghucpxpifahzvbaawirqqwtmumqqph ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:08 np0005548788.localdomain sudo[57540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:08 np0005548788.localdomain python3[57542]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:08 np0005548788.localdomain sudo[57540]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:09 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 63 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=63 pruub=15.092144012s) [2,0,1] r=-1 lpr=63 pi=[48,63)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1210.505371094s@ mbc={}] start_peering_interval up [0,1,5] -> [2,0,1], acting [0,1,5] -> [2,0,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:09 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 63 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=63 pruub=15.091958046s) [2,0,1] r=-1 lpr=63 pi=[48,63)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1210.505371094s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:09 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 63 pg[7.8( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=63) [2,0,1] r=0 lpr=63 pi=[48,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:09 np0005548788.localdomain sudo[57602]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inkyzzxnscsmkdcfylleijaqythosloy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:09 np0005548788.localdomain sudo[57602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:09 np0005548788.localdomain python3[57604]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:09 np0005548788.localdomain sudo[57602]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:20:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4382 writes, 19K keys, 4382 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4382 writes, 443 syncs, 9.89 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 977 writes, 3279 keys, 977 commit groups, 1.0 writes per commit group, ingest: 1.66 MB, 0.00 MB/s
                                                          Interval WAL: 977 writes, 237 syncs, 4.12 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:20:09 np0005548788.localdomain sudo[57620]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eaavuzehegogvkxorryvxchfpwvhzobu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:09 np0005548788.localdomain sudo[57620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:09 np0005548788.localdomain python3[57622]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:09 np0005548788.localdomain sudo[57620]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:10 np0005548788.localdomain sudo[57650]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oincakezzgwuzeppbxxkmtxrwkroffob ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:10 np0005548788.localdomain sudo[57650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:10 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 06 08:20:10 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 06 08:20:10 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 64 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=63/64 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=63) [2,0,1] r=0 lpr=63 pi=[48,63)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:10 np0005548788.localdomain python3[57652]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:20:10 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:20:10 np0005548788.localdomain systemd-sysv-generator[57677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:20:10 np0005548788.localdomain systemd-rc-local-generator[57673]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:20:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:20:10 np0005548788.localdomain sudo[57650]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:11 np0005548788.localdomain sudo[57736]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wensjdcfmmwkirnzeyuvtkbsfshvinij ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:11 np0005548788.localdomain sudo[57736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:11 np0005548788.localdomain python3[57738]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:11 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.c scrub starts
Dec 06 08:20:11 np0005548788.localdomain sudo[57736]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:11 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 65 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=65 pruub=13.046749115s) [5,4,3] r=0 lpr=65 pi=[48,65)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1210.505615234s@ mbc={}] start_peering_interval up [0,1,5] -> [5,4,3], acting [0,1,5] -> [5,4,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:11 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 65 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=65 pruub=13.046749115s) [5,4,3] r=0 lpr=65 pi=[48,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1210.505615234s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:11 np0005548788.localdomain sudo[57754]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skuknutsunyqvhfcarsgijisccyzapff ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:11 np0005548788.localdomain sudo[57754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:11 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.c scrub ok
Dec 06 08:20:11 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Dec 06 08:20:11 np0005548788.localdomain python3[57756]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:11 np0005548788.localdomain sudo[57754]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:11 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.1f scrub ok
Dec 06 08:20:11 np0005548788.localdomain sudo[57816]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyzrtjsqwalstxgtqtsuymfymczxexil ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:11 np0005548788.localdomain sudo[57816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:12 np0005548788.localdomain python3[57818]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:12 np0005548788.localdomain sudo[57816]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:12 np0005548788.localdomain sudo[57834]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgcrnjxtmqarasiwddqwhbznbpibtewv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:12 np0005548788.localdomain sudo[57834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:12 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 66 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=65/66 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=65) [5,4,3] r=0 lpr=65 pi=[48,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:12 np0005548788.localdomain python3[57836]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:12 np0005548788.localdomain sudo[57834]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:12 np0005548788.localdomain sudo[57864]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gixeyziqljnftzwnezbgxkucvpgmocig ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:12 np0005548788.localdomain sudo[57864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:12 np0005548788.localdomain python3[57866]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:20:12 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:20:13 np0005548788.localdomain systemd-rc-local-generator[57888]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:20:13 np0005548788.localdomain systemd-sysv-generator[57894]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:20:13 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:20:13 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:20:13 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:20:13 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:20:13 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:20:13 np0005548788.localdomain sudo[57864]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:13 np0005548788.localdomain sudo[57921]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glnxfxlawdipnqcrhevwrcoufcdqzkzh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:13 np0005548788.localdomain sudo[57921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:13 np0005548788.localdomain python3[57923]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:20:13 np0005548788.localdomain sudo[57921]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:20:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Cumulative writes: 4979 writes, 22K keys, 4979 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4979 writes, 521 syncs, 9.56 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1732 writes, 6249 keys, 1732 commit groups, 1.0 writes per commit group, ingest: 2.47 MB, 0.00 MB/s
                                                          Interval WAL: 1732 writes, 382 syncs, 4.53 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:20:14 np0005548788.localdomain sudo[57937]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-burudsapvqxjkkflzvhpcddvcekdnkyp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:14 np0005548788.localdomain sudo[57937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:14 np0005548788.localdomain sudo[57937]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:15 np0005548788.localdomain sudo[57978]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-henlehaxtygqsqhzqhltbgegvqodvsli ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:15 np0005548788.localdomain sudo[57978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:15 np0005548788.localdomain python3[57980]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:20:15 np0005548788.localdomain podman[58049]: 2025-12-06 08:20:15.722682163 +0000 UTC m=+0.076216499 container create 8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step2, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute_init_log, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044, version=17.1.12)
Dec 06 08:20:15 np0005548788.localdomain podman[58062]: 2025-12-06 08:20:15.755112327 +0000 UTC m=+0.079675570 container create ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git)
Dec 06 08:20:15 np0005548788.localdomain systemd[1]: Started libpod-conmon-8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc.scope.
Dec 06 08:20:15 np0005548788.localdomain podman[58049]: 2025-12-06 08:20:15.680906547 +0000 UTC m=+0.034440943 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:20:15 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:20:15 np0005548788.localdomain systemd[1]: Started libpod-conmon-ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40.scope.
Dec 06 08:20:15 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55120ee4b4cc047c306789233609c6fd6d29f8fc75a55d7a3544aaf3d7f6ad35/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:20:15 np0005548788.localdomain podman[58049]: 2025-12-06 08:20:15.813526327 +0000 UTC m=+0.167060663 container init 8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, config_id=tripleo_step2, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:20:15 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:20:15 np0005548788.localdomain podman[58062]: 2025-12-06 08:20:15.716582926 +0000 UTC m=+0.041146159 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:20:15 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d334edc17c9aa8f0b180da1f8a718e4ac8b472875066b766a2e2e11ebb80c3/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Dec 06 08:20:15 np0005548788.localdomain podman[58049]: 2025-12-06 08:20:15.821002734 +0000 UTC m=+0.174537070 container start 8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=nova_compute_init_log, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']})
Dec 06 08:20:15 np0005548788.localdomain python3[57980]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Dec 06 08:20:15 np0005548788.localdomain podman[58062]: 2025-12-06 08:20:15.826249897 +0000 UTC m=+0.150813100 container init ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_virtqemud_init_logs, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Dec 06 08:20:15 np0005548788.localdomain systemd[1]: libpod-8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc.scope: Deactivated successfully.
Dec 06 08:20:15 np0005548788.localdomain podman[58062]: 2025-12-06 08:20:15.833731335 +0000 UTC m=+0.158294568 container start ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud_init_logs, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, release=1761123044, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64)
Dec 06 08:20:15 np0005548788.localdomain python3[57980]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Dec 06 08:20:15 np0005548788.localdomain systemd[1]: libpod-ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40.scope: Deactivated successfully.
Dec 06 08:20:15 np0005548788.localdomain podman[58089]: 2025-12-06 08:20:15.878506688 +0000 UTC m=+0.043108695 container died 8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, release=1761123044, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, vcs-type=git, io.buildah.version=1.41.4)
Dec 06 08:20:15 np0005548788.localdomain podman[58105]: 2025-12-06 08:20:15.8947292 +0000 UTC m=+0.046030241 container died ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_id=tripleo_step2, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, version=17.1.12, vcs-type=git, container_name=nova_virtqemud_init_logs, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible)
Dec 06 08:20:16 np0005548788.localdomain podman[58092]: 2025-12-06 08:20:16.011381834 +0000 UTC m=+0.173504840 container cleanup 8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: libpod-conmon-8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc.scope: Deactivated successfully.
Dec 06 08:20:16 np0005548788.localdomain podman[58110]: 2025-12-06 08:20:16.074331556 +0000 UTC m=+0.218810548 container cleanup ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: libpod-conmon-ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40.scope: Deactivated successfully.
Dec 06 08:20:16 np0005548788.localdomain podman[58239]: 2025-12-06 08:20:16.267342543 +0000 UTC m=+0.077092185 container create 03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, container_name=create_haproxy_wrapper, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3.scope.
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:20:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d637f2269baed45e71c2a75847c6903ddb47387528b995ac3d0fed5bef2ae572/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 08:20:16 np0005548788.localdomain podman[58239]: 2025-12-06 08:20:16.220989653 +0000 UTC m=+0.030739325 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:20:16 np0005548788.localdomain podman[58239]: 2025-12-06 08:20:16.325791024 +0000 UTC m=+0.135540636 container init 03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step2, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=create_haproxy_wrapper, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12)
Dec 06 08:20:16 np0005548788.localdomain podman[58239]: 2025-12-06 08:20:16.335896577 +0000 UTC m=+0.145646219 container start 03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=create_haproxy_wrapper, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:20:16 np0005548788.localdomain podman[58239]: 2025-12-06 08:20:16.336304449 +0000 UTC m=+0.146054081 container attach 03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
build-date=2025-11-19T00:14:25Z, config_id=tripleo_step2, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:20:16 np0005548788.localdomain podman[58254]: 2025-12-06 08:20:16.365411356 +0000 UTC m=+0.141955062 container create 0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, container_name=create_virtlogd_wrapper, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step2, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6.scope.
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:20:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/432c853227b9a47c28cbd9f8638abd2f4ba478bfd57b8f9c2584b83011a05ecd/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:20:16 np0005548788.localdomain podman[58254]: 2025-12-06 08:20:16.320182 +0000 UTC m=+0.096725726 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:20:16 np0005548788.localdomain podman[58254]: 2025-12-06 08:20:16.428689357 +0000 UTC m=+0.205233033 container init 0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, container_name=create_virtlogd_wrapper, tcib_managed=true)
Dec 06 08:20:16 np0005548788.localdomain podman[58254]: 2025-12-06 08:20:16.437505094 +0000 UTC m=+0.214048800 container start 0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, architecture=x86_64, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=create_virtlogd_wrapper, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:20:16 np0005548788.localdomain podman[58254]: 2025-12-06 08:20:16.437837714 +0000 UTC m=+0.214381440 container attach 0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:35:22Z, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b0d334edc17c9aa8f0b180da1f8a718e4ac8b472875066b766a2e2e11ebb80c3-merged.mount: Deactivated successfully.
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ecc8ea79bcb362e6662bbaff6da018e74e35e9dfb45abeb82901c8b5a6c43e40-userdata-shm.mount: Deactivated successfully.
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-55120ee4b4cc047c306789233609c6fd6d29f8fc75a55d7a3544aaf3d7f6ad35-merged.mount: Deactivated successfully.
Dec 06 08:20:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e11c0064ea054973189b4fb1fffefe03904e10507874fe272b302becbef3adc-userdata-shm.mount: Deactivated successfully.
Dec 06 08:20:18 np0005548788.localdomain ovs-vsctl[58346]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Dec 06 08:20:18 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec 06 08:20:18 np0005548788.localdomain systemd[1]: libpod-0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6.scope: Deactivated successfully.
Dec 06 08:20:18 np0005548788.localdomain systemd[1]: libpod-0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6.scope: Consumed 2.071s CPU time.
Dec 06 08:20:18 np0005548788.localdomain podman[58254]: 2025-12-06 08:20:18.508246371 +0000 UTC m=+2.284790117 container died 0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step2)
Dec 06 08:20:18 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec 06 08:20:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6-userdata-shm.mount: Deactivated successfully.
Dec 06 08:20:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-432c853227b9a47c28cbd9f8638abd2f4ba478bfd57b8f9c2584b83011a05ecd-merged.mount: Deactivated successfully.
Dec 06 08:20:18 np0005548788.localdomain podman[58495]: 2025-12-06 08:20:18.582258535 +0000 UTC m=+0.064492518 container cleanup 0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=create_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:20:18 np0005548788.localdomain systemd[1]: libpod-conmon-0281343c9005e5548d0b4d3bc0700347183073f755c62050737a263f63c270a6.scope: Deactivated successfully.
Dec 06 08:20:18 np0005548788.localdomain python3[57980]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Dec 06 08:20:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:20:19 np0005548788.localdomain podman[58534]: 2025-12-06 08:20:19.232096934 +0000 UTC m=+0.059839171 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:46Z)
Dec 06 08:20:19 np0005548788.localdomain systemd[1]: libpod-03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3.scope: Deactivated successfully.
Dec 06 08:20:19 np0005548788.localdomain systemd[1]: libpod-03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3.scope: Consumed 2.189s CPU time.
Dec 06 08:20:19 np0005548788.localdomain podman[58239]: 2025-12-06 08:20:19.359720229 +0000 UTC m=+3.169469831 container died 03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
config_id=tripleo_step2, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=create_haproxy_wrapper, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:20:19 np0005548788.localdomain podman[58564]: 2025-12-06 08:20:19.416912342 +0000 UTC m=+0.049443540 container cleanup 03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step2, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, container_name=create_haproxy_wrapper, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, maintainer=OpenStack TripleO Team, vcs-type=git)
Dec 06 08:20:19 np0005548788.localdomain systemd[1]: libpod-conmon-03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3.scope: Deactivated successfully.
Dec 06 08:20:19 np0005548788.localdomain python3[57980]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Dec 06 08:20:19 np0005548788.localdomain podman[58534]: 2025-12-06 08:20:19.467536046 +0000 UTC m=+0.295278273 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, release=1761123044, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd)
Dec 06 08:20:19 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:20:19 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec 06 08:20:19 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 67 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=52/53 n=0 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=67 pruub=10.360400200s) [4,2,3] r=1 lpr=67 pi=[52,67)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1220.244873047s@ mbc={}] start_peering_interval up [2,1,0] -> [4,2,3], acting [2,1,0] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:19 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 67 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=52/53 n=0 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=67 pruub=10.360207558s) [4,2,3] r=1 lpr=67 pi=[52,67)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1220.244873047s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:19 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec 06 08:20:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d637f2269baed45e71c2a75847c6903ddb47387528b995ac3d0fed5bef2ae572-merged.mount: Deactivated successfully.
Dec 06 08:20:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3-userdata-shm.mount: Deactivated successfully.
Dec 06 08:20:19 np0005548788.localdomain sudo[57978]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:19 np0005548788.localdomain sudo[58613]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmqexchrkcomzngizbovdszsinebemhm ; /usr/bin/python3
Dec 06 08:20:19 np0005548788.localdomain sudo[58613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:20 np0005548788.localdomain python3[58615]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:20 np0005548788.localdomain sudo[58613]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:20 np0005548788.localdomain sudo[58661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igzjnqiiwwijkygfcumszjuqijcvnzfs ; /usr/bin/python3
Dec 06 08:20:20 np0005548788.localdomain sudo[58661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:20 np0005548788.localdomain sudo[58661]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:20 np0005548788.localdomain sudo[58704]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngygzamfarhodrodwzpnmwyhdnbsimxo ; /usr/bin/python3
Dec 06 08:20:20 np0005548788.localdomain sudo[58704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:21 np0005548788.localdomain sudo[58704]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:21 np0005548788.localdomain sudo[58734]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcqgqdzmivkffjuhurfszgqwjmdptbfi ; /usr/bin/python3
Dec 06 08:20:21 np0005548788.localdomain sudo[58734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:21 np0005548788.localdomain python3[58736]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005548788 step=2 update_config_hash_only=False
Dec 06 08:20:21 np0005548788.localdomain sudo[58734]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:21 np0005548788.localdomain sudo[58750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnadxfjnkalveroqedugnncxxjgfaerj ; /usr/bin/python3
Dec 06 08:20:21 np0005548788.localdomain sudo[58750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:22 np0005548788.localdomain python3[58752]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:22 np0005548788.localdomain sudo[58750]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:22 np0005548788.localdomain sudo[58766]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsitiuydeksqrzvtxpputtsdajwdyzzu ; /usr/bin/python3
Dec 06 08:20:22 np0005548788.localdomain sudo[58766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:22 np0005548788.localdomain python3[58768]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:20:22 np0005548788.localdomain sudo[58766]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:22 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Dec 06 08:20:22 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Dec 06 08:20:24 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Dec 06 08:20:24 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Dec 06 08:20:25 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec 06 08:20:25 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec 06 08:20:28 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.3 deep-scrub starts
Dec 06 08:20:28 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.3 deep-scrub ok
Dec 06 08:20:29 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 70 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=56/57 n=1 ec=48/37 lis/c=56/56 les/c/f=57/57/0 sis=70 pruub=13.191361427s) [0,5,4] r=-1 lpr=70 pi=[56,70)/1 crt=40'39 mlcod 40'39 active pruub 1232.898315430s@ mbc={255={}}] start_peering_interval up [2,3,4] -> [0,5,4], acting [2,3,4] -> [0,5,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:29 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 70 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=56/57 n=1 ec=48/37 lis/c=56/56 les/c/f=57/57/0 sis=70 pruub=13.191243172s) [0,5,4] r=-1 lpr=70 pi=[56,70)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1232.898315430s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:29 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 06 08:20:29 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 06 08:20:30 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 70 pg[7.c( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=56/56 les/c/f=57/57/0 sis=70) [0,5,4] r=1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:31 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 72 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=58/59 n=1 ec=48/37 lis/c=58/58 les/c/f=59/59/0 sis=72 pruub=13.243067741s) [3,2,1] r=-1 lpr=72 pi=[58,72)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1230.704467773s@ mbc={}] start_peering_interval up [4,5,0] -> [3,2,1], acting [4,5,0] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:31 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 72 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=58/59 n=1 ec=48/37 lis/c=58/58 les/c/f=59/59/0 sis=72 pruub=13.242988586s) [3,2,1] r=-1 lpr=72 pi=[58,72)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1230.704467773s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:32 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 72 pg[7.d( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=58/58 les/c/f=59/59/0 sis=72) [3,2,1] r=1 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:33 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 74 pg[7.e( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=60/60 les/c/f=61/61/0 sis=74) [2,1,3] r=0 lpr=74 pi=[60,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:33 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 74 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=60/61 n=1 ec=48/37 lis/c=60/60 les/c/f=61/61/0 sis=74 pruub=13.191211700s) [2,1,3] r=-1 lpr=74 pi=[60,74)/1 crt=40'39 mlcod 0'0 active pruub 1232.740234375s@ mbc={255={}}] start_peering_interval up [5,0,4] -> [2,1,3], acting [5,0,4] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:33 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 74 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=60/61 n=1 ec=48/37 lis/c=60/60 les/c/f=61/61/0 sis=74 pruub=13.191120148s) [2,1,3] r=-1 lpr=74 pi=[60,74)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1232.740234375s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:33 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 06 08:20:33 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 06 08:20:34 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 75 pg[7.e( v 40'39 lc 40'11 (0'0,40'39] local-lis/les=74/75 n=1 ec=48/37 lis/c=60/60 les/c/f=61/61/0 sis=74) [2,1,3] r=0 lpr=74 pi=[60,74)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:35 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 76 pg[7.f( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=61/61 les/c/f=62/62/0 sis=76) [2,4,3] r=0 lpr=76 pi=[61,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:35 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 76 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=61/62 n=1 ec=48/37 lis/c=61/61 les/c/f=62/62/0 sis=76 pruub=12.194279671s) [2,4,3] r=-1 lpr=76 pi=[61,76)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1233.765014648s@ mbc={}] start_peering_interval up [3,1,5] -> [2,4,3], acting [3,1,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:35 np0005548788.localdomain ceph-osd[32690]: osd.5 pg_epoch: 76 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=61/62 n=1 ec=48/37 lis/c=61/61 les/c/f=62/62/0 sis=76 pruub=12.194177628s) [2,4,3] r=-1 lpr=76 pi=[61,76)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1233.765014648s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:36 np0005548788.localdomain ceph-osd[31731]: osd.2 pg_epoch: 77 pg[7.f( v 40'39 lc 40'1 (0'0,40'39] local-lis/les=76/77 n=1 ec=48/37 lis/c=61/61 les/c/f=62/62/0 sis=76) [2,4,3] r=0 lpr=76 pi=[61,76)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(1+2)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:36 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 06 08:20:37 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 06 08:20:37 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Dec 06 08:20:37 np0005548788.localdomain ceph-osd[32690]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Dec 06 08:20:39 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 06 08:20:39 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 06 08:20:42 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.f deep-scrub starts
Dec 06 08:20:42 np0005548788.localdomain ceph-osd[31731]: log_channel(cluster) log [DBG] : 7.f deep-scrub ok
Dec 06 08:20:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:20:50 np0005548788.localdomain systemd[1]: tmp-crun.9vyIUC.mount: Deactivated successfully.
Dec 06 08:20:50 np0005548788.localdomain podman[58769]: 2025-12-06 08:20:50.266133468 +0000 UTC m=+0.096480268 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64)
Dec 06 08:20:50 np0005548788.localdomain podman[58769]: 2025-12-06 08:20:50.468040284 +0000 UTC m=+0.298387104 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 06 08:20:50 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:20:57 np0005548788.localdomain sudo[58799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:20:57 np0005548788.localdomain sudo[58799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:20:57 np0005548788.localdomain sudo[58799]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:57 np0005548788.localdomain sudo[58814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:20:57 np0005548788.localdomain sudo[58814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:20:58 np0005548788.localdomain sudo[58814]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:58 np0005548788.localdomain sudo[58861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:20:58 np0005548788.localdomain sudo[58861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:20:58 np0005548788.localdomain sudo[58861]: pam_unix(sudo:session): session closed for user root
Dec 06 08:21:01 np0005548788.localdomain sshd[58876]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:21:02 np0005548788.localdomain sshd[58876]: Received disconnect from 152.32.172.117 port 59734:11: Bye Bye [preauth]
Dec 06 08:21:02 np0005548788.localdomain sshd[58876]: Disconnected from authenticating user root 152.32.172.117 port 59734 [preauth]
Dec 06 08:21:08 np0005548788.localdomain sshd[58878]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:21:10 np0005548788.localdomain sshd[58878]: Received disconnect from 102.140.97.134 port 34628:11: Bye Bye [preauth]
Dec 06 08:21:10 np0005548788.localdomain sshd[58878]: Disconnected from authenticating user root 102.140.97.134 port 34628 [preauth]
Dec 06 08:21:15 np0005548788.localdomain sshd[58880]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:21:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:21:21 np0005548788.localdomain podman[58882]: 2025-12-06 08:21:21.247947413 +0000 UTC m=+0.076827337 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, architecture=x86_64, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, release=1761123044)
Dec 06 08:21:21 np0005548788.localdomain podman[58882]: 2025-12-06 08:21:21.464923067 +0000 UTC m=+0.293802981 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12)
Dec 06 08:21:21 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:21:24 np0005548788.localdomain sshd[58880]: Received disconnect from 45.78.222.109 port 38862:11: Bye Bye [preauth]
Dec 06 08:21:24 np0005548788.localdomain sshd[58880]: Disconnected from 45.78.222.109 port 38862 [preauth]
Dec 06 08:21:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:21:52 np0005548788.localdomain podman[58912]: 2025-12-06 08:21:52.251949056 +0000 UTC m=+0.083191922 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com)
Dec 06 08:21:52 np0005548788.localdomain podman[58912]: 2025-12-06 08:21:52.444365496 +0000 UTC m=+0.275608392 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, release=1761123044)
Dec 06 08:21:52 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:21:58 np0005548788.localdomain sudo[58942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:21:58 np0005548788.localdomain sudo[58942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:21:58 np0005548788.localdomain sudo[58942]: pam_unix(sudo:session): session closed for user root
Dec 06 08:21:58 np0005548788.localdomain sudo[58957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:21:58 np0005548788.localdomain sudo[58957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:21:59 np0005548788.localdomain podman[59041]: 2025-12-06 08:21:59.551340983 +0000 UTC m=+0.083258274 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.)
Dec 06 08:21:59 np0005548788.localdomain podman[59041]: 2025-12-06 08:21:59.653448574 +0000 UTC m=+0.185365835 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, name=rhceph, vendor=Red Hat, Inc., release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public)
Dec 06 08:21:59 np0005548788.localdomain sudo[58957]: pam_unix(sudo:session): session closed for user root
Dec 06 08:22:00 np0005548788.localdomain sudo[59109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:22:00 np0005548788.localdomain sudo[59109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:22:00 np0005548788.localdomain sudo[59109]: pam_unix(sudo:session): session closed for user root
Dec 06 08:22:00 np0005548788.localdomain sudo[59124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:22:00 np0005548788.localdomain sudo[59124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:22:00 np0005548788.localdomain sudo[59124]: pam_unix(sudo:session): session closed for user root
Dec 06 08:22:01 np0005548788.localdomain sudo[59170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:22:01 np0005548788.localdomain sudo[59170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:22:01 np0005548788.localdomain sudo[59170]: pam_unix(sudo:session): session closed for user root
Dec 06 08:22:17 np0005548788.localdomain sshd[59185]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:22:18 np0005548788.localdomain sshd[59185]: Received disconnect from 152.32.172.117 port 44394:11: Bye Bye [preauth]
Dec 06 08:22:18 np0005548788.localdomain sshd[59185]: Disconnected from authenticating user root 152.32.172.117 port 44394 [preauth]
Dec 06 08:22:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:22:23 np0005548788.localdomain systemd[1]: tmp-crun.5MzRQl.mount: Deactivated successfully.
Dec 06 08:22:23 np0005548788.localdomain podman[59187]: 2025-12-06 08:22:23.256747301 +0000 UTC m=+0.085314264 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, release=1761123044, container_name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true)
Dec 06 08:22:23 np0005548788.localdomain podman[59187]: 2025-12-06 08:22:23.480354527 +0000 UTC m=+0.308921540 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Dec 06 08:22:23 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:22:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:22:54 np0005548788.localdomain systemd[1]: tmp-crun.5pn0bI.mount: Deactivated successfully.
Dec 06 08:22:54 np0005548788.localdomain podman[59217]: 2025-12-06 08:22:54.267330173 +0000 UTC m=+0.099084544 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:22:54 np0005548788.localdomain podman[59217]: 2025-12-06 08:22:54.449619377 +0000 UTC m=+0.281373788 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team)
Dec 06 08:22:54 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:23:01 np0005548788.localdomain sudo[59246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:23:01 np0005548788.localdomain sudo[59246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:23:01 np0005548788.localdomain sudo[59246]: pam_unix(sudo:session): session closed for user root
Dec 06 08:23:01 np0005548788.localdomain sudo[59261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:23:01 np0005548788.localdomain sudo[59261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:23:02 np0005548788.localdomain sudo[59261]: pam_unix(sudo:session): session closed for user root
Dec 06 08:23:02 np0005548788.localdomain sudo[59308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:23:02 np0005548788.localdomain sudo[59308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:23:02 np0005548788.localdomain sudo[59308]: pam_unix(sudo:session): session closed for user root
Dec 06 08:23:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:23:25 np0005548788.localdomain podman[59323]: 2025-12-06 08:23:25.249311881 +0000 UTC m=+0.080014126 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Dec 06 08:23:25 np0005548788.localdomain podman[59323]: 2025-12-06 08:23:25.472612445 +0000 UTC m=+0.303314670 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:23:25 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:23:29 np0005548788.localdomain sshd[59351]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:23:31 np0005548788.localdomain sshd[59351]: Received disconnect from 152.32.172.117 port 44144:11: Bye Bye [preauth]
Dec 06 08:23:31 np0005548788.localdomain sshd[59351]: Disconnected from authenticating user root 152.32.172.117 port 44144 [preauth]
Dec 06 08:23:51 np0005548788.localdomain sshd[59353]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:23:54 np0005548788.localdomain sshd[59353]: Received disconnect from 45.78.222.109 port 60870:11: Bye Bye [preauth]
Dec 06 08:23:54 np0005548788.localdomain sshd[59353]: Disconnected from authenticating user root 45.78.222.109 port 60870 [preauth]
Dec 06 08:23:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:23:56 np0005548788.localdomain podman[59355]: 2025-12-06 08:23:56.244862025 +0000 UTC m=+0.076880628 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, tcib_managed=true, version=17.1.12)
Dec 06 08:23:56 np0005548788.localdomain podman[59355]: 2025-12-06 08:23:56.435587424 +0000 UTC m=+0.267606047 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, config_id=tripleo_step1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:23:56 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:24:02 np0005548788.localdomain sudo[59385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:24:02 np0005548788.localdomain sudo[59385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:24:02 np0005548788.localdomain sudo[59385]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:02 np0005548788.localdomain sudo[59400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:24:02 np0005548788.localdomain sudo[59400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:24:03 np0005548788.localdomain sudo[59400]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:04 np0005548788.localdomain sudo[59448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:24:04 np0005548788.localdomain sudo[59448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:24:04 np0005548788.localdomain sudo[59448]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:24:27 np0005548788.localdomain podman[59463]: 2025-12-06 08:24:27.243904259 +0000 UTC m=+0.074008303 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, release=1761123044, url=https://www.redhat.com, distribution-scope=public, container_name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, tcib_managed=true)
Dec 06 08:24:27 np0005548788.localdomain podman[59463]: 2025-12-06 08:24:27.435577852 +0000 UTC m=+0.265681936 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:24:27 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:24:47 np0005548788.localdomain sshd[59495]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:24:48 np0005548788.localdomain sshd[59495]: Received disconnect from 152.32.172.117 port 60956:11: Bye Bye [preauth]
Dec 06 08:24:48 np0005548788.localdomain sshd[59495]: Disconnected from authenticating user root 152.32.172.117 port 60956 [preauth]
Dec 06 08:24:52 np0005548788.localdomain sudo[59542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnpsworioicgrsjuftiblfctrlfpxlif ; /usr/bin/python3
Dec 06 08:24:52 np0005548788.localdomain sudo[59542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:52 np0005548788.localdomain python3[59544]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:24:52 np0005548788.localdomain sudo[59542]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:52 np0005548788.localdomain sudo[59587]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oknpeaozcxaiswgprwwafqpufxdqudng ; /usr/bin/python3
Dec 06 08:24:52 np0005548788.localdomain sudo[59587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:52 np0005548788.localdomain python3[59589]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009492.237106-98733-280818042488304/source _original_basename=tmp6tdv0uah follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:24:52 np0005548788.localdomain sudo[59587]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:53 np0005548788.localdomain sudo[59617]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wipymzofocuqqvreupbclavcktahuxsg ; /usr/bin/python3
Dec 06 08:24:53 np0005548788.localdomain sudo[59617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:53 np0005548788.localdomain python3[59619]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:24:53 np0005548788.localdomain sudo[59617]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:54 np0005548788.localdomain sudo[59667]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kswutaxcaccisheldsjlpzaoxainofxo ; /usr/bin/python3
Dec 06 08:24:54 np0005548788.localdomain sudo[59667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:54 np0005548788.localdomain sudo[59667]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:54 np0005548788.localdomain sudo[59685]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyzlupubjgltslghmpwhvuhjusxjxdfg ; /usr/bin/python3
Dec 06 08:24:54 np0005548788.localdomain sudo[59685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:54 np0005548788.localdomain sudo[59685]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:55 np0005548788.localdomain sudo[59789]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxjrpibldivkrucceirtblrrrkygejgt ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009495.1812823-98898-239678167316397/async_wrapper.py 809620404003 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009495.1812823-98898-239678167316397/AnsiballZ_command.py _
Dec 06 08:24:55 np0005548788.localdomain sudo[59789]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:24:55 np0005548788.localdomain ansible-async_wrapper.py[59791]: Invoked with 809620404003 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009495.1812823-98898-239678167316397/AnsiballZ_command.py _
Dec 06 08:24:55 np0005548788.localdomain ansible-async_wrapper.py[59794]: Starting module and watcher
Dec 06 08:24:55 np0005548788.localdomain ansible-async_wrapper.py[59794]: Start watching 59795 (3600)
Dec 06 08:24:55 np0005548788.localdomain ansible-async_wrapper.py[59795]: Start module (59795)
Dec 06 08:24:55 np0005548788.localdomain ansible-async_wrapper.py[59791]: Return async_wrapper task started.
Dec 06 08:24:55 np0005548788.localdomain sudo[59789]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:55 np0005548788.localdomain sudo[59810]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojyfygqcjnpjsilwszexepfqgpxsbgig ; /usr/bin/python3
Dec 06 08:24:55 np0005548788.localdomain sudo[59810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:56 np0005548788.localdomain python3[59815]: ansible-ansible.legacy.async_status Invoked with jid=809620404003.59791 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:24:56 np0005548788.localdomain sudo[59810]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:24:58 np0005548788.localdomain podman[59859]: 2025-12-06 08:24:58.248252741 +0000 UTC m=+0.079124920 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, version=17.1.12)
Dec 06 08:24:58 np0005548788.localdomain podman[59859]: 2025-12-06 08:24:58.460730139 +0000 UTC m=+0.291602308 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Dec 06 08:24:58 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:    (file & line not available)
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:    (file & line not available)
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.12 seconds
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Notice: Applied catalog in 0.05 seconds
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Application:
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:    Initial environment: production
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:    Converged environment: production
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:          Run mode: user
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Changes:
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Events:
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Resources:
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:             Total: 10
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Time:
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:        Filebucket: 0.00
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:          Schedule: 0.00
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:              File: 0.00
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:              Exec: 0.01
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:            Augeas: 0.01
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:    Transaction evaluation: 0.03
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:    Catalog application: 0.05
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:    Config retrieval: 0.15
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:          Last run: 1765009499
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:             Total: 0.05
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]: Version:
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:            Config: 1765009499
Dec 06 08:24:59 np0005548788.localdomain puppet-user[59814]:            Puppet: 7.10.0
Dec 06 08:24:59 np0005548788.localdomain ansible-async_wrapper.py[59795]: Module complete (59795)
Dec 06 08:25:00 np0005548788.localdomain ansible-async_wrapper.py[59794]: Done in kid B.
Dec 06 08:25:04 np0005548788.localdomain sudo[59956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:25:04 np0005548788.localdomain sudo[59956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:25:04 np0005548788.localdomain sudo[59956]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:04 np0005548788.localdomain sudo[59971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:25:04 np0005548788.localdomain sudo[59971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:25:05 np0005548788.localdomain sudo[59971]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:05 np0005548788.localdomain sudo[60017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:25:05 np0005548788.localdomain sudo[60017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:25:05 np0005548788.localdomain sudo[60017]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:06 np0005548788.localdomain sudo[60045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzqunpnmkwkdgmhfzkpfualgxeejzjhr ; /usr/bin/python3
Dec 06 08:25:06 np0005548788.localdomain sudo[60045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:06 np0005548788.localdomain python3[60047]: ansible-ansible.legacy.async_status Invoked with jid=809620404003.59791 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:25:06 np0005548788.localdomain sudo[60045]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:06 np0005548788.localdomain sudo[60061]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwbiepbzpjjjktsenhoogkzhvqthfelq ; /usr/bin/python3
Dec 06 08:25:07 np0005548788.localdomain sudo[60061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:07 np0005548788.localdomain python3[60063]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:25:07 np0005548788.localdomain sudo[60061]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:07 np0005548788.localdomain sudo[60077]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wadxfvpatsomqbmrzcdnbxncukrlqtpl ; /usr/bin/python3
Dec 06 08:25:07 np0005548788.localdomain sudo[60077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:07 np0005548788.localdomain python3[60079]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:07 np0005548788.localdomain sudo[60077]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:07 np0005548788.localdomain sudo[60127]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imhufovngyhhzwbmqhowlehvlternfjn ; /usr/bin/python3
Dec 06 08:25:07 np0005548788.localdomain sudo[60127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:08 np0005548788.localdomain python3[60129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:08 np0005548788.localdomain sudo[60127]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:08 np0005548788.localdomain sudo[60145]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptotdjnpcvadimfelzxvnfqjqatpnvie ; /usr/bin/python3
Dec 06 08:25:08 np0005548788.localdomain sudo[60145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:08 np0005548788.localdomain python3[60147]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpm19zo59n recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:25:08 np0005548788.localdomain sudo[60145]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:08 np0005548788.localdomain sudo[60175]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyfgnofwlzvcohqhfyrrrcmqeudfwtsf ; /usr/bin/python3
Dec 06 08:25:08 np0005548788.localdomain sudo[60175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:08 np0005548788.localdomain python3[60177]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:08 np0005548788.localdomain sudo[60175]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:08 np0005548788.localdomain sudo[60191]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmgepycxzugeenqduhcmgecwyukwnixv ; /usr/bin/python3
Dec 06 08:25:08 np0005548788.localdomain sudo[60191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:09 np0005548788.localdomain sudo[60191]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:09 np0005548788.localdomain sudo[60278]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icxsmlaslxrfgwbzejaidbnitzebdmap ; /usr/bin/python3
Dec 06 08:25:09 np0005548788.localdomain sudo[60278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:09 np0005548788.localdomain python3[60280]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:25:09 np0005548788.localdomain sudo[60278]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:10 np0005548788.localdomain sudo[60297]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izuvawuffgjqavkezmthkzpsmtjenmsc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:10 np0005548788.localdomain sudo[60297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:10 np0005548788.localdomain python3[60299]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:10 np0005548788.localdomain sudo[60297]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:11 np0005548788.localdomain sudo[60313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifnvekwitqzwhruoczoehruhhnlbbyat ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:11 np0005548788.localdomain sudo[60313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:11 np0005548788.localdomain sudo[60313]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:11 np0005548788.localdomain sudo[60329]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cimjafjjqmhfcaqqwanpqxtxrnxvzgra ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:11 np0005548788.localdomain sudo[60329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:11 np0005548788.localdomain python3[60331]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:11 np0005548788.localdomain sudo[60329]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:12 np0005548788.localdomain sudo[60379]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfodbykjlpatwpbcvwlvouapwktgwtnu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:12 np0005548788.localdomain sudo[60379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:12 np0005548788.localdomain python3[60381]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:12 np0005548788.localdomain sudo[60379]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:12 np0005548788.localdomain sudo[60397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzhrenueodoolbnhsktxcxvaxcseabza ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:12 np0005548788.localdomain sudo[60397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:12 np0005548788.localdomain python3[60399]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:12 np0005548788.localdomain sudo[60397]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:13 np0005548788.localdomain sudo[60459]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cylgpyvhhcczmfrqrqyvvjymifipqraw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:13 np0005548788.localdomain sudo[60459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:13 np0005548788.localdomain python3[60461]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:13 np0005548788.localdomain sudo[60459]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:13 np0005548788.localdomain sudo[60477]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osirodzdxxqvoyefdumsslojlthnvznd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:13 np0005548788.localdomain sudo[60477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:13 np0005548788.localdomain python3[60479]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:13 np0005548788.localdomain sudo[60477]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:13 np0005548788.localdomain sudo[60539]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oklufmkzymbteqmeadujuteiuluefevd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:13 np0005548788.localdomain sudo[60539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:14 np0005548788.localdomain python3[60541]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:14 np0005548788.localdomain sudo[60539]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:14 np0005548788.localdomain sudo[60557]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suprnhuvlhlmrcdaehrtnixadttivhwd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:14 np0005548788.localdomain sudo[60557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:14 np0005548788.localdomain python3[60559]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:14 np0005548788.localdomain sudo[60557]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:14 np0005548788.localdomain sudo[60619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhtkypxtavfjrazuoeozsuydpmtlgdbz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:14 np0005548788.localdomain sudo[60619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:14 np0005548788.localdomain python3[60621]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:14 np0005548788.localdomain sudo[60619]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:15 np0005548788.localdomain sudo[60637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyftxtpyefaoixdadsltyjpzfvckqqxv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:15 np0005548788.localdomain sudo[60637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:15 np0005548788.localdomain python3[60639]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:15 np0005548788.localdomain sudo[60637]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:15 np0005548788.localdomain sudo[60667]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moypdfhfkeckrimnzwvvtivvxkoceraj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:15 np0005548788.localdomain sudo[60667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:15 np0005548788.localdomain python3[60669]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:15 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:15 np0005548788.localdomain systemd-sysv-generator[60694]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:15 np0005548788.localdomain systemd-rc-local-generator[60690]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:16 np0005548788.localdomain sudo[60667]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:16 np0005548788.localdomain sudo[60753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqfonfncampromtzkfcjweqerniysydj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:16 np0005548788.localdomain sudo[60753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:16 np0005548788.localdomain python3[60755]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:16 np0005548788.localdomain sudo[60753]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:16 np0005548788.localdomain sudo[60771]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aflblbiyngvldlcfoapbxegzapvzcsch ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:16 np0005548788.localdomain sudo[60771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:16 np0005548788.localdomain python3[60773]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:16 np0005548788.localdomain sudo[60771]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:17 np0005548788.localdomain sudo[60833]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtiwueyrjdnyaqjtzfdrguyeabjbxcvk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:17 np0005548788.localdomain sudo[60833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:17 np0005548788.localdomain python3[60835]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:17 np0005548788.localdomain sudo[60833]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:17 np0005548788.localdomain sudo[60851]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhvffbybgsuzhkyimifelbofvomzdzye ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:17 np0005548788.localdomain sudo[60851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:17 np0005548788.localdomain python3[60853]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:17 np0005548788.localdomain sudo[60851]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:17 np0005548788.localdomain sudo[60881]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbhxqltzvstirdyrgtxhxuxhxkapnzkh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:17 np0005548788.localdomain sudo[60881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:18 np0005548788.localdomain python3[60883]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:18 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:18 np0005548788.localdomain systemd-sysv-generator[60916]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:18 np0005548788.localdomain systemd-rc-local-generator[60912]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:18 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:25:18 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:25:18 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:25:18 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:25:18 np0005548788.localdomain sudo[60881]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:18 np0005548788.localdomain sudo[60938]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caeibeaqnpsgzysdemwwzobrxditkyzj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:18 np0005548788.localdomain sudo[60938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:19 np0005548788.localdomain python3[60940]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:25:19 np0005548788.localdomain sudo[60938]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:19 np0005548788.localdomain sudo[60954]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvnnyvvxcaxubnxiayzsgjippvzebhyy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:19 np0005548788.localdomain sudo[60954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:19 np0005548788.localdomain sudo[60954]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:20 np0005548788.localdomain sudo[60996]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btmhculqrlnihgfvfpqrvvpjxtrqntyf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:20 np0005548788.localdomain sudo[60996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:21 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:25:21 np0005548788.localdomain podman[61138]: 2025-12-06 08:25:21.434090103 +0000 UTC m=+0.063707673 container create da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, container_name=nova_statedir_owner, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 06 08:25:21 np0005548788.localdomain podman[61132]: 2025-12-06 08:25:21.456171867 +0000 UTC m=+0.089708608 container create 0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible)
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started libpod-conmon-da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9.scope.
Dec 06 08:25:21 np0005548788.localdomain podman[61168]: 2025-12-06 08:25:21.480992294 +0000 UTC m=+0.074959390 container create 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started libpod-conmon-0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439.scope.
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d687d9df5c689c1a5e0b342e3438942cdf880bd73e43f7e4784d4fa7fbfaa52/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d687d9df5c689c1a5e0b342e3438942cdf880bd73e43f7e4784d4fa7fbfaa52/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d687d9df5c689c1a5e0b342e3438942cdf880bd73e43f7e4784d4fa7fbfaa52/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548788.localdomain podman[61138]: 2025-12-06 08:25:21.399311946 +0000 UTC m=+0.028929526 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain podman[61152]: 2025-12-06 08:25:21.503155711 +0000 UTC m=+0.106343693 container create d7b109e125c0c53186985816647a0b8bf43342eaaf8dbb1a306fca8820a87d5a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, container_name=ceilometer_init_log, version=17.1.12, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain podman[61132]: 2025-12-06 08:25:21.41008314 +0000 UTC m=+0.043619901 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started libpod-conmon-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c.scope.
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started libpod-conmon-d7b109e125c0c53186985816647a0b8bf43342eaaf8dbb1a306fca8820a87d5a.scope.
Dec 06 08:25:21 np0005548788.localdomain podman[61168]: 2025-12-06 08:25:21.434146845 +0000 UTC m=+0.028113931 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain podman[61152]: 2025-12-06 08:25:21.441272275 +0000 UTC m=+0.044460247 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07b91d8553238d978835abe46e21a46235f67e708d84225f3bfb3e48b6b851c7/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548788.localdomain podman[61168]: 2025-12-06 08:25:21.550973781 +0000 UTC m=+0.144940887 container init 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, 
com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=)
Dec 06 08:25:21 np0005548788.localdomain podman[61138]: 2025-12-06 08:25:21.555881073 +0000 UTC m=+0.185498643 container init da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=nova_statedir_owner, release=1761123044)
Dec 06 08:25:21 np0005548788.localdomain podman[61168]: 2025-12-06 08:25:21.559096002 +0000 UTC m=+0.153063078 container start 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, 
release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:25:21 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d3b0a004e533211bab6cc44495102b19 --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:25:21 np0005548788.localdomain podman[61138]: 2025-12-06 08:25:21.565884493 +0000 UTC m=+0.195502063 container start da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, container_name=nova_statedir_owner, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:25:21 np0005548788.localdomain podman[61138]: 2025-12-06 08:25:21.566152241 +0000 UTC m=+0.195769811 container attach da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_statedir_owner, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:25:21 np0005548788.localdomain sudo[61209]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:21 np0005548788.localdomain sudo[61209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:21 np0005548788.localdomain sudo[61209]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: libpod-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: libpod-da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548788.localdomain podman[61138]: 2025-12-06 08:25:21.643246197 +0000 UTC m=+0.272863777 container died da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute)
Dec 06 08:25:21 np0005548788.localdomain podman[61152]: 2025-12-06 08:25:21.653506355 +0000 UTC m=+0.256694397 container init d7b109e125c0c53186985816647a0b8bf43342eaaf8dbb1a306fca8820a87d5a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_init_log, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:25:21 np0005548788.localdomain podman[61152]: 2025-12-06 08:25:21.659980315 +0000 UTC m=+0.263168347 container start d7b109e125c0c53186985816647a0b8bf43342eaaf8dbb1a306fca8820a87d5a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step3, container_name=ceilometer_init_log, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 08:25:21 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: libpod-d7b109e125c0c53186985816647a0b8bf43342eaaf8dbb1a306fca8820a87d5a.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548788.localdomain podman[61227]: 2025-12-06 08:25:21.681659837 +0000 UTC m=+0.049629998 container died 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2025-11-18T22:49:49Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, container_name=rsyslog, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, vcs-type=git)
Dec 06 08:25:21 np0005548788.localdomain podman[61132]: 2025-12-06 08:25:21.734680558 +0000 UTC m=+0.368217279 container init 0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, container_name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step3)
Dec 06 08:25:21 np0005548788.localdomain podman[61132]: 2025-12-06 08:25:21.739975092 +0000 UTC m=+0.373511813 container start 0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=nova_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']})
Dec 06 08:25:21 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:21 np0005548788.localdomain podman[61261]: 2025-12-06 08:25:21.752595432 +0000 UTC m=+0.075083334 container died d7b109e125c0c53186985816647a0b8bf43342eaaf8dbb1a306fca8820a87d5a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_init_log, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Dec 06 08:25:21 np0005548788.localdomain sudo[61289]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:21 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:21 np0005548788.localdomain podman[61261]: 2025-12-06 08:25:21.776852063 +0000 UTC m=+0.099339945 container cleanup d7b109e125c0c53186985816647a0b8bf43342eaaf8dbb1a306fca8820a87d5a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_init_log, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public)
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: libpod-conmon-d7b109e125c0c53186985816647a0b8bf43342eaaf8dbb1a306fca8820a87d5a.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:21 np0005548788.localdomain podman[61242]: 2025-12-06 08:25:21.858680416 +0000 UTC m=+0.204274034 container cleanup da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_statedir_owner, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: libpod-conmon-da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548788.localdomain podman[61227]: 2025-12-06 08:25:21.894750723 +0000 UTC m=+0.262720914 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=rsyslog, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: libpod-conmon-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Queued start job for default target Main User Target.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Created slice User Application Slice.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Reached target Paths.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Reached target Timers.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Starting D-Bus User Message Bus Socket...
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Starting Create User's Volatile Files and Directories...
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Reached target Sockets.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Finished Create User's Volatile Files and Directories.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Reached target Basic System.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Reached target Main User Target.
Dec 06 08:25:21 np0005548788.localdomain systemd[61306]: Startup finished in 113ms.
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 08:25:21 np0005548788.localdomain systemd[1]: Started Session c1 of User root.
Dec 06 08:25:21 np0005548788.localdomain sudo[61289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:22 np0005548788.localdomain sudo[61289]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Dec 06 08:25:22 np0005548788.localdomain podman[61438]: 2025-12-06 08:25:22.390120588 +0000 UTC m=+0.080368489 container create 33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z)
Dec 06 08:25:22 np0005548788.localdomain podman[61460]: 2025-12-06 08:25:22.421749917 +0000 UTC m=+0.073037532 container create 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=collectd, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: Started libpod-conmon-33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09.scope.
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c-userdata-shm.mount: Deactivated successfully.
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5d687d9df5c689c1a5e0b342e3438942cdf880bd73e43f7e4784d4fa7fbfaa52-merged.mount: Deactivated successfully.
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da0a7ac801218d24892df25f2890d9a09e14023d2779aa5f99b7294da42b83f9-userdata-shm.mount: Deactivated successfully.
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:22 np0005548788.localdomain podman[61438]: 2025-12-06 08:25:22.347799778 +0000 UTC m=+0.038047709 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:22 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4ccaf6b9e00129f62b1a5ca75d97cdf0ef2bf4949fad3aa47a39da7b3075511/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4ccaf6b9e00129f62b1a5ca75d97cdf0ef2bf4949fad3aa47a39da7b3075511/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4ccaf6b9e00129f62b1a5ca75d97cdf0ef2bf4949fad3aa47a39da7b3075511/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4ccaf6b9e00129f62b1a5ca75d97cdf0ef2bf4949fad3aa47a39da7b3075511/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: Started libpod-conmon-33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.scope.
Dec 06 08:25:22 np0005548788.localdomain podman[61438]: 2025-12-06 08:25:22.457567326 +0000 UTC m=+0.147815207 container init 33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Dec 06 08:25:22 np0005548788.localdomain podman[61438]: 2025-12-06 08:25:22.465851962 +0000 UTC m=+0.156099823 container start 33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:22 np0005548788.localdomain podman[61460]: 2025-12-06 08:25:22.382560014 +0000 UTC m=+0.033847719 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:25:22 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fdd5208a191e3dd329f7e505764b52a58d757ae8eaee9e9d3bc670d6f12b2b08/merged/scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fdd5208a191e3dd329f7e505764b52a58d757ae8eaee9e9d3bc670d6f12b2b08/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:25:22 np0005548788.localdomain podman[61460]: 2025-12-06 08:25:22.512558638 +0000 UTC m=+0.163846273 container init 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:25:22 np0005548788.localdomain sudo[61490]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:22 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: Started Session c2 of User root.
Dec 06 08:25:22 np0005548788.localdomain podman[61460]: 2025-12-06 08:25:22.541655419 +0000 UTC m=+0.192943044 container start 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, tcib_managed=true, io.openshift.expose-services=)
Dec 06 08:25:22 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=da9a0dc7b40588672419e3ce10063e21 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:25:22 np0005548788.localdomain sudo[61490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:22 np0005548788.localdomain sudo[61490]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Dec 06 08:25:22 np0005548788.localdomain podman[61491]: 2025-12-06 08:25:22.626222606 +0000 UTC m=+0.078696537 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:25:22 np0005548788.localdomain podman[61491]: 2025-12-06 08:25:22.635444902 +0000 UTC m=+0.087918823 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=collectd, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, architecture=x86_64, io.openshift.expose-services=)
Dec 06 08:25:22 np0005548788.localdomain podman[61491]: unhealthy
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:22 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Failed with result 'exit-code'.
Dec 06 08:25:23 np0005548788.localdomain podman[61580]: 2025-12-06 08:25:23.024029181 +0000 UTC m=+0.074548928 container create e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, container_name=nova_virtsecretd, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started libpod-conmon-e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199.scope.
Dec 06 08:25:23 np0005548788.localdomain podman[61580]: 2025-12-06 08:25:22.980850395 +0000 UTC m=+0.031370202 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain podman[61580]: 2025-12-06 08:25:23.094917536 +0000 UTC m=+0.145437293 container init e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=1761123044, build-date=2025-11-19T00:35:22Z, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, container_name=nova_virtsecretd, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:25:23 np0005548788.localdomain podman[61580]: 2025-12-06 08:25:23.105037829 +0000 UTC m=+0.155557576 container start e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtsecretd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git)
Dec 06 08:25:23 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:23 np0005548788.localdomain sudo[61601]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:23 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started Session c3 of User root.
Dec 06 08:25:23 np0005548788.localdomain sudo[61601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:23 np0005548788.localdomain sudo[61601]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Dec 06 08:25:23 np0005548788.localdomain podman[61723]: 2025-12-06 08:25:23.576667888 +0000 UTC m=+0.100733008 container create 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-type=git, 
managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:25:23 np0005548788.localdomain podman[61730]: 2025-12-06 08:25:23.615426768 +0000 UTC m=+0.111277165 container create 46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, vcs-type=git, release=1761123044, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Dec 06 08:25:23 np0005548788.localdomain podman[61723]: 2025-12-06 08:25:23.531641244 +0000 UTC m=+0.055706394 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started libpod-conmon-6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.scope.
Dec 06 08:25:23 np0005548788.localdomain podman[61730]: 2025-12-06 08:25:23.542829091 +0000 UTC m=+0.038679518 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started libpod-conmon-46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0.scope.
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c49c691de3a79d7c64c1e2b1baf2d52b814d8ec4049f7fba3b2602f1480e6a/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42c49c691de3a79d7c64c1e2b1baf2d52b814d8ec4049f7fba3b2602f1480e6a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548788.localdomain podman[61730]: 2025-12-06 08:25:23.664770246 +0000 UTC m=+0.160620623 container init 46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, architecture=x86_64, container_name=nova_virtnodedevd)
Dec 06 08:25:23 np0005548788.localdomain podman[61730]: 2025-12-06 08:25:23.676419346 +0000 UTC m=+0.172269773 container start 46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtnodedevd, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container)
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:25:23 np0005548788.localdomain podman[61723]: 2025-12-06 08:25:23.678815691 +0000 UTC m=+0.202880801 container init 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.4, container_name=iscsid, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 08:25:23 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:23 np0005548788.localdomain sudo[61764]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:23 np0005548788.localdomain sudo[61760]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:25:23 np0005548788.localdomain podman[61723]: 2025-12-06 08:25:23.713727851 +0000 UTC m=+0.237792971 container start 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, release=1761123044, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, distribution-scope=public)
Dec 06 08:25:23 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started Session c4 of User root.
Dec 06 08:25:23 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: Started Session c5 of User root.
Dec 06 08:25:23 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c6dd5c7aeba6260998a0bbde3ab20933 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:25:23 np0005548788.localdomain sudo[61764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:23 np0005548788.localdomain sudo[61760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:23 np0005548788.localdomain sudo[61764]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Dec 06 08:25:23 np0005548788.localdomain sudo[61760]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Dec 06 08:25:23 np0005548788.localdomain kernel: Loading iSCSI transport class v2.0-870.
Dec 06 08:25:23 np0005548788.localdomain podman[61771]: 2025-12-06 08:25:23.845009395 +0000 UTC m=+0.123822824 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, container_name=iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Dec 06 08:25:23 np0005548788.localdomain podman[61771]: 2025-12-06 08:25:23.877756559 +0000 UTC m=+0.156569968 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container)
Dec 06 08:25:23 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:25:24 np0005548788.localdomain podman[61901]: 2025-12-06 08:25:24.184630628 +0000 UTC m=+0.085978572 container create e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtstoraged, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, tcib_managed=true)
Dec 06 08:25:24 np0005548788.localdomain systemd[1]: Started libpod-conmon-e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1.scope.
Dec 06 08:25:24 np0005548788.localdomain podman[61901]: 2025-12-06 08:25:24.140759761 +0000 UTC m=+0.042107775 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:24 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain podman[61901]: 2025-12-06 08:25:24.255758641 +0000 UTC m=+0.157106585 container init e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:25:24 np0005548788.localdomain podman[61901]: 2025-12-06 08:25:24.269078133 +0000 UTC m=+0.170426087 container start e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Dec 06 08:25:24 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:24 np0005548788.localdomain sudo[61921]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:24 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:24 np0005548788.localdomain systemd[1]: Started Session c6 of User root.
Dec 06 08:25:24 np0005548788.localdomain sudo[61921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:24 np0005548788.localdomain sudo[61921]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:24 np0005548788.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Dec 06 08:25:24 np0005548788.localdomain podman[62003]: 2025-12-06 08:25:24.749249217 +0000 UTC m=+0.094844087 container create 77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, container_name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, version=17.1.12, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:25:24 np0005548788.localdomain systemd[1]: Started libpod-conmon-77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732.scope.
Dec 06 08:25:24 np0005548788.localdomain podman[62003]: 2025-12-06 08:25:24.701724356 +0000 UTC m=+0.047319256 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:24 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:24 np0005548788.localdomain podman[62003]: 2025-12-06 08:25:24.823038651 +0000 UTC m=+0.168633491 container init 77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtqemud, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:25:24 np0005548788.localdomain podman[62003]: 2025-12-06 08:25:24.835237899 +0000 UTC m=+0.180832739 container start 77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtqemud, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:25:24 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:24 np0005548788.localdomain sudo[62022]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:24 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:24 np0005548788.localdomain systemd[1]: Started Session c7 of User root.
Dec 06 08:25:24 np0005548788.localdomain sudo[62022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:24 np0005548788.localdomain sudo[62022]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:24 np0005548788.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Dec 06 08:25:25 np0005548788.localdomain podman[62109]: 2025-12-06 08:25:25.309962454 +0000 UTC m=+0.092136023 container create eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, container_name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt)
Dec 06 08:25:25 np0005548788.localdomain systemd[1]: Started libpod-conmon-eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9.scope.
Dec 06 08:25:25 np0005548788.localdomain podman[62109]: 2025-12-06 08:25:25.26326289 +0000 UTC m=+0.045436519 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:25 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:25 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548788.localdomain podman[62109]: 2025-12-06 08:25:25.376656349 +0000 UTC m=+0.158829998 container init eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:25:25 np0005548788.localdomain podman[62109]: 2025-12-06 08:25:25.386325428 +0000 UTC m=+0.168499037 container start eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, build-date=2025-11-19T00:35:22Z, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Dec 06 08:25:25 np0005548788.localdomain python3[60998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:25 np0005548788.localdomain sudo[62129]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:25 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:25 np0005548788.localdomain systemd[1]: Started Session c8 of User root.
Dec 06 08:25:25 np0005548788.localdomain sudo[62129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:25 np0005548788.localdomain sudo[62129]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:25 np0005548788.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Dec 06 08:25:25 np0005548788.localdomain sudo[60996]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:25 np0005548788.localdomain sudo[62186]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnmlqwyhrxeiuqledbjzfglwuabrwtbh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:25 np0005548788.localdomain sudo[62186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:25 np0005548788.localdomain python3[62188]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:26 np0005548788.localdomain sudo[62186]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:26 np0005548788.localdomain sudo[62202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtyfshshahqsrzzrykkzswzrbplxtpdc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:26 np0005548788.localdomain sudo[62202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:26 np0005548788.localdomain python3[62204]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:26 np0005548788.localdomain sudo[62202]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:26 np0005548788.localdomain sudo[62218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upozmxclzxvedoztpflbwtiyjxjmwqzf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:26 np0005548788.localdomain sudo[62218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:26 np0005548788.localdomain python3[62220]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:26 np0005548788.localdomain sudo[62218]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:26 np0005548788.localdomain sudo[62234]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpzcvexgrvhdpudogdjnentlftqyfuya ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:26 np0005548788.localdomain sudo[62234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:26 np0005548788.localdomain python3[62236]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:26 np0005548788.localdomain sudo[62234]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:26 np0005548788.localdomain sudo[62250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkrllhzhcmyrtdmrnsxkhdvcyxpogsio ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:26 np0005548788.localdomain sudo[62250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:27 np0005548788.localdomain python3[62252]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:27 np0005548788.localdomain sudo[62250]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:27 np0005548788.localdomain sudo[62266]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epqxdwmrtahwncdwztfumtffegwwspoz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:27 np0005548788.localdomain sudo[62266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:27 np0005548788.localdomain python3[62268]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:27 np0005548788.localdomain sudo[62266]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:27 np0005548788.localdomain sudo[62283]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vruvldpakrnrzigfqnnfcgsclsbzgnwi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:27 np0005548788.localdomain sudo[62283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:27 np0005548788.localdomain python3[62285]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:27 np0005548788.localdomain sudo[62283]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:27 np0005548788.localdomain sudo[62299]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yckewjbhbxppbmouybpsceslvsugpwrx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:27 np0005548788.localdomain sudo[62299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:27 np0005548788.localdomain python3[62301]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:27 np0005548788.localdomain sudo[62299]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:27 np0005548788.localdomain sudo[62316]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwgpwejgzxnihihfjdqibefrmfmaexph ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:27 np0005548788.localdomain sudo[62316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548788.localdomain python3[62318]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:28 np0005548788.localdomain sudo[62316]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:28 np0005548788.localdomain sudo[62332]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjeqsidgtbdalxnazodeubaufingyqgh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:28 np0005548788.localdomain sudo[62332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548788.localdomain python3[62334]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:28 np0005548788.localdomain sudo[62332]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:28 np0005548788.localdomain sudo[62348]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajiotwtcvfkemoiatpnomkektecnaaxo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:28 np0005548788.localdomain sudo[62348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548788.localdomain python3[62350]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:28 np0005548788.localdomain sudo[62348]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:28 np0005548788.localdomain sudo[62364]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzfxeqwckeoyzekcbwgolzloggmmznqm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:28 np0005548788.localdomain sudo[62364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:25:28 np0005548788.localdomain systemd[1]: tmp-crun.suM8k7.mount: Deactivated successfully.
Dec 06 08:25:28 np0005548788.localdomain podman[62367]: 2025-12-06 08:25:28.766124954 +0000 UTC m=+0.079881004 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:25:28 np0005548788.localdomain python3[62366]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:28 np0005548788.localdomain sudo[62364]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:28 np0005548788.localdomain sudo[62409]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjfndiibfsuxlbncmbakxhnjypatnvxd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:28 np0005548788.localdomain sudo[62409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548788.localdomain podman[62367]: 2025-12-06 08:25:28.98371129 +0000 UTC m=+0.297467270 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public)
Dec 06 08:25:28 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:25:29 np0005548788.localdomain python3[62411]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:29 np0005548788.localdomain sudo[62409]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:29 np0005548788.localdomain sudo[62426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyycshbnnbjasdugnrfqlcziphrcfycx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:29 np0005548788.localdomain sudo[62426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:29 np0005548788.localdomain python3[62428]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:29 np0005548788.localdomain sudo[62426]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:29 np0005548788.localdomain sudo[62442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nficdtraphnwleecfvhgsikysyokqusf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:29 np0005548788.localdomain sudo[62442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:29 np0005548788.localdomain python3[62444]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:29 np0005548788.localdomain sudo[62442]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:29 np0005548788.localdomain sudo[62458]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afqkmolrektrbujunvbbzxbmdzvqooxg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:29 np0005548788.localdomain sudo[62458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:29 np0005548788.localdomain python3[62460]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:29 np0005548788.localdomain sudo[62458]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:29 np0005548788.localdomain sudo[62474]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctrxiwsemtidhynhxplupjpycpcpeuic ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:29 np0005548788.localdomain sudo[62474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:30 np0005548788.localdomain python3[62476]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:30 np0005548788.localdomain sudo[62474]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:30 np0005548788.localdomain sudo[62490]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqwytdyanmphutdfgjnbgfcvgnknwlwa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:30 np0005548788.localdomain sudo[62490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:30 np0005548788.localdomain python3[62492]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:30 np0005548788.localdomain sudo[62490]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:30 np0005548788.localdomain sudo[62551]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oksutnxthmzznxnjaermbtstxbksrxvn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:30 np0005548788.localdomain sudo[62551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:31 np0005548788.localdomain python3[62553]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:31 np0005548788.localdomain sudo[62551]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:31 np0005548788.localdomain sudo[62580]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rervzuqaewxvivhgplhtqbrdgoscebwy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:31 np0005548788.localdomain sudo[62580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:31 np0005548788.localdomain python3[62582]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:31 np0005548788.localdomain sudo[62580]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:31 np0005548788.localdomain sudo[62609]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txiejdegezrwhvaxkdhpgvcbwyacgpuc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:31 np0005548788.localdomain sudo[62609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:32 np0005548788.localdomain python3[62611]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:32 np0005548788.localdomain sudo[62609]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:32 np0005548788.localdomain sudo[62638]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjycpvakxkseilptappqfphkgqcgxclq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:32 np0005548788.localdomain sudo[62638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:32 np0005548788.localdomain python3[62640]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:32 np0005548788.localdomain sudo[62638]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:33 np0005548788.localdomain sudo[62667]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrwxwyxggpczvaxymgtmouzzqukwjxrw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:33 np0005548788.localdomain sudo[62667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:33 np0005548788.localdomain python3[62669]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:33 np0005548788.localdomain sudo[62667]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:33 np0005548788.localdomain sudo[62696]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfbjsudfpegmvyecwyqwulzyyqaqwjgx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:33 np0005548788.localdomain sudo[62696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:33 np0005548788.localdomain python3[62698]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:33 np0005548788.localdomain sudo[62696]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:34 np0005548788.localdomain sudo[62725]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avmxvvqxtkzyqwhlvxynruboqvtymbum ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:34 np0005548788.localdomain sudo[62725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:34 np0005548788.localdomain python3[62727]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:34 np0005548788.localdomain sudo[62725]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:34 np0005548788.localdomain sudo[62754]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntiwhmtwxtmkbvmcksweehcjdjtbacos ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:34 np0005548788.localdomain sudo[62754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:34 np0005548788.localdomain python3[62756]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:34 np0005548788.localdomain sudo[62754]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:35 np0005548788.localdomain sudo[62783]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntqtneezotrvpupihxoswgzdfunxwmnm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:35 np0005548788.localdomain sudo[62783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:35 np0005548788.localdomain python3[62785]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009530.463093-100144-138133533106746/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:35 np0005548788.localdomain sudo[62783]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:35 np0005548788.localdomain sudo[62799]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkdrdzpvlkeifffaxbtlnjaipdasiuoc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:35 np0005548788.localdomain sudo[62799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Activating special unit Exit the Session...
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Stopped target Main User Target.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Stopped target Basic System.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Stopped target Paths.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Stopped target Sockets.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Stopped target Timers.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Closed D-Bus User Message Bus Socket.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Stopped Create User's Volatile Files and Directories.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Removed slice User Application Slice.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Reached target Shutdown.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Finished Exit the Session.
Dec 06 08:25:35 np0005548788.localdomain systemd[61306]: Reached target Exit the Session.
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 08:25:35 np0005548788.localdomain python3[62801]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:35 np0005548788.localdomain systemd-sysv-generator[62827]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:35 np0005548788.localdomain systemd-rc-local-generator[62824]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:36 np0005548788.localdomain sudo[62799]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:36 np0005548788.localdomain sudo[62853]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaswqyixthxalolgseuzhqcqkxcowlxf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:36 np0005548788.localdomain sudo[62853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:36 np0005548788.localdomain python3[62855]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:36 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:36 np0005548788.localdomain systemd-rc-local-generator[62884]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:36 np0005548788.localdomain systemd-sysv-generator[62888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:36 np0005548788.localdomain systemd[1]: Starting collectd container...
Dec 06 08:25:37 np0005548788.localdomain systemd[1]: Started collectd container.
Dec 06 08:25:37 np0005548788.localdomain sudo[62853]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:37 np0005548788.localdomain sudo[62921]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxwjlzzosztgrjjaxgneamodclgfupwe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:37 np0005548788.localdomain sudo[62921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:37 np0005548788.localdomain python3[62923]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:38 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:38 np0005548788.localdomain systemd-rc-local-generator[62949]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:38 np0005548788.localdomain systemd-sysv-generator[62953]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:39 np0005548788.localdomain systemd[1]: Starting iscsid container...
Dec 06 08:25:39 np0005548788.localdomain systemd[1]: Started iscsid container.
Dec 06 08:25:39 np0005548788.localdomain sudo[62921]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:39 np0005548788.localdomain sudo[62987]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-accqxtkymfgnxitxbkvayxxtlmnikxgp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:39 np0005548788.localdomain sudo[62987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:39 np0005548788.localdomain python3[62989]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:40 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:40 np0005548788.localdomain systemd-rc-local-generator[63015]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:40 np0005548788.localdomain systemd-sysv-generator[63018]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:40 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:41 np0005548788.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Dec 06 08:25:41 np0005548788.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Dec 06 08:25:41 np0005548788.localdomain sudo[62987]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:41 np0005548788.localdomain sudo[63053]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gubgamevdrzhgfsfatpedatvdmxeczju ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:41 np0005548788.localdomain sudo[63053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:41 np0005548788.localdomain python3[63055]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:41 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:42 np0005548788.localdomain systemd-rc-local-generator[63082]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:42 np0005548788.localdomain systemd-sysv-generator[63087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:42 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:42 np0005548788.localdomain systemd[1]: Starting nova_virtnodedevd container...
Dec 06 08:25:42 np0005548788.localdomain tripleo-start-podman-container[63095]: Creating additional drop-in dependency for "nova_virtnodedevd" (46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0)
Dec 06 08:25:42 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:42 np0005548788.localdomain systemd-sysv-generator[63155]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:42 np0005548788.localdomain systemd-rc-local-generator[63151]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:42 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:42 np0005548788.localdomain systemd[1]: Started nova_virtnodedevd container.
Dec 06 08:25:42 np0005548788.localdomain sudo[63053]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:43 np0005548788.localdomain sudo[63177]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvyblljjzknedenbfsvziwrsoosyznhl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:43 np0005548788.localdomain sudo[63177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:43 np0005548788.localdomain python3[63179]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:43 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:43 np0005548788.localdomain systemd-rc-local-generator[63202]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:43 np0005548788.localdomain systemd-sysv-generator[63210]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:43 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:43 np0005548788.localdomain systemd[1]: Starting nova_virtproxyd container...
Dec 06 08:25:43 np0005548788.localdomain tripleo-start-podman-container[63219]: Creating additional drop-in dependency for "nova_virtproxyd" (eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9)
Dec 06 08:25:43 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:44 np0005548788.localdomain systemd-rc-local-generator[63280]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:44 np0005548788.localdomain systemd-sysv-generator[63283]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:44 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:44 np0005548788.localdomain systemd[1]: Started nova_virtproxyd container.
Dec 06 08:25:44 np0005548788.localdomain sudo[63177]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:44 np0005548788.localdomain sudo[63302]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elqsydxpncvnhvlfztwwfnmcfswnxmdf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:44 np0005548788.localdomain sudo[63302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:44 np0005548788.localdomain python3[63304]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:44 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:44 np0005548788.localdomain systemd-sysv-generator[63337]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:44 np0005548788.localdomain systemd-rc-local-generator[63333]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:45 np0005548788.localdomain systemd[1]: Starting nova_virtqemud container...
Dec 06 08:25:45 np0005548788.localdomain tripleo-start-podman-container[63344]: Creating additional drop-in dependency for "nova_virtqemud" (77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732)
Dec 06 08:25:45 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:45 np0005548788.localdomain systemd-rc-local-generator[63405]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:45 np0005548788.localdomain systemd-sysv-generator[63408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:45 np0005548788.localdomain systemd[1]: Started nova_virtqemud container.
Dec 06 08:25:45 np0005548788.localdomain sudo[63302]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:45 np0005548788.localdomain sudo[63427]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-molkrflzunqunobjqtchsiuqcakgjmkw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:46 np0005548788.localdomain sudo[63427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:46 np0005548788.localdomain python3[63429]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:46 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:46 np0005548788.localdomain systemd-rc-local-generator[63459]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:46 np0005548788.localdomain systemd-sysv-generator[63463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:46 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:46 np0005548788.localdomain systemd[1]: Starting nova_virtsecretd container...
Dec 06 08:25:46 np0005548788.localdomain tripleo-start-podman-container[63470]: Creating additional drop-in dependency for "nova_virtsecretd" (e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199)
Dec 06 08:25:46 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:46 np0005548788.localdomain systemd-sysv-generator[63526]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:46 np0005548788.localdomain systemd-rc-local-generator[63522]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:46 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:47 np0005548788.localdomain systemd[1]: Started nova_virtsecretd container.
Dec 06 08:25:47 np0005548788.localdomain sudo[63427]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:47 np0005548788.localdomain sudo[63550]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnwfbohjxtudcbmdfcnwqcjaghuribuy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:47 np0005548788.localdomain sudo[63550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:47 np0005548788.localdomain python3[63552]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:47 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:47 np0005548788.localdomain systemd-rc-local-generator[63579]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:47 np0005548788.localdomain systemd-sysv-generator[63582]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:47 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:47 np0005548788.localdomain systemd[1]: Starting nova_virtstoraged container...
Dec 06 08:25:48 np0005548788.localdomain tripleo-start-podman-container[63592]: Creating additional drop-in dependency for "nova_virtstoraged" (e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1)
Dec 06 08:25:48 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:48 np0005548788.localdomain systemd-rc-local-generator[63651]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:48 np0005548788.localdomain systemd-sysv-generator[63655]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:48 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:48 np0005548788.localdomain systemd[1]: Started nova_virtstoraged container.
Dec 06 08:25:48 np0005548788.localdomain sudo[63550]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:49 np0005548788.localdomain sudo[63674]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orkkebkzidfvqlkehqlbpolhqmkajuus ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:49 np0005548788.localdomain sudo[63674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:49 np0005548788.localdomain python3[63676]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:49 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:25:49 np0005548788.localdomain systemd-rc-local-generator[63705]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:49 np0005548788.localdomain systemd-sysv-generator[63709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:49 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:49 np0005548788.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:49 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:49 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:49 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:49 np0005548788.localdomain podman[63716]: 2025-12-06 08:25:49.929734185 +0000 UTC m=+0.133450972 container init 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., container_name=rsyslog, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:25:49 np0005548788.localdomain podman[63716]: 2025-12-06 08:25:49.941038975 +0000 UTC m=+0.144755762 container start 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, container_name=rsyslog, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Dec 06 08:25:49 np0005548788.localdomain podman[63716]: rsyslog
Dec 06 08:25:49 np0005548788.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:49 np0005548788.localdomain sudo[63735]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:49 np0005548788.localdomain sudo[63735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:49 np0005548788.localdomain sudo[63674]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:50 np0005548788.localdomain sudo[63735]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: libpod-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c.scope: Deactivated successfully.
Dec 06 08:25:50 np0005548788.localdomain podman[63748]: 2025-12-06 08:25:50.097088225 +0000 UTC m=+0.042698632 container died 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 08:25:50 np0005548788.localdomain podman[63748]: 2025-12-06 08:25:50.12243474 +0000 UTC m=+0.068045107 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, version=17.1.12, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:50 np0005548788.localdomain podman[63765]: 2025-12-06 08:25:50.215624675 +0000 UTC m=+0.060002349 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, release=1761123044, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:25:50 np0005548788.localdomain podman[63765]: rsyslog
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:50 np0005548788.localdomain sudo[63789]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdkstrxszptwulyrdeiwahuefbwwslqb ; /usr/bin/python3
Dec 06 08:25:50 np0005548788.localdomain sudo[63789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:50 np0005548788.localdomain python3[63791]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:50 np0005548788.localdomain sudo[63789]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:50 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:50 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:50 np0005548788.localdomain podman[63792]: 2025-12-06 08:25:50.533135654 +0000 UTC m=+0.102145213 container init 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Dec 06 08:25:50 np0005548788.localdomain podman[63792]: 2025-12-06 08:25:50.543711601 +0000 UTC m=+0.112721160 container start 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, container_name=rsyslog, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:25:50 np0005548788.localdomain podman[63792]: rsyslog
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:50 np0005548788.localdomain sudo[63812]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:50 np0005548788.localdomain sudo[63812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:50 np0005548788.localdomain sudo[63812]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: libpod-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c.scope: Deactivated successfully.
Dec 06 08:25:50 np0005548788.localdomain podman[63816]: 2025-12-06 08:25:50.718541503 +0000 UTC m=+0.055454858 container died 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=rsyslog, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:25:50 np0005548788.localdomain podman[63816]: 2025-12-06 08:25:50.745277131 +0000 UTC m=+0.082190456 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, io.openshift.expose-services=)
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:50 np0005548788.localdomain podman[63829]: 2025-12-06 08:25:50.833739359 +0000 UTC m=+0.062595058 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, io.buildah.version=1.41.4, container_name=rsyslog, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git)
Dec 06 08:25:50 np0005548788.localdomain podman[63829]: rsyslog
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91-merged.mount: Deactivated successfully.
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c-userdata-shm.mount: Deactivated successfully.
Dec 06 08:25:50 np0005548788.localdomain sudo[63884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyquownnbjjfzskhwstizqldkyxwsnvk ; /usr/bin/python3
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Dec 06 08:25:50 np0005548788.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:51 np0005548788.localdomain sudo[63884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:51 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:51 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:51 np0005548788.localdomain podman[63887]: 2025-12-06 08:25:51.151275379 +0000 UTC m=+0.131254374 container init 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:25:51 np0005548788.localdomain podman[63887]: 2025-12-06 08:25:51.161582688 +0000 UTC m=+0.141561663 container start 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc.)
Dec 06 08:25:51 np0005548788.localdomain podman[63887]: rsyslog
Dec 06 08:25:51 np0005548788.localdomain sudo[63884]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:51 np0005548788.localdomain sudo[63906]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:51 np0005548788.localdomain sudo[63906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:51 np0005548788.localdomain sudo[63906]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: libpod-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c.scope: Deactivated successfully.
Dec 06 08:25:51 np0005548788.localdomain podman[63923]: 2025-12-06 08:25:51.331636612 +0000 UTC m=+0.045452158 container died 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-rsyslog, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, container_name=rsyslog, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Dec 06 08:25:51 np0005548788.localdomain podman[63923]: 2025-12-06 08:25:51.357595456 +0000 UTC m=+0.071410962 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, container_name=rsyslog, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container)
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:51 np0005548788.localdomain sudo[63970]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxwfomnsqzactoxsrnwzceluavkiarbh ; /usr/bin/python3
Dec 06 08:25:51 np0005548788.localdomain sudo[63970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:51 np0005548788.localdomain podman[63960]: 2025-12-06 08:25:51.451553205 +0000 UTC m=+0.063713284 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=rsyslog, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12)
Dec 06 08:25:51 np0005548788.localdomain podman[63960]: rsyslog
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:51 np0005548788.localdomain sudo[63970]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:51 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:51 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:51 np0005548788.localdomain podman[63995]: 2025-12-06 08:25:51.804836991 +0000 UTC m=+0.125544178 container init 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team)
Dec 06 08:25:51 np0005548788.localdomain podman[63995]: 2025-12-06 08:25:51.814736537 +0000 UTC m=+0.135443704 container start 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, name=rhosp17/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, container_name=rsyslog, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044)
Dec 06 08:25:51 np0005548788.localdomain podman[63995]: rsyslog
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:51 np0005548788.localdomain sudo[64027]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:51 np0005548788.localdomain sudo[64027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:51 np0005548788.localdomain sudo[64028]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnptvskothfunjjhotrakrphqloajmlf ; /usr/bin/python3
Dec 06 08:25:51 np0005548788.localdomain sudo[64028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:51 np0005548788.localdomain sudo[64027]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548788.localdomain systemd[1]: libpod-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c.scope: Deactivated successfully.
Dec 06 08:25:51 np0005548788.localdomain podman[64033]: 2025-12-06 08:25:51.98571449 +0000 UTC m=+0.079279465 container died 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, release=1761123044, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tmp-crun.bejcjL.mount: Deactivated successfully.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c-userdata-shm.mount: Deactivated successfully.
Dec 06 08:25:52 np0005548788.localdomain podman[64033]: 2025-12-06 08:25:52.022570961 +0000 UTC m=+0.116135896 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step3)
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:52 np0005548788.localdomain python3[64031]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005548788 step=3 update_config_hash_only=False
Dec 06 08:25:52 np0005548788.localdomain sudo[64028]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:52 np0005548788.localdomain podman[64045]: 2025-12-06 08:25:52.103826627 +0000 UTC m=+0.059081050 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, tcib_managed=true, container_name=rsyslog, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:25:52 np0005548788.localdomain podman[64045]: rsyslog
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:52 np0005548788.localdomain sudo[64084]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgyvhpmxmkyizhtmxaxhlvzllsintahs ; /usr/bin/python3
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:52 np0005548788.localdomain sudo[64084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:52 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:52 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:52 np0005548788.localdomain podman[64059]: 2025-12-06 08:25:52.532248959 +0000 UTC m=+0.111532834 container init 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:25:52 np0005548788.localdomain podman[64059]: 2025-12-06 08:25:52.53811054 +0000 UTC m=+0.117394435 container start 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, tcib_managed=true, 
release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, config_id=tripleo_step3, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, distribution-scope=public)
Dec 06 08:25:52 np0005548788.localdomain podman[64059]: rsyslog
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:52 np0005548788.localdomain sudo[64093]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:52 np0005548788.localdomain sudo[64093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:52 np0005548788.localdomain sudo[64093]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: libpod-9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c.scope: Deactivated successfully.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:25:52 np0005548788.localdomain podman[64096]: 2025-12-06 08:25:52.671123847 +0000 UTC m=+0.042817236 container died 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 06 08:25:52 np0005548788.localdomain podman[64096]: 2025-12-06 08:25:52.689731734 +0000 UTC m=+0.061425113 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Dec 06 08:25:52 np0005548788.localdomain python3[64090]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:52 np0005548788.localdomain sudo[64084]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:52 np0005548788.localdomain podman[64107]: 2025-12-06 08:25:52.736186351 +0000 UTC m=+0.074375493 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, release=1761123044, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:25:52 np0005548788.localdomain podman[64118]: 2025-12-06 08:25:52.766690065 +0000 UTC m=+0.053818046 container cleanup 9b0c16f1f97ae99de849af6d851bed8575a14b2f0f4f28cb2b6de5316dffcf1c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd3b0a004e533211bab6cc44495102b19'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:25:52 np0005548788.localdomain podman[64118]: rsyslog
Dec 06 08:25:52 np0005548788.localdomain podman[64107]: 2025-12-06 08:25:52.770571186 +0000 UTC m=+0.108760328 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91-merged.mount: Deactivated successfully.
Dec 06 08:25:52 np0005548788.localdomain sudo[64153]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muaotkrvpaqhkcbakaoeetoegchhgfuw ; /usr/bin/python3
Dec 06 08:25:52 np0005548788.localdomain sudo[64153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:52 np0005548788.localdomain systemd[1]: Failed to start rsyslog container.
Dec 06 08:25:53 np0005548788.localdomain python3[64155]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:25:53 np0005548788.localdomain sudo[64153]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:25:54 np0005548788.localdomain podman[64156]: 2025-12-06 08:25:54.248579799 +0000 UTC m=+0.080432721 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, container_name=iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 08:25:54 np0005548788.localdomain podman[64156]: 2025-12-06 08:25:54.282469298 +0000 UTC m=+0.114322220 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:25:54 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:25:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:25:59 np0005548788.localdomain podman[64174]: 2025-12-06 08:25:59.234352408 +0000 UTC m=+0.067093919 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:25:59 np0005548788.localdomain podman[64174]: 2025-12-06 08:25:59.43761038 +0000 UTC m=+0.270351921 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public)
Dec 06 08:25:59 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:26:03 np0005548788.localdomain sshd[64203]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:26:04 np0005548788.localdomain sshd[64203]: Received disconnect from 152.32.172.117 port 41046:11: Bye Bye [preauth]
Dec 06 08:26:04 np0005548788.localdomain sshd[64203]: Disconnected from authenticating user root 152.32.172.117 port 41046 [preauth]
Dec 06 08:26:05 np0005548788.localdomain sudo[64205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:26:05 np0005548788.localdomain sudo[64205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:26:05 np0005548788.localdomain sudo[64205]: pam_unix(sudo:session): session closed for user root
Dec 06 08:26:05 np0005548788.localdomain sudo[64220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:26:05 np0005548788.localdomain sudo[64220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:26:06 np0005548788.localdomain sudo[64220]: pam_unix(sudo:session): session closed for user root
Dec 06 08:26:07 np0005548788.localdomain sudo[64266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:26:07 np0005548788.localdomain sudo[64266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:26:07 np0005548788.localdomain sudo[64266]: pam_unix(sudo:session): session closed for user root
Dec 06 08:26:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:26:23 np0005548788.localdomain podman[64281]: 2025-12-06 08:26:23.233881358 +0000 UTC m=+0.066089567 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:26:23 np0005548788.localdomain podman[64281]: 2025-12-06 08:26:23.268331474 +0000 UTC m=+0.100539713 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1)
Dec 06 08:26:23 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:26:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:26:25 np0005548788.localdomain podman[64300]: 2025-12-06 08:26:25.25553668 +0000 UTC m=+0.083639580 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1)
Dec 06 08:26:25 np0005548788.localdomain podman[64300]: 2025-12-06 08:26:25.268661076 +0000 UTC m=+0.096763986 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, container_name=iscsid, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 08:26:25 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:26:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:26:30 np0005548788.localdomain systemd[1]: tmp-crun.2ESoIV.mount: Deactivated successfully.
Dec 06 08:26:30 np0005548788.localdomain podman[64319]: 2025-12-06 08:26:30.258938965 +0000 UTC m=+0.092418382 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:26:30 np0005548788.localdomain podman[64319]: 2025-12-06 08:26:30.494725322 +0000 UTC m=+0.328204679 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, container_name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:26:30 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:26:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:26:54 np0005548788.localdomain systemd[1]: tmp-crun.eLFvwF.mount: Deactivated successfully.
Dec 06 08:26:54 np0005548788.localdomain podman[64349]: 2025-12-06 08:26:54.267465648 +0000 UTC m=+0.099737017 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:26:54 np0005548788.localdomain podman[64349]: 2025-12-06 08:26:54.274922618 +0000 UTC m=+0.107193887 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=collectd, release=1761123044)
Dec 06 08:26:54 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:26:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:26:56 np0005548788.localdomain podman[64370]: 2025-12-06 08:26:56.244354557 +0000 UTC m=+0.070921432 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 08:26:56 np0005548788.localdomain podman[64370]: 2025-12-06 08:26:56.253106946 +0000 UTC m=+0.079673901 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:26:56 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:27:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:27:01 np0005548788.localdomain podman[64389]: 2025-12-06 08:27:01.244446114 +0000 UTC m=+0.073820900 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Dec 06 08:27:01 np0005548788.localdomain podman[64389]: 2025-12-06 08:27:01.458540636 +0000 UTC m=+0.287915312 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64, distribution-scope=public, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:27:01 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:27:07 np0005548788.localdomain sudo[64418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:27:07 np0005548788.localdomain sudo[64418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:27:07 np0005548788.localdomain sudo[64418]: pam_unix(sudo:session): session closed for user root
Dec 06 08:27:07 np0005548788.localdomain sudo[64433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:27:07 np0005548788.localdomain sudo[64433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:27:07 np0005548788.localdomain sudo[64433]: pam_unix(sudo:session): session closed for user root
Dec 06 08:27:08 np0005548788.localdomain sudo[64480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:27:08 np0005548788.localdomain sudo[64480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:27:08 np0005548788.localdomain sudo[64480]: pam_unix(sudo:session): session closed for user root
Dec 06 08:27:16 np0005548788.localdomain sshd[64495]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:27:17 np0005548788.localdomain sshd[64495]: Received disconnect from 152.32.172.117 port 54966:11: Bye Bye [preauth]
Dec 06 08:27:17 np0005548788.localdomain sshd[64495]: Disconnected from authenticating user root 152.32.172.117 port 54966 [preauth]
Dec 06 08:27:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:27:25 np0005548788.localdomain podman[64497]: 2025-12-06 08:27:25.25953634 +0000 UTC m=+0.089685218 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step3, version=17.1.12)
Dec 06 08:27:25 np0005548788.localdomain podman[64497]: 2025-12-06 08:27:25.300700986 +0000 UTC m=+0.130849854 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible)
Dec 06 08:27:25 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:27:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:27:27 np0005548788.localdomain podman[64517]: 2025-12-06 08:27:27.246862199 +0000 UTC m=+0.078142743 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 06 08:27:27 np0005548788.localdomain podman[64517]: 2025-12-06 08:27:27.255748563 +0000 UTC m=+0.087029127 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, container_name=iscsid, 
url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:27:27 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:27:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:27:32 np0005548788.localdomain systemd[1]: tmp-crun.rnIE8M.mount: Deactivated successfully.
Dec 06 08:27:32 np0005548788.localdomain podman[64535]: 2025-12-06 08:27:32.253265462 +0000 UTC m=+0.084885301 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step1, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Dec 06 08:27:32 np0005548788.localdomain podman[64535]: 2025-12-06 08:27:32.428953862 +0000 UTC m=+0.260574001 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.)
Dec 06 08:27:32 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:27:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:27:56 np0005548788.localdomain systemd[1]: tmp-crun.WNTa0o.mount: Deactivated successfully.
Dec 06 08:27:56 np0005548788.localdomain podman[64564]: 2025-12-06 08:27:56.265280531 +0000 UTC m=+0.093580887 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Dec 06 08:27:56 np0005548788.localdomain podman[64564]: 2025-12-06 08:27:56.279593831 +0000 UTC m=+0.107894227 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=collectd, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-type=git, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:27:56 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:27:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:27:58 np0005548788.localdomain podman[64585]: 2025-12-06 08:27:58.236333091 +0000 UTC m=+0.069470867 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12)
Dec 06 08:27:58 np0005548788.localdomain podman[64585]: 2025-12-06 08:27:58.274633798 +0000 UTC m=+0.107771534 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.12, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:27:58 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:28:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:28:03 np0005548788.localdomain systemd[1]: tmp-crun.lixlxX.mount: Deactivated successfully.
Dec 06 08:28:03 np0005548788.localdomain podman[64604]: 2025-12-06 08:28:03.257629741 +0000 UTC m=+0.089587375 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:28:03 np0005548788.localdomain podman[64604]: 2025-12-06 08:28:03.471859916 +0000 UTC m=+0.303817570 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64)
Dec 06 08:28:03 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:28:08 np0005548788.localdomain sudo[64633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:28:08 np0005548788.localdomain sudo[64633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:28:08 np0005548788.localdomain sudo[64633]: pam_unix(sudo:session): session closed for user root
Dec 06 08:28:08 np0005548788.localdomain sudo[64648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:28:08 np0005548788.localdomain sudo[64648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:28:09 np0005548788.localdomain sudo[64648]: pam_unix(sudo:session): session closed for user root
Dec 06 08:28:10 np0005548788.localdomain sudo[64695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:28:10 np0005548788.localdomain sudo[64695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:28:10 np0005548788.localdomain sudo[64695]: pam_unix(sudo:session): session closed for user root
Dec 06 08:28:26 np0005548788.localdomain sshd[64710]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:28:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:28:27 np0005548788.localdomain podman[64712]: 2025-12-06 08:28:27.250556736 +0000 UTC m=+0.080691343 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step3, container_name=collectd, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Dec 06 08:28:27 np0005548788.localdomain podman[64712]: 2025-12-06 08:28:27.265656169 +0000 UTC m=+0.095790776 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, 
distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:28:27 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:28:28 np0005548788.localdomain sshd[64710]: Received disconnect from 152.32.172.117 port 45140:11: Bye Bye [preauth]
Dec 06 08:28:28 np0005548788.localdomain sshd[64710]: Disconnected from authenticating user root 152.32.172.117 port 45140 [preauth]
Dec 06 08:28:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:28:28 np0005548788.localdomain podman[64732]: 2025-12-06 08:28:28.521522954 +0000 UTC m=+0.075702698 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z)
Dec 06 08:28:28 np0005548788.localdomain podman[64732]: 2025-12-06 08:28:28.558564863 +0000 UTC m=+0.112744577 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3)
Dec 06 08:28:28 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:28:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:28:34 np0005548788.localdomain podman[64751]: 2025-12-06 08:28:34.255649437 +0000 UTC m=+0.087469330 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, url=https://www.redhat.com)
Dec 06 08:28:34 np0005548788.localdomain podman[64751]: 2025-12-06 08:28:34.463774554 +0000 UTC m=+0.295594487 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:28:34 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:28:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:28:58 np0005548788.localdomain systemd[1]: tmp-crun.1PJ4NR.mount: Deactivated successfully.
Dec 06 08:28:58 np0005548788.localdomain podman[64780]: 2025-12-06 08:28:58.251592928 +0000 UTC m=+0.079273794 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-collectd)
Dec 06 08:28:58 np0005548788.localdomain podman[64780]: 2025-12-06 08:28:58.285645467 +0000 UTC m=+0.113326313 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, vcs-type=git, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:28:58 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:28:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:28:59 np0005548788.localdomain podman[64799]: 2025-12-06 08:28:59.255276933 +0000 UTC m=+0.076406726 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, release=1761123044, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=iscsid)
Dec 06 08:28:59 np0005548788.localdomain podman[64799]: 2025-12-06 08:28:59.287387833 +0000 UTC m=+0.108517646 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 06 08:28:59 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:29:04 np0005548788.localdomain sshd[64818]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:29:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:29:05 np0005548788.localdomain podman[64820]: 2025-12-06 08:29:05.257787509 +0000 UTC m=+0.086196887 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, 
container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:29:05 np0005548788.localdomain podman[64820]: 2025-12-06 08:29:05.479046559 +0000 UTC m=+0.307455897 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:29:05 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:29:10 np0005548788.localdomain sudo[64850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:29:10 np0005548788.localdomain sudo[64850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:10 np0005548788.localdomain sudo[64850]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:10 np0005548788.localdomain sudo[64865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:29:10 np0005548788.localdomain sudo[64865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:11 np0005548788.localdomain sudo[64865]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:11 np0005548788.localdomain sudo[64912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:29:11 np0005548788.localdomain sudo[64912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:11 np0005548788.localdomain sudo[64912]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:11 np0005548788.localdomain sudo[64927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 08:29:11 np0005548788.localdomain sudo[64927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:12 np0005548788.localdomain sudo[64927]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:14 np0005548788.localdomain sshd[64818]: Connection closed by 45.78.222.109 port 33654 [preauth]
Dec 06 08:29:17 np0005548788.localdomain sudo[64960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:29:17 np0005548788.localdomain sudo[64960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:17 np0005548788.localdomain sudo[64960]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:29:29 np0005548788.localdomain systemd[1]: tmp-crun.v7HcqX.mount: Deactivated successfully.
Dec 06 08:29:29 np0005548788.localdomain podman[64976]: 2025-12-06 08:29:29.276403215 +0000 UTC m=+0.098438405 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:29:29 np0005548788.localdomain podman[64976]: 2025-12-06 08:29:29.289721845 +0000 UTC m=+0.111756995 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:29:29 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:29:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:29:29 np0005548788.localdomain podman[64996]: 2025-12-06 08:29:29.409444525 +0000 UTC m=+0.075940802 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, version=17.1.12, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Dec 06 08:29:29 np0005548788.localdomain podman[64996]: 2025-12-06 08:29:29.447672284 +0000 UTC m=+0.114168591 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-type=git, release=1761123044, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:29:29 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:29:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:29:36 np0005548788.localdomain podman[65015]: 2025-12-06 08:29:36.245958076 +0000 UTC m=+0.078316555 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1)
Dec 06 08:29:36 np0005548788.localdomain podman[65015]: 2025-12-06 08:29:36.422094114 +0000 UTC m=+0.254452593 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:29:36 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:29:38 np0005548788.localdomain sshd[65043]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:29:39 np0005548788.localdomain sshd[65043]: Received disconnect from 152.32.172.117 port 34076:11: Bye Bye [preauth]
Dec 06 08:29:39 np0005548788.localdomain sshd[65043]: Disconnected from authenticating user root 152.32.172.117 port 34076 [preauth]
Dec 06 08:29:44 np0005548788.localdomain sudo[65090]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvqbluaufnyvjmynhdpjsmolhonybphw ; /usr/bin/python3
Dec 06 08:29:44 np0005548788.localdomain sudo[65090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:44 np0005548788.localdomain python3[65092]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:44 np0005548788.localdomain sudo[65090]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:45 np0005548788.localdomain sudo[65135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqzvxioofeiqligktiolxemdjfilwfwp ; /usr/bin/python3
Dec 06 08:29:45 np0005548788.localdomain sudo[65135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:45 np0005548788.localdomain python3[65137]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009784.6442432-107256-49805301390643/source _original_basename=tmp3pwm9a3b follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:45 np0005548788.localdomain sudo[65135]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:46 np0005548788.localdomain sudo[65197]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlshcfwwvsjdmyuomdxnjycwhouukhtc ; /usr/bin/python3
Dec 06 08:29:46 np0005548788.localdomain sudo[65197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:46 np0005548788.localdomain python3[65199]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:46 np0005548788.localdomain sudo[65197]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:46 np0005548788.localdomain sudo[65240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkmywwmaflhrmotrfjrtztcsudtofiqs ; /usr/bin/python3
Dec 06 08:29:46 np0005548788.localdomain sudo[65240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:46 np0005548788.localdomain python3[65242]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009786.2068074-107369-216374979004412/source _original_basename=tmpqy4h2a5x follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:46 np0005548788.localdomain sudo[65240]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:47 np0005548788.localdomain sudo[65302]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxgcmpprmyhnnvzkjxekiqrhcgyvfehi ; /usr/bin/python3
Dec 06 08:29:47 np0005548788.localdomain sudo[65302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:47 np0005548788.localdomain python3[65304]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:47 np0005548788.localdomain sudo[65302]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:47 np0005548788.localdomain sudo[65345]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phrxvlflphiboxocouzmhjpayjpmbqfu ; /usr/bin/python3
Dec 06 08:29:47 np0005548788.localdomain sudo[65345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:47 np0005548788.localdomain python3[65347]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009787.1262112-107419-120185185134392/source _original_basename=tmpfu6cm3rd follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:47 np0005548788.localdomain sudo[65345]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:48 np0005548788.localdomain sudo[65407]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmoymaqcrkdxdqyetznnjainlwjwnboa ; /usr/bin/python3
Dec 06 08:29:48 np0005548788.localdomain sudo[65407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:48 np0005548788.localdomain python3[65409]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:48 np0005548788.localdomain sudo[65407]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:48 np0005548788.localdomain sudo[65450]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osbeihyvwrbttnolfxbrbnfnlyfgbcuz ; /usr/bin/python3
Dec 06 08:29:48 np0005548788.localdomain sudo[65450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:48 np0005548788.localdomain python3[65452]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009788.1152794-107480-98701101537042/source _original_basename=tmp3_etqf0i follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:48 np0005548788.localdomain sudo[65450]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:49 np0005548788.localdomain sudo[65480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huaxppvbjtcrlgstexnqurtqwqwfaldw ; /usr/bin/python3
Dec 06 08:29:49 np0005548788.localdomain sudo[65480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:49 np0005548788.localdomain python3[65482]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 08:29:49 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:29:49 np0005548788.localdomain systemd-sysv-generator[65513]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:49 np0005548788.localdomain systemd-rc-local-generator[65510]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:49 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:49 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:29:49 np0005548788.localdomain systemd-rc-local-generator[65537]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:49 np0005548788.localdomain systemd-sysv-generator[65546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:49 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:49 np0005548788.localdomain sudo[65480]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:50 np0005548788.localdomain sudo[65570]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnzfdozdapqlaujdwyvqktxacuvabczx ; /usr/bin/python3
Dec 06 08:29:50 np0005548788.localdomain sudo[65570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:50 np0005548788.localdomain python3[65572]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:29:50 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:29:50 np0005548788.localdomain systemd-rc-local-generator[65595]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:50 np0005548788.localdomain systemd-sysv-generator[65599]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:50 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:50 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:29:50 np0005548788.localdomain systemd-rc-local-generator[65638]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:50 np0005548788.localdomain systemd-sysv-generator[65642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:50 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:51 np0005548788.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Dec 06 08:29:51 np0005548788.localdomain sudo[65570]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:51 np0005548788.localdomain sudo[65661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwdknqwgrfgsyqqoeenrmsrwqubjpgcj ; /usr/bin/python3
Dec 06 08:29:51 np0005548788.localdomain sudo[65661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:51 np0005548788.localdomain python3[65663]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:29:51 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:29:51 np0005548788.localdomain systemd-rc-local-generator[65688]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:51 np0005548788.localdomain systemd-sysv-generator[65691]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:51 np0005548788.localdomain sudo[65661]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:52 np0005548788.localdomain sudo[65745]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnuazzsmcrnfipoddazmymiaufehjorr ; /usr/bin/python3
Dec 06 08:29:52 np0005548788.localdomain sudo[65745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:52 np0005548788.localdomain python3[65747]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:52 np0005548788.localdomain sudo[65745]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:52 np0005548788.localdomain sudo[65788]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iirzaepriwxourpikkdcdwskxoeigahw ; /usr/bin/python3
Dec 06 08:29:52 np0005548788.localdomain sudo[65788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:52 np0005548788.localdomain python3[65790]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009791.9030154-107615-203212173859089/source _original_basename=tmpgilfttv2 follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:52 np0005548788.localdomain sudo[65788]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:52 np0005548788.localdomain sudo[65818]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztxlgdbitwfnvatalqikvevzelaoxwmc ; /usr/bin/python3
Dec 06 08:29:52 np0005548788.localdomain sudo[65818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:53 np0005548788.localdomain python3[65820]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:29:53 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:29:53 np0005548788.localdomain systemd-rc-local-generator[65845]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:53 np0005548788.localdomain systemd-sysv-generator[65850]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:53 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:53 np0005548788.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Dec 06 08:29:53 np0005548788.localdomain sudo[65818]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:53 np0005548788.localdomain sudo[65873]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsgtmutragznumtsssmzfyetdxycslma ; /usr/bin/python3
Dec 06 08:29:53 np0005548788.localdomain sudo[65873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:53 np0005548788.localdomain python3[65875]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:29:53 np0005548788.localdomain sudo[65873]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:54 np0005548788.localdomain sudo[65923]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlkhqnsjnmirloahbebxobgbrsxezzan ; /usr/bin/python3
Dec 06 08:29:54 np0005548788.localdomain sudo[65923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:54 np0005548788.localdomain sudo[65923]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:54 np0005548788.localdomain sudo[65941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pymqmxtjbgjktnmnwmswoawdbpjlonro ; /usr/bin/python3
Dec 06 08:29:54 np0005548788.localdomain sudo[65941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:54 np0005548788.localdomain sudo[65941]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:55 np0005548788.localdomain sudo[66045]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsxccquexqszonzifczbauntjlrtsvet ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009794.876398-107705-245505995814625/async_wrapper.py 993301104345 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009794.876398-107705-245505995814625/AnsiballZ_command.py _
Dec 06 08:29:55 np0005548788.localdomain sudo[66045]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:29:55 np0005548788.localdomain ansible-async_wrapper.py[66047]: Invoked with 993301104345 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009794.876398-107705-245505995814625/AnsiballZ_command.py _
Dec 06 08:29:55 np0005548788.localdomain ansible-async_wrapper.py[66050]: Starting module and watcher
Dec 06 08:29:55 np0005548788.localdomain ansible-async_wrapper.py[66050]: Start watching 66051 (3600)
Dec 06 08:29:55 np0005548788.localdomain ansible-async_wrapper.py[66051]: Start module (66051)
Dec 06 08:29:55 np0005548788.localdomain ansible-async_wrapper.py[66047]: Return async_wrapper task started.
Dec 06 08:29:55 np0005548788.localdomain sudo[66045]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:55 np0005548788.localdomain sudo[66066]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgsppsrrghmhettyiufgyebslhzwvwlc ; /usr/bin/python3
Dec 06 08:29:55 np0005548788.localdomain sudo[66066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:55 np0005548788.localdomain python3[66068]: ansible-ansible.legacy.async_status Invoked with jid=993301104345.66047 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:29:55 np0005548788.localdomain sudo[66066]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (file & line not available)
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (file & line not available)
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:29:59 np0005548788.localdomain puppet-user[66071]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.19 seconds
Dec 06 08:30:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:30:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:30:00 np0005548788.localdomain podman[66189]: 2025-12-06 08:30:00.280073141 +0000 UTC m=+0.100035375 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:30:00 np0005548788.localdomain podman[66190]: 2025-12-06 08:30:00.320902589 +0000 UTC m=+0.141130781 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64)
Dec 06 08:30:00 np0005548788.localdomain podman[66190]: 2025-12-06 08:30:00.360873351 +0000 UTC m=+0.181101573 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, version=17.1.12, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:30:00 np0005548788.localdomain podman[66189]: 2025-12-06 08:30:00.374498671 +0000 UTC m=+0.194460845 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:30:00 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:30:00 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:30:00 np0005548788.localdomain ansible-async_wrapper.py[66050]: 66051 still running (3600)
Dec 06 08:30:05 np0005548788.localdomain ansible-async_wrapper.py[66050]: 66051 still running (3595)
Dec 06 08:30:05 np0005548788.localdomain sudo[66305]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkwbxgtscumhfvdfoedhckkcmswtsstp ; /usr/bin/python3
Dec 06 08:30:05 np0005548788.localdomain sudo[66305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:06 np0005548788.localdomain python3[66307]: ansible-ansible.legacy.async_status Invoked with jid=993301104345.66047 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:30:06 np0005548788.localdomain sudo[66305]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:30:07 np0005548788.localdomain podman[66311]: 2025-12-06 08:30:07.004292831 +0000 UTC m=+0.080763131 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12)
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: tmp-crun.HRnJBU.mount: Deactivated successfully.
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:07 np0005548788.localdomain podman[66311]: 2025-12-06 08:30:07.201580242 +0000 UTC m=+0.278050532 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 06 08:30:07 np0005548788.localdomain systemd-sysv-generator[66422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:07 np0005548788.localdomain systemd-rc-local-generator[66419]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:30:07 np0005548788.localdomain systemd[1]: run-r198809d0c6f246bfb3f87b7048443601.service: Deactivated successfully.
Dec 06 08:30:08 np0005548788.localdomain puppet-user[66071]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Dec 06 08:30:08 np0005548788.localdomain puppet-user[66071]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}582311ee213f27fc2754baaecf8b258a6dcc8c1f6a63e0ab420c7e433b146ab7'
Dec 06 08:30:08 np0005548788.localdomain puppet-user[66071]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Dec 06 08:30:08 np0005548788.localdomain puppet-user[66071]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Dec 06 08:30:08 np0005548788.localdomain puppet-user[66071]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Dec 06 08:30:08 np0005548788.localdomain puppet-user[66071]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Dec 06 08:30:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:30:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4516 writes, 20K keys, 4516 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4516 writes, 510 syncs, 8.85 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 134 writes, 387 keys, 134 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s
                                                          Interval WAL: 134 writes, 67 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:30:10 np0005548788.localdomain ansible-async_wrapper.py[66050]: 66051 still running (3590)
Dec 06 08:30:13 np0005548788.localdomain puppet-user[66071]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Dec 06 08:30:13 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:30:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.2 total, 600.0 interval
                                                          Cumulative writes: 5111 writes, 22K keys, 5111 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5111 writes, 587 syncs, 8.71 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 132 writes, 320 keys, 132 commit groups, 1.0 writes per commit group, ingest: 0.28 MB, 0.00 MB/s
                                                          Interval WAL: 132 writes, 66 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:30:13 np0005548788.localdomain systemd-sysv-generator[67466]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:13 np0005548788.localdomain systemd-rc-local-generator[67463]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:14 np0005548788.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Dec 06 08:30:14 np0005548788.localdomain snmpd[67478]: Can't find directory of RPM packages
Dec 06 08:30:14 np0005548788.localdomain snmpd[67478]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Dec 06 08:30:14 np0005548788.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Dec 06 08:30:14 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:14 np0005548788.localdomain systemd-rc-local-generator[67506]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:14 np0005548788.localdomain systemd-sysv-generator[67510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:14 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:14 np0005548788.localdomain systemd-sysv-generator[67542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:14 np0005548788.localdomain systemd-rc-local-generator[67538]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]: Notice: Applied catalog in 15.53 seconds
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]: Application:
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:    Initial environment: production
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:    Converged environment: production
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:          Run mode: user
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]: Changes:
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:             Total: 8
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]: Events:
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:           Success: 8
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:             Total: 8
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]: Resources:
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:         Restarted: 1
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:           Changed: 8
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:       Out of sync: 8
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:             Total: 19
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]: Time:
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:        Filebucket: 0.00
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:          Schedule: 0.00
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:            Augeas: 0.01
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:              File: 0.06
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:    Config retrieval: 0.25
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:           Service: 1.16
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:    Transaction evaluation: 15.52
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:    Catalog application: 15.53
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:          Last run: 1765009814
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:              Exec: 5.06
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:           Package: 9.06
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:             Total: 15.53
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]: Version:
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:            Config: 1765009799
Dec 06 08:30:14 np0005548788.localdomain puppet-user[66071]:            Puppet: 7.10.0
Dec 06 08:30:15 np0005548788.localdomain ansible-async_wrapper.py[66051]: Module complete (66051)
Dec 06 08:30:15 np0005548788.localdomain ansible-async_wrapper.py[66050]: Done in kid B.
Dec 06 08:30:16 np0005548788.localdomain sudo[67564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enjxtljymovvewyllxxpnxiprepdhiqw ; /usr/bin/python3
Dec 06 08:30:16 np0005548788.localdomain sudo[67564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:16 np0005548788.localdomain python3[67566]: ansible-ansible.legacy.async_status Invoked with jid=993301104345.66047 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:30:16 np0005548788.localdomain sudo[67564]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:16 np0005548788.localdomain sudo[67580]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftkrmmfpqbjkynwwfxjudddjrcpgxgay ; /usr/bin/python3
Dec 06 08:30:16 np0005548788.localdomain sudo[67580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:17 np0005548788.localdomain python3[67582]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:30:17 np0005548788.localdomain sudo[67580]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548788.localdomain sudo[67596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewuogyrjwulewwriifaqitxjurebnsku ; /usr/bin/python3
Dec 06 08:30:17 np0005548788.localdomain sudo[67596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:17 np0005548788.localdomain sudo[67599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:30:17 np0005548788.localdomain python3[67598]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:17 np0005548788.localdomain sudo[67599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:17 np0005548788.localdomain sudo[67599]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548788.localdomain sudo[67596]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548788.localdomain sudo[67616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:30:17 np0005548788.localdomain sudo[67616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:17 np0005548788.localdomain sudo[67686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbhiqqsrvgdrsquamtjgzxtezfujbybt ; /usr/bin/python3
Dec 06 08:30:17 np0005548788.localdomain sudo[67686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:17 np0005548788.localdomain sudo[67616]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548788.localdomain python3[67692]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:17 np0005548788.localdomain sudo[67686]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548788.localdomain sudo[67699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:30:17 np0005548788.localdomain sudo[67699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:17 np0005548788.localdomain sudo[67699]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548788.localdomain sudo[67716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:30:18 np0005548788.localdomain sudo[67716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:18 np0005548788.localdomain sudo[67743]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmgzafwcgxryrxxskyfgpgctexwknwth ; /usr/bin/python3
Dec 06 08:30:18 np0005548788.localdomain sudo[67743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:18 np0005548788.localdomain python3[67746]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpmbw3onxh recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:30:18 np0005548788.localdomain sudo[67743]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548788.localdomain sudo[67789]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isussnqewstbiopcclupukaafjyheoyq ; /usr/bin/python3
Dec 06 08:30:18 np0005548788.localdomain sudo[67789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:18 np0005548788.localdomain python3[67791]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:18 np0005548788.localdomain sudo[67789]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548788.localdomain sudo[67716]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548788.localdomain sudo[67822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpwvckchwlrvyupcbsyxjalzkjjhjeku ; /usr/bin/python3
Dec 06 08:30:18 np0005548788.localdomain sudo[67822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:19 np0005548788.localdomain sudo[67822]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:19 np0005548788.localdomain sudo[67909]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktnifcvpnepqiqcajmjtpttumoitrcca ; /usr/bin/python3
Dec 06 08:30:19 np0005548788.localdomain sudo[67909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:19 np0005548788.localdomain python3[67911]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:30:19 np0005548788.localdomain sudo[67909]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:20 np0005548788.localdomain sudo[67928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqlvhqjevwseelcqheamuoprkzadngqx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:20 np0005548788.localdomain sudo[67928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:20 np0005548788.localdomain python3[67930]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:20 np0005548788.localdomain sudo[67928]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:20 np0005548788.localdomain sudo[67944]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qktiuuprkffabtkgrlytfcazvuysfbqn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:20 np0005548788.localdomain sudo[67944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:21 np0005548788.localdomain sudo[67944]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:21 np0005548788.localdomain sudo[67960]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovioapcqnyorynrypfmxpxspicuioybx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:21 np0005548788.localdomain sudo[67960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:21 np0005548788.localdomain python3[67962]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:21 np0005548788.localdomain sudo[67960]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:21 np0005548788.localdomain sudo[67965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:30:21 np0005548788.localdomain sudo[67965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:21 np0005548788.localdomain sudo[67965]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:21 np0005548788.localdomain sudo[68025]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aguwxwtnxeajglttlfgawewnfutgjoxa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:21 np0005548788.localdomain sudo[68025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:22 np0005548788.localdomain python3[68027]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:22 np0005548788.localdomain sudo[68025]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:22 np0005548788.localdomain sudo[68043]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwmzchxzngprdyorbptqgwakxwhsfedh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:22 np0005548788.localdomain sudo[68043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:22 np0005548788.localdomain python3[68045]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:22 np0005548788.localdomain sudo[68043]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:22 np0005548788.localdomain sudo[68105]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pglytbhyebifluxpytafnykjbgfsjzci ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:22 np0005548788.localdomain sudo[68105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:22 np0005548788.localdomain python3[68107]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:22 np0005548788.localdomain sudo[68105]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:23 np0005548788.localdomain sudo[68123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pajbzhsdvwpqlkqnodpqwmxsowsmicsb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:23 np0005548788.localdomain sudo[68123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:23 np0005548788.localdomain python3[68125]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:23 np0005548788.localdomain sudo[68123]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:23 np0005548788.localdomain sudo[68185]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snylvofsmyarwrcqyogyupdfkwxzymtu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:23 np0005548788.localdomain sudo[68185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:23 np0005548788.localdomain python3[68187]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:23 np0005548788.localdomain sudo[68185]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:23 np0005548788.localdomain sudo[68203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxrvtgpcvgvtwbefpdnexxusnvtbaqsb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:23 np0005548788.localdomain sudo[68203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:24 np0005548788.localdomain python3[68205]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:24 np0005548788.localdomain sudo[68203]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:24 np0005548788.localdomain sudo[68265]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncqftxxsprxlzqohxhfhnenowaulbczl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:24 np0005548788.localdomain sudo[68265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:24 np0005548788.localdomain python3[68267]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:24 np0005548788.localdomain sudo[68265]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:24 np0005548788.localdomain sudo[68283]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfemrtcvdqbdrbfcfkruhwopecqevpnf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:24 np0005548788.localdomain sudo[68283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:24 np0005548788.localdomain python3[68285]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:24 np0005548788.localdomain sudo[68283]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:25 np0005548788.localdomain sudo[68313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljcpdatmgneickzowpatashalriusjbw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:25 np0005548788.localdomain sudo[68313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:25 np0005548788.localdomain python3[68315]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:25 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:25 np0005548788.localdomain systemd-rc-local-generator[68337]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:25 np0005548788.localdomain systemd-sysv-generator[68344]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:25 np0005548788.localdomain sudo[68313]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:26 np0005548788.localdomain sudo[68398]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wghizlkidfqxozzatfkgsezimpdiffax ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:26 np0005548788.localdomain sudo[68398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:26 np0005548788.localdomain python3[68400]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:26 np0005548788.localdomain sudo[68398]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:26 np0005548788.localdomain sudo[68416]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geliquxmkkswfoxcpvegolgumvdbzrds ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:26 np0005548788.localdomain sudo[68416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:26 np0005548788.localdomain python3[68418]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:26 np0005548788.localdomain sudo[68416]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:27 np0005548788.localdomain sudo[68478]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtqnngsrpxmgfrclvjmlofehdnuwhfkv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:27 np0005548788.localdomain sudo[68478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:27 np0005548788.localdomain python3[68480]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:27 np0005548788.localdomain sudo[68478]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:27 np0005548788.localdomain sudo[68496]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jajfkpfvybicukilwoxqnlznknqypcsc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:27 np0005548788.localdomain sudo[68496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:27 np0005548788.localdomain python3[68498]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:27 np0005548788.localdomain sudo[68496]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:27 np0005548788.localdomain sudo[68526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmzthjgobrriadfoeuabrsurbugncpof ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:27 np0005548788.localdomain sudo[68526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:27 np0005548788.localdomain python3[68528]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:27 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:28 np0005548788.localdomain systemd-rc-local-generator[68551]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:28 np0005548788.localdomain systemd-sysv-generator[68558]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:28 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:28 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:30:28 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:30:28 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:30:28 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:30:28 np0005548788.localdomain sudo[68526]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:28 np0005548788.localdomain sudo[68585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldgoufacbjvpdppiienekzmhqqazvedm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:28 np0005548788.localdomain sudo[68585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:28 np0005548788.localdomain python3[68587]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:30:28 np0005548788.localdomain sudo[68585]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:29 np0005548788.localdomain sudo[68601]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkwuagvybbxdeezpvnrxmdkqrbomjptv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:29 np0005548788.localdomain sudo[68601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:29 np0005548788.localdomain sudo[68601]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:30 np0005548788.localdomain sudo[68643]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewnlnspfizxlociohcwlrtynzroxtbky ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:30 np0005548788.localdomain sudo[68643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:30:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:30:30 np0005548788.localdomain systemd[1]: tmp-crun.Oq5vTj.mount: Deactivated successfully.
Dec 06 08:30:30 np0005548788.localdomain podman[68646]: 2025-12-06 08:30:30.926314953 +0000 UTC m=+0.099669273 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, container_name=collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3)
Dec 06 08:30:30 np0005548788.localdomain podman[68646]: 2025-12-06 08:30:30.964019595 +0000 UTC m=+0.137373925 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 06 08:30:30 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: tmp-crun.3kp2Tw.mount: Deactivated successfully.
Dec 06 08:30:31 np0005548788.localdomain podman[68647]: 2025-12-06 08:30:31.019833105 +0000 UTC m=+0.192058791 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.12, container_name=iscsid, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T23:44:13Z)
Dec 06 08:30:31 np0005548788.localdomain podman[68647]: 2025-12-06 08:30:31.026261623 +0000 UTC m=+0.198487399 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:30:31 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:30:31 np0005548788.localdomain podman[68832]: 2025-12-06 08:30:31.312256848 +0000 UTC m=+0.069747241 container create e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libpod-conmon-e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.scope.
Dec 06 08:30:31 np0005548788.localdomain podman[68860]: 2025-12-06 08:30:31.342622634 +0000 UTC m=+0.076762827 container create 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, io.openshift.expose-services=, container_name=logrotate_crond, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c957b2bd9594b3144e0926de62e77156dbabeefc8e2acf756f315a98b85b5f52/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548788.localdomain podman[68882]: 2025-12-06 08:30:31.361783134 +0000 UTC m=+0.072075812 container create 0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libpod-conmon-0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.scope.
Dec 06 08:30:31 np0005548788.localdomain podman[68833]: 2025-12-06 08:30:31.370806102 +0000 UTC m=+0.122970611 container create 0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, container_name=configure_cms_options, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libpod-conmon-0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab.scope.
Dec 06 08:30:31 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/836e61d982b1d47c704516d9b7a84248dd1953565408f432976e6b13308663a5/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548788.localdomain podman[68832]: 2025-12-06 08:30:31.276818726 +0000 UTC m=+0.034309129 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37cb8325f9fcc3916540f70c0b29aa44083977f48bf4e198eec6f67fb80ec7c4/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37cb8325f9fcc3916540f70c0b29aa44083977f48bf4e198eec6f67fb80ec7c4/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37cb8325f9fcc3916540f70c0b29aa44083977f48bf4e198eec6f67fb80ec7c4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548788.localdomain podman[68872]: 2025-12-06 08:30:31.394085801 +0000 UTC m=+0.114782339 container create 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libpod-conmon-0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b.scope.
Dec 06 08:30:31 np0005548788.localdomain podman[68882]: 2025-12-06 08:30:31.400266351 +0000 UTC m=+0.110559069 container init 0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_libvirt_init_secret, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:30:31 np0005548788.localdomain podman[68833]: 2025-12-06 08:30:31.305619023 +0000 UTC m=+0.057783572 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548788.localdomain podman[68860]: 2025-12-06 08:30:31.306022326 +0000 UTC m=+0.040162509 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:30:31 np0005548788.localdomain podman[68882]: 2025-12-06 08:30:31.40901265 +0000 UTC m=+0.119305318 container start 0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:30:31 np0005548788.localdomain podman[68882]: 2025-12-06 08:30:31.409857976 +0000 UTC m=+0.120150684 container attach 0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, vcs-type=git, architecture=x86_64, version=17.1.12, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=nova_libvirt_init_secret, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:30:31 np0005548788.localdomain podman[68872]: 2025-12-06 08:30:31.322901276 +0000 UTC m=+0.043597824 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 06 08:30:31 np0005548788.localdomain podman[68882]: 2025-12-06 08:30:31.323213445 +0000 UTC m=+0.033506143 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:30:31 np0005548788.localdomain podman[68832]: 2025-12-06 08:30:31.425138798 +0000 UTC m=+0.182629191 container init e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:30:31 np0005548788.localdomain sudo[68934]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:31 np0005548788.localdomain sudo[68934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:30:31 np0005548788.localdomain podman[68833]: 2025-12-06 08:30:31.465368737 +0000 UTC m=+0.217533236 container init 0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=configure_cms_options, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:30:31 np0005548788.localdomain podman[68860]: 2025-12-06 08:30:31.468636248 +0000 UTC m=+0.202776441 container init 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, 
name=rhosp17/openstack-cron, distribution-scope=public, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Dec 06 08:30:31 np0005548788.localdomain podman[68833]: 2025-12-06 08:30:31.478230074 +0000 UTC m=+0.230394593 container start 0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-18T23:34:05Z, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=configure_cms_options, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64)
Dec 06 08:30:31 np0005548788.localdomain podman[68833]: 2025-12-06 08:30:31.479487822 +0000 UTC m=+0.231652321 container attach 0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=configure_cms_options, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team)
Dec 06 08:30:31 np0005548788.localdomain sudo[68956]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:31 np0005548788.localdomain sudo[68956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:30:31 np0005548788.localdomain sudo[68934]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:31 np0005548788.localdomain podman[68860]: 2025-12-06 08:30:31.501748678 +0000 UTC m=+0.235888871 container start 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container)
Dec 06 08:30:31 np0005548788.localdomain podman[68832]: 2025-12-06 08:30:31.508112894 +0000 UTC m=+0.265603297 container start e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team)
Dec 06 08:30:31 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:30:31 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1d18e9db1b81af61c21222485fd9085f --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libpod-conmon-57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.scope.
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78c1ff78886d1d879971aeaf25162bcf2766722ebe427ed02a36d32d0aa52834/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548788.localdomain sudo[68956]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:31 np0005548788.localdomain crond[68953]: (CRON) STARTUP (1.5.7)
Dec 06 08:30:31 np0005548788.localdomain crond[68953]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 96% if used.)
Dec 06 08:30:31 np0005548788.localdomain crond[68953]: (CRON) INFO (running with inotify support)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: libpod-0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab.scope: Deactivated successfully.
Dec 06 08:30:31 np0005548788.localdomain podman[68882]: 2025-12-06 08:30:31.561486429 +0000 UTC m=+0.271779117 container died 0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, managed_by=tripleo_ansible, container_name=nova_libvirt_init_secret, io.buildah.version=1.41.4)
Dec 06 08:30:31 np0005548788.localdomain podman[68937]: 2025-12-06 08:30:31.572887841 +0000 UTC m=+0.119005709 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:30:31 np0005548788.localdomain ovs-vsctl[69019]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: libpod-0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b.scope: Deactivated successfully.
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:30:31 np0005548788.localdomain podman[68872]: 2025-12-06 08:30:31.612272264 +0000 UTC m=+0.332968802 container init 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:30:31 np0005548788.localdomain sudo[69036]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:31 np0005548788.localdomain podman[68872]: 2025-12-06 08:30:31.632269872 +0000 UTC m=+0.352966400 container start 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Dec 06 08:30:31 np0005548788.localdomain sudo[69036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:30:31 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1d18e9db1b81af61c21222485fd9085f --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 06 08:30:31 np0005548788.localdomain podman[68833]: 2025-12-06 08:30:31.651037849 +0000 UTC m=+0.403202378 container died 0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, config_id=tripleo_step4, container_name=configure_cms_options, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']})
Dec 06 08:30:31 np0005548788.localdomain podman[68937]: 2025-12-06 08:30:31.662519673 +0000 UTC m=+0.208637541 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 06 08:30:31 np0005548788.localdomain podman[68937]: unhealthy
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Failed with result 'exit-code'.
Dec 06 08:30:31 np0005548788.localdomain sudo[69036]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:31 np0005548788.localdomain podman[69004]: 2025-12-06 08:30:31.695995875 +0000 UTC m=+0.124521978 container cleanup 0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_libvirt_init_secret, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: libpod-conmon-0775bb318cb3441ae1bc413c399f63aa235dcda71324b49162de4457a1f50cab.scope: Deactivated successfully.
Dec 06 08:30:31 np0005548788.localdomain podman[69020]: 2025-12-06 08:30:31.732395867 +0000 UTC m=+0.128708628 container cleanup 0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=configure_cms_options, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: libpod-conmon-0e426c3e9cbc45d5d2bae46f3a736779929dfc481981e2b149070788340a487b.scope: Deactivated successfully.
Dec 06 08:30:31 np0005548788.localdomain podman[68962]: 2025-12-06 08:30:31.735829463 +0000 UTC m=+0.235596663 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:30:31 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Dec 06 08:30:31 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Dec 06 08:30:31 np0005548788.localdomain podman[68962]: 2025-12-06 08:30:31.81747218 +0000 UTC m=+0.317239350 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, container_name=logrotate_crond, name=rhosp17/openstack-cron, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:30:31 np0005548788.localdomain podman[69037]: 2025-12-06 08:30:31.855650956 +0000 UTC m=+0.222476458 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, release=1761123044, container_name=ceilometer_agent_compute)
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:30:31 np0005548788.localdomain podman[69037]: 2025-12-06 08:30:31.942540424 +0000 UTC m=+0.309365926 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:30:31 np0005548788.localdomain podman[69037]: unhealthy
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:30:31 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Failed with result 'exit-code'.
Dec 06 08:30:32 np0005548788.localdomain podman[69203]: 2025-12-06 08:30:32.089389791 +0000 UTC m=+0.121114625 container create 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:30:32 np0005548788.localdomain podman[69229]: 2025-12-06 08:30:32.123900164 +0000 UTC m=+0.095913457 container create 76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Dec 06 08:30:32 np0005548788.localdomain systemd[1]: Started libpod-conmon-48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.scope.
Dec 06 08:30:32 np0005548788.localdomain systemd[1]: Started libpod-conmon-76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7.scope.
Dec 06 08:30:32 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:32 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:32 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20be973d9ceea79ef95f10e6d248e592805035801284dea3de186096ff60ff28/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:32 np0005548788.localdomain podman[69229]: 2025-12-06 08:30:32.153521867 +0000 UTC m=+0.125535170 container init 76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=setup_ovs_manager, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:30:32 np0005548788.localdomain podman[69229]: 2025-12-06 08:30:32.160130121 +0000 UTC m=+0.132143414 container start 76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, container_name=setup_ovs_manager, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4)
Dec 06 08:30:32 np0005548788.localdomain podman[69203]: 2025-12-06 08:30:32.059772397 +0000 UTC m=+0.091497221 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:30:32 np0005548788.localdomain podman[69229]: 2025-12-06 08:30:32.160453641 +0000 UTC m=+0.132466934 container attach 76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=setup_ovs_manager, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible)
Dec 06 08:30:32 np0005548788.localdomain podman[69229]: 2025-12-06 08:30:32.07542762 +0000 UTC m=+0.047440933 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:30:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:30:32 np0005548788.localdomain podman[69203]: 2025-12-06 08:30:32.220968456 +0000 UTC m=+0.252693310 container init 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, vcs-type=git)
Dec 06 08:30:32 np0005548788.localdomain sudo[69259]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:30:32 np0005548788.localdomain podman[69203]: 2025-12-06 08:30:32.244574103 +0000 UTC m=+0.276298927 container start 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, vcs-type=git)
Dec 06 08:30:32 np0005548788.localdomain sudo[69259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:30:32 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro 
--volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:30:32 np0005548788.localdomain sudo[69259]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:32 np0005548788.localdomain sshd[69293]: Server listening on 0.0.0.0 port 2022.
Dec 06 08:30:32 np0005548788.localdomain sshd[69293]: Server listening on :: port 2022.
Dec 06 08:30:32 np0005548788.localdomain podman[69260]: 2025-12-06 08:30:32.392329838 +0000 UTC m=+0.128712069 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, vcs-type=git, container_name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true)
Dec 06 08:30:32 np0005548788.localdomain sudo[69305]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmplbe8bian/privsep.sock
Dec 06 08:30:32 np0005548788.localdomain sudo[69305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 08:30:32 np0005548788.localdomain podman[69260]: 2025-12-06 08:30:32.735243166 +0000 UTC m=+0.471625447 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, container_name=nova_migration_target, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:30:32 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:30:32 np0005548788.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 06 08:30:32 np0005548788.localdomain sudo[69305]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:34 np0005548788.localdomain ovs-vsctl[69435]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: libpod-76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7.scope: Deactivated successfully.
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: libpod-76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7.scope: Consumed 2.871s CPU time.
Dec 06 08:30:35 np0005548788.localdomain podman[69436]: 2025-12-06 08:30:35.132854944 +0000 UTC m=+0.053784418 container died 76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=setup_ovs_manager, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7-userdata-shm.mount: Deactivated successfully.
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-99147ebdc96193466a769a6087d1e6a493b0ef22c4a7c12a11b9ab309a2616d8-merged.mount: Deactivated successfully.
Dec 06 08:30:35 np0005548788.localdomain podman[69436]: 2025-12-06 08:30:35.177299234 +0000 UTC m=+0.098228638 container cleanup 76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, container_name=setup_ovs_manager, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:14:25Z, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: libpod-conmon-76b9ee0a308d74127ad1ff47acdcaf068ed373799fd7a870b5f3d41de87cc2e7.scope: Deactivated successfully.
Dec 06 08:30:35 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Dec 06 08:30:35 np0005548788.localdomain podman[69545]: 2025-12-06 08:30:35.662272102 +0000 UTC m=+0.076816559 container create 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, io.openshift.expose-services=, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public)
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Started libpod-conmon-6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.scope.
Dec 06 08:30:35 np0005548788.localdomain podman[69545]: 2025-12-06 08:30:35.624834538 +0000 UTC m=+0.039378995 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:35 np0005548788.localdomain podman[69562]: 2025-12-06 08:30:35.729443802 +0000 UTC m=+0.100197119 container create 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Dec 06 08:30:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d144d97f229d6ea4f1176db9ee5f38246c70badf684dca92813d3fa706be8063/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d144d97f229d6ea4f1176db9ee5f38246c70badf684dca92813d3fa706be8063/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d144d97f229d6ea4f1176db9ee5f38246c70badf684dca92813d3fa706be8063/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Started libpod-conmon-215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.scope.
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:30:35 np0005548788.localdomain podman[69545]: 2025-12-06 08:30:35.785350285 +0000 UTC m=+0.199894732 container init 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:35 np0005548788.localdomain podman[69562]: 2025-12-06 08:30:35.694178345 +0000 UTC m=+0.064931672 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:30:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aeebb59c2ae55208b2c5ff933ebd16346303bee7a4c212ebe435c707b85240b/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aeebb59c2ae55208b2c5ff933ebd16346303bee7a4c212ebe435c707b85240b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aeebb59c2ae55208b2c5ff933ebd16346303bee7a4c212ebe435c707b85240b/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:30:35 np0005548788.localdomain podman[69545]: 2025-12-06 08:30:35.822267173 +0000 UTC m=+0.236811620 container start 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, release=1761123044, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-ovn-controller)
Dec 06 08:30:35 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:30:35 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:30:35 np0005548788.localdomain podman[69562]: 2025-12-06 08:30:35.864100782 +0000 UTC m=+0.234854119 container init 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible)
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 08:30:35 np0005548788.localdomain sudo[69611]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:35 np0005548788.localdomain sudo[69611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:30:35 np0005548788.localdomain systemd[69612]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:30:35 np0005548788.localdomain podman[69590]: 2025-12-06 08:30:35.932542472 +0000 UTC m=+0.098096894 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 06 08:30:35 np0005548788.localdomain podman[69590]: 2025-12-06 08:30:35.949456074 +0000 UTC m=+0.115010516 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:30:35 np0005548788.localdomain podman[69590]: unhealthy
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:30:35 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 08:30:35 np0005548788.localdomain sudo[69611]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:36 np0005548788.localdomain podman[69562]: 2025-12-06 08:30:36.004450538 +0000 UTC m=+0.375203845 container start 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:30:36 np0005548788.localdomain python3[68645]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=dc659970751309b021f4b1201ffad0ee --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True 
--volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Queued start job for default target Main User Target.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Created slice User Application Slice.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Reached target Paths.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Reached target Timers.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Starting D-Bus User Message Bus Socket...
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Starting Create User's Volatile Files and Directories...
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Finished Create User's Volatile Files and Directories.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Reached target Sockets.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Reached target Basic System.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Reached target Main User Target.
Dec 06 08:30:36 np0005548788.localdomain systemd[69612]: Startup finished in 144ms.
Dec 06 08:30:36 np0005548788.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 08:30:36 np0005548788.localdomain systemd[1]: Started Session c9 of User root.
Dec 06 08:30:36 np0005548788.localdomain podman[69617]: 2025-12-06 08:30:36.089632174 +0000 UTC m=+0.180709321 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:30:36 np0005548788.localdomain podman[69617]: 2025-12-06 08:30:36.095382431 +0000 UTC m=+0.186459528 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, url=https://www.redhat.com)
Dec 06 08:30:36 np0005548788.localdomain podman[69617]: unhealthy
Dec 06 08:30:36 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:30:36 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 08:30:36 np0005548788.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Dec 06 08:30:36 np0005548788.localdomain kernel: device br-int entered promiscuous mode
Dec 06 08:30:36 np0005548788.localdomain NetworkManager[5968]: <info>  [1765009836.2258] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Dec 06 08:30:36 np0005548788.localdomain systemd-udevd[69704]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 08:30:36 np0005548788.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Dec 06 08:30:36 np0005548788.localdomain NetworkManager[5968]: <info>  [1765009836.2566] device (genev_sys_6081): carrier: link connected
Dec 06 08:30:36 np0005548788.localdomain NetworkManager[5968]: <info>  [1765009836.2569] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Dec 06 08:30:36 np0005548788.localdomain sudo[68643]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:36 np0005548788.localdomain sudo[69724]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyjdddompcdjexfughaoaucdnusufbjs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:36 np0005548788.localdomain sudo[69724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:36 np0005548788.localdomain python3[69726]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:36 np0005548788.localdomain sudo[69724]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:36 np0005548788.localdomain sudo[69740]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxrwqdmwhapjdkaoksusnxlujambtkbc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:36 np0005548788.localdomain sudo[69740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:36 np0005548788.localdomain python3[69742]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:36 np0005548788.localdomain sudo[69740]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:37 np0005548788.localdomain sudo[69756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfzxdonyzljjpisenugvgrpekwtabflg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:37 np0005548788.localdomain sudo[69756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:37 np0005548788.localdomain python3[69758]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:37 np0005548788.localdomain sudo[69756]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:37 np0005548788.localdomain sudo[69772]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvkxemzcfmlvprhhdntuilrieqlhndgw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:37 np0005548788.localdomain sudo[69772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:37 np0005548788.localdomain python3[69774]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:37 np0005548788.localdomain sudo[69772]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:37 np0005548788.localdomain sudo[69788]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okhdpvmnbuoaeyzgykglswsugakrcinx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:30:37 np0005548788.localdomain sudo[69788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:37 np0005548788.localdomain sudo[69791]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmpnyiyo09o/privsep.sock
Dec 06 08:30:37 np0005548788.localdomain sudo[69791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Dec 06 08:30:37 np0005548788.localdomain podman[69792]: 2025-12-06 08:30:37.76986812 +0000 UTC m=+0.122447884 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, config_id=tripleo_step1, 
container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Dec 06 08:30:37 np0005548788.localdomain python3[69793]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:37 np0005548788.localdomain sudo[69788]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:37 np0005548788.localdomain sudo[69837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywgzmzfqfwimzkvyeelzierlwojowjgz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:37 np0005548788.localdomain sudo[69837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:37 np0005548788.localdomain podman[69792]: 2025-12-06 08:30:37.982572516 +0000 UTC m=+0.335152320 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, architecture=x86_64)
Dec 06 08:30:37 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:30:38 np0005548788.localdomain python3[69839]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:38 np0005548788.localdomain sudo[69837]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548788.localdomain sudo[69853]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvaptteuqbrbufvulrqvomabpsyxfmpj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:38 np0005548788.localdomain sudo[69853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:38 np0005548788.localdomain sudo[69791]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548788.localdomain python3[69855]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:38 np0005548788.localdomain sudo[69853]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548788.localdomain sudo[69871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imlhmtjvltteqdrdgljcjryoagpsokcf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:38 np0005548788.localdomain sudo[69871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:38 np0005548788.localdomain python3[69873]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:38 np0005548788.localdomain sudo[69871]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548788.localdomain sudo[69889]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wckzidwwbhepactvdpbqwlxaooehzkkp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:38 np0005548788.localdomain sudo[69889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:38 np0005548788.localdomain python3[69891]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:38 np0005548788.localdomain sudo[69889]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548788.localdomain sudo[69905]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyhlhguqpkmuvcpbcrrgvqcmeuacikpg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:38 np0005548788.localdomain sudo[69905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:39 np0005548788.localdomain python3[69907]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:39 np0005548788.localdomain sudo[69905]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:39 np0005548788.localdomain sudo[69921]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shzhjtctgmbmheghbsokobobcoekzvdu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:39 np0005548788.localdomain sudo[69921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:39 np0005548788.localdomain python3[69923]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:39 np0005548788.localdomain sudo[69921]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:39 np0005548788.localdomain sudo[69937]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqglnwbwdgzoeucuyfprmeaaafnhntnd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:39 np0005548788.localdomain sudo[69937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:39 np0005548788.localdomain python3[69939]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:39 np0005548788.localdomain sudo[69937]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:40 np0005548788.localdomain sudo[69998]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhpphrzdsmpjnqktlbxslvyjxgetxbvw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:40 np0005548788.localdomain sudo[69998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:40 np0005548788.localdomain python3[70000]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.724362-109033-123828371572035/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:40 np0005548788.localdomain sudo[69998]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:40 np0005548788.localdomain sudo[70027]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iffavqqdefggrnlbhbpcbvktfvzztfcg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:40 np0005548788.localdomain sudo[70027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:40 np0005548788.localdomain python3[70029]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.724362-109033-123828371572035/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:40 np0005548788.localdomain sudo[70027]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:41 np0005548788.localdomain sudo[70056]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsaaselymafcwoogovydlnfowlmlhard ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:41 np0005548788.localdomain sudo[70056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:41 np0005548788.localdomain python3[70058]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.724362-109033-123828371572035/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:41 np0005548788.localdomain sudo[70056]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:41 np0005548788.localdomain sudo[70085]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmvsmzbtnsylxkemhigubtrupcwejxeu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:41 np0005548788.localdomain sudo[70085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:41 np0005548788.localdomain python3[70087]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.724362-109033-123828371572035/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:41 np0005548788.localdomain sudo[70085]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:42 np0005548788.localdomain sudo[70114]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwpcetxbtelhipilhpdpclfcnhbymnwf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:42 np0005548788.localdomain sudo[70114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:42 np0005548788.localdomain python3[70116]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.724362-109033-123828371572035/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:42 np0005548788.localdomain sudo[70114]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:42 np0005548788.localdomain sudo[70143]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojdeivrjzmzmiebzoumffitotuvgqocl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:42 np0005548788.localdomain sudo[70143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:42 np0005548788.localdomain python3[70145]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.724362-109033-123828371572035/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:42 np0005548788.localdomain sudo[70143]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:43 np0005548788.localdomain sudo[70159]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asmuuhxahepmfplheihrzuwibnxllgum ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:43 np0005548788.localdomain sudo[70159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:43 np0005548788.localdomain python3[70161]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 08:30:43 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:43 np0005548788.localdomain systemd-rc-local-generator[70182]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:43 np0005548788.localdomain systemd-sysv-generator[70187]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:43 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:43 np0005548788.localdomain sudo[70159]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:44 np0005548788.localdomain sudo[70211]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uodbhegohjxddgjjencqrpifzvozppjz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:44 np0005548788.localdomain sudo[70211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:44 np0005548788.localdomain python3[70213]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:45 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:45 np0005548788.localdomain systemd-rc-local-generator[70243]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:45 np0005548788.localdomain systemd-sysv-generator[70246]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:45 np0005548788.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 06 08:30:46 np0005548788.localdomain tripleo-start-podman-container[70253]: Creating additional drop-in dependency for "ceilometer_agent_compute" (57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669)
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:46 np0005548788.localdomain systemd-rc-local-generator[70307]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:46 np0005548788.localdomain systemd-sysv-generator[70311]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Activating special unit Exit the Session...
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Stopped target Main User Target.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Stopped target Basic System.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Stopped target Paths.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Stopped target Sockets.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Stopped target Timers.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Closed D-Bus User Message Bus Socket.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Stopped Create User's Volatile Files and Directories.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Removed slice User Application Slice.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Reached target Shutdown.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Finished Exit the Session.
Dec 06 08:30:46 np0005548788.localdomain systemd[69612]: Reached target Exit the Session.
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 08:30:46 np0005548788.localdomain sudo[70211]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 08:30:46 np0005548788.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 08:30:46 np0005548788.localdomain sudo[70336]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emmlpiwidmmdnooopgwwniftomshuoec ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:46 np0005548788.localdomain sudo[70336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:47 np0005548788.localdomain python3[70338]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:47 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:47 np0005548788.localdomain systemd-sysv-generator[70365]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:47 np0005548788.localdomain systemd-rc-local-generator[70361]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:47 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:47 np0005548788.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Dec 06 08:30:47 np0005548788.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Dec 06 08:30:47 np0005548788.localdomain sudo[70336]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:47 np0005548788.localdomain sudo[70404]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsgzgaqxpdcvsmyjlxzcbnwnvhrxvvgp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:47 np0005548788.localdomain sudo[70404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:48 np0005548788.localdomain python3[70406]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:48 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:48 np0005548788.localdomain systemd-sysv-generator[70436]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:48 np0005548788.localdomain systemd-rc-local-generator[70432]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:48 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:48 np0005548788.localdomain systemd[1]: Starting logrotate_crond container...
Dec 06 08:30:48 np0005548788.localdomain systemd[1]: Started logrotate_crond container.
Dec 06 08:30:48 np0005548788.localdomain sudo[70404]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:48 np0005548788.localdomain sudo[70472]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aafzfzhyafscrqtmzylbonvujoqfyiyo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:48 np0005548788.localdomain sudo[70472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:49 np0005548788.localdomain python3[70474]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:49 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:49 np0005548788.localdomain systemd-rc-local-generator[70500]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:49 np0005548788.localdomain systemd-sysv-generator[70503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:49 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:49 np0005548788.localdomain systemd[1]: Starting nova_migration_target container...
Dec 06 08:30:49 np0005548788.localdomain systemd[1]: Started nova_migration_target container.
Dec 06 08:30:49 np0005548788.localdomain sudo[70472]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:50 np0005548788.localdomain sudo[70540]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blxvnuqbteuieummexcxbthavcqfdvgq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:50 np0005548788.localdomain sudo[70540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:50 np0005548788.localdomain python3[70542]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:50 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:50 np0005548788.localdomain systemd-rc-local-generator[70566]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:50 np0005548788.localdomain systemd-sysv-generator[70572]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:50 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:50 np0005548788.localdomain systemd[1]: Starting ovn_controller container...
Dec 06 08:30:50 np0005548788.localdomain tripleo-start-podman-container[70582]: Creating additional drop-in dependency for "ovn_controller" (6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef)
Dec 06 08:30:50 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:50 np0005548788.localdomain systemd-sysv-generator[70646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:50 np0005548788.localdomain systemd-rc-local-generator[70642]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:51 np0005548788.localdomain systemd[1]: Started ovn_controller container.
Dec 06 08:30:51 np0005548788.localdomain sudo[70540]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:51 np0005548788.localdomain sudo[70665]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbghqofjkmfopqglvlartfyfyyxgxcxu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:51 np0005548788.localdomain sudo[70665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:51 np0005548788.localdomain python3[70667]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:51 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:30:51 np0005548788.localdomain systemd-rc-local-generator[70692]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:51 np0005548788.localdomain systemd-sysv-generator[70698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:52 np0005548788.localdomain systemd[1]: Starting ovn_metadata_agent container...
Dec 06 08:30:52 np0005548788.localdomain systemd[1]: Started ovn_metadata_agent container.
Dec 06 08:30:52 np0005548788.localdomain sudo[70665]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:52 np0005548788.localdomain sudo[70746]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evvuimfcoeorscgzbagtqvprxsssrhxg ; /usr/bin/python3
Dec 06 08:30:52 np0005548788.localdomain sudo[70746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:52 np0005548788.localdomain python3[70748]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:52 np0005548788.localdomain sudo[70746]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:53 np0005548788.localdomain sudo[70794]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhaquvabmiklkdjzigrglkurlbblldbm ; /usr/bin/python3
Dec 06 08:30:53 np0005548788.localdomain sudo[70794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:53 np0005548788.localdomain sudo[70794]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:53 np0005548788.localdomain sudo[70837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxbruyfqbybcfetobbiuxbypnmqehekq ; /usr/bin/python3
Dec 06 08:30:53 np0005548788.localdomain sudo[70837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:53 np0005548788.localdomain sudo[70837]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:53 np0005548788.localdomain sudo[70867]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eppgzpdbkpdqextmgrczcjwnpiwmuoge ; /usr/bin/python3
Dec 06 08:30:53 np0005548788.localdomain sudo[70867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:54 np0005548788.localdomain python3[70869]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005548788 step=4 update_config_hash_only=False
Dec 06 08:30:54 np0005548788.localdomain sudo[70867]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:54 np0005548788.localdomain sudo[70884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adwtifxuusqhvwxkzpcmebryymxwgxac ; /usr/bin/python3
Dec 06 08:30:54 np0005548788.localdomain sudo[70884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:54 np0005548788.localdomain python3[70886]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:54 np0005548788.localdomain sudo[70884]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:54 np0005548788.localdomain sudo[70900]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-poiryxshlkpihcmpexlbnbkveppdowxh ; /usr/bin/python3
Dec 06 08:30:54 np0005548788.localdomain sudo[70900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:54 np0005548788.localdomain python3[70902]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:30:54 np0005548788.localdomain sudo[70900]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:31:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:31:01 np0005548788.localdomain podman[70904]: 2025-12-06 08:31:01.240741107 +0000 UTC m=+0.073160786 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-collectd, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3)
Dec 06 08:31:01 np0005548788.localdomain podman[70904]: 2025-12-06 08:31:01.248427983 +0000 UTC m=+0.080847762 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, container_name=collectd, io.openshift.expose-services=)
Dec 06 08:31:01 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:31:01 np0005548788.localdomain podman[70905]: 2025-12-06 08:31:01.309347501 +0000 UTC m=+0.137074166 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible)
Dec 06 08:31:01 np0005548788.localdomain podman[70905]: 2025-12-06 08:31:01.344699621 +0000 UTC m=+0.172426306 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:31:01 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:31:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:31:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:31:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:31:02 np0005548788.localdomain podman[70944]: 2025-12-06 08:31:02.265418568 +0000 UTC m=+0.088843239 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T22:49:32Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 08:31:02 np0005548788.localdomain podman[70944]: 2025-12-06 08:31:02.274650663 +0000 UTC m=+0.098075394 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:31:02 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:31:02 np0005548788.localdomain systemd[1]: tmp-crun.t914Fm.mount: Deactivated successfully.
Dec 06 08:31:02 np0005548788.localdomain podman[70946]: 2025-12-06 08:31:02.318156343 +0000 UTC m=+0.137601182 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4)
Dec 06 08:31:02 np0005548788.localdomain podman[70945]: 2025-12-06 08:31:02.384482978 +0000 UTC m=+0.206522277 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible)
Dec 06 08:31:02 np0005548788.localdomain podman[70946]: 2025-12-06 08:31:02.400221073 +0000 UTC m=+0.219665922 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:31:02 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:31:02 np0005548788.localdomain podman[70945]: 2025-12-06 08:31:02.417605309 +0000 UTC m=+0.239644588 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Dec 06 08:31:02 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:31:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:31:02 np0005548788.localdomain podman[71015]: 2025-12-06 08:31:02.980527619 +0000 UTC m=+0.063638003 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 
17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Dec 06 08:31:03 np0005548788.localdomain systemd[1]: tmp-crun.XmLavw.mount: Deactivated successfully.
Dec 06 08:31:03 np0005548788.localdomain podman[71015]: 2025-12-06 08:31:03.416985062 +0000 UTC m=+0.500095476 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:31:03 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:31:04 np0005548788.localdomain sshd[71040]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:31:05 np0005548788.localdomain sshd[71040]: Received disconnect from 152.32.172.117 port 35302:11: Bye Bye [preauth]
Dec 06 08:31:05 np0005548788.localdomain sshd[71040]: Disconnected from authenticating user root 152.32.172.117 port 35302 [preauth]
Dec 06 08:31:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:31:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:31:06 np0005548788.localdomain systemd[1]: tmp-crun.uLpe4E.mount: Deactivated successfully.
Dec 06 08:31:06 np0005548788.localdomain podman[71043]: 2025-12-06 08:31:06.240013011 +0000 UTC m=+0.066031886 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_controller, version=17.1.12, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z)
Dec 06 08:31:06 np0005548788.localdomain podman[71043]: 2025-12-06 08:31:06.266676533 +0000 UTC m=+0.092695408 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, vcs-type=git, container_name=ovn_controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, 
maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:31:06 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:31:06 np0005548788.localdomain podman[71042]: 2025-12-06 08:31:06.35486191 +0000 UTC m=+0.180538795 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public)
Dec 06 08:31:06 np0005548788.localdomain podman[71042]: 2025-12-06 08:31:06.404778069 +0000 UTC m=+0.230454994 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=)
Dec 06 08:31:06 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:31:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:31:08 np0005548788.localdomain systemd[1]: tmp-crun.vMtXAA.mount: Deactivated successfully.
Dec 06 08:31:08 np0005548788.localdomain podman[71089]: 2025-12-06 08:31:08.26553281 +0000 UTC m=+0.098002852 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Dec 06 08:31:08 np0005548788.localdomain podman[71089]: 2025-12-06 08:31:08.481113594 +0000 UTC m=+0.313583596 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:31:08 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:31:14 np0005548788.localdomain snmpd[67478]: empty variable list in _query
Dec 06 08:31:14 np0005548788.localdomain snmpd[67478]: empty variable list in _query
Dec 06 08:31:20 np0005548788.localdomain sshd[71118]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:31:21 np0005548788.localdomain sshd[71118]: Invalid user xiqiao from 150.95.85.24 port 36704
Dec 06 08:31:21 np0005548788.localdomain sudo[71120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:31:21 np0005548788.localdomain sudo[71120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:21 np0005548788.localdomain sudo[71120]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:22 np0005548788.localdomain sshd[71118]: Received disconnect from 150.95.85.24 port 36704:11:  [preauth]
Dec 06 08:31:22 np0005548788.localdomain sshd[71118]: Disconnected from invalid user xiqiao 150.95.85.24 port 36704 [preauth]
Dec 06 08:31:22 np0005548788.localdomain sudo[71135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:31:22 np0005548788.localdomain sudo[71135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:22 np0005548788.localdomain sudo[71135]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:22 np0005548788.localdomain sudo[71182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:31:22 np0005548788.localdomain sudo[71182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:22 np0005548788.localdomain sudo[71182]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:23 np0005548788.localdomain sudo[71197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 08:31:23 np0005548788.localdomain sudo[71197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:23 np0005548788.localdomain podman[71254]: 
Dec 06 08:31:23 np0005548788.localdomain podman[71254]: 2025-12-06 08:31:23.59228052 +0000 UTC m=+0.075525758 container create d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_wescoff, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7)
Dec 06 08:31:23 np0005548788.localdomain systemd[1]: Started libpod-conmon-d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40.scope.
Dec 06 08:31:23 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:31:23 np0005548788.localdomain podman[71254]: 2025-12-06 08:31:23.561497212 +0000 UTC m=+0.044742490 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:31:23 np0005548788.localdomain podman[71254]: 2025-12-06 08:31:23.663575938 +0000 UTC m=+0.146821186 container init d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_wescoff, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:31:23 np0005548788.localdomain podman[71254]: 2025-12-06 08:31:23.673494634 +0000 UTC m=+0.156739872 container start d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_wescoff, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Dec 06 08:31:23 np0005548788.localdomain podman[71254]: 2025-12-06 08:31:23.673741772 +0000 UTC m=+0.156987020 container attach d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_wescoff, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main, ceph=True, name=rhceph, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, release=1763362218, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7)
Dec 06 08:31:23 np0005548788.localdomain friendly_wescoff[71269]: 167 167
Dec 06 08:31:23 np0005548788.localdomain systemd[1]: libpod-d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40.scope: Deactivated successfully.
Dec 06 08:31:23 np0005548788.localdomain podman[71254]: 2025-12-06 08:31:23.677999973 +0000 UTC m=+0.161245281 container died d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_wescoff, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, ceph=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, distribution-scope=public, name=rhceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:31:23 np0005548788.localdomain podman[71274]: 2025-12-06 08:31:23.767593294 +0000 UTC m=+0.079023426 container remove d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_wescoff, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, RELEASE=main)
Dec 06 08:31:23 np0005548788.localdomain systemd[1]: libpod-conmon-d7269231535cb90407990d5e6211156349b9a7fd204c8b320664f42e18207b40.scope: Deactivated successfully.
Dec 06 08:31:23 np0005548788.localdomain podman[71296]: 
Dec 06 08:31:23 np0005548788.localdomain podman[71296]: 2025-12-06 08:31:23.953309138 +0000 UTC m=+0.058020909 container create 6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_kapitsa, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, release=1763362218)
Dec 06 08:31:23 np0005548788.localdomain systemd[1]: Started libpod-conmon-6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388.scope.
Dec 06 08:31:24 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:31:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9e04f074d3a714c0a74cd7ce686838e95fcb00ba03091cfe62dda8030432539/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:31:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9e04f074d3a714c0a74cd7ce686838e95fcb00ba03091cfe62dda8030432539/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:31:24 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9e04f074d3a714c0a74cd7ce686838e95fcb00ba03091cfe62dda8030432539/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:31:24 np0005548788.localdomain podman[71296]: 2025-12-06 08:31:24.01370261 +0000 UTC m=+0.118414381 container init 6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_kapitsa, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, release=1763362218, build-date=2025-11-26T19:44:28Z, version=7, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Dec 06 08:31:24 np0005548788.localdomain podman[71296]: 2025-12-06 08:31:24.023555523 +0000 UTC m=+0.128267294 container start 6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_kapitsa, io.openshift.expose-services=, version=7, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:31:24 np0005548788.localdomain podman[71296]: 2025-12-06 08:31:24.02378094 +0000 UTC m=+0.128492711 container attach 6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_kapitsa, version=7, io.openshift.tags=rhceph ceph, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Dec 06 08:31:24 np0005548788.localdomain podman[71296]: 2025-12-06 08:31:23.924584113 +0000 UTC m=+0.029295954 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:31:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ebd2a03dfa543739cdd9c455f6b769bc68531561f22686a357b1b4b000bc3f05-merged.mount: Deactivated successfully.
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]: [
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:     {
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         "available": false,
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         "ceph_device": false,
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         "lsm_data": {},
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         "lvs": [],
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         "path": "/dev/sr0",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         "rejected_reasons": [
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "Insufficient space (<5GB)",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "Has a FileSystem"
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         ],
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         "sys_api": {
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "actuators": null,
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "device_nodes": "sr0",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "human_readable_size": "482.00 KB",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "id_bus": "ata",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "model": "QEMU DVD-ROM",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "nr_requests": "2",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "partitions": {},
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "path": "/dev/sr0",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "removable": "1",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "rev": "2.5+",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "ro": "0",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "rotational": "1",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "sas_address": "",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "sas_device_handle": "",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "scheduler_mode": "mq-deadline",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "sectors": 0,
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "sectorsize": "2048",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "size": 493568.0,
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "support_discard": "0",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "type": "disk",
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:             "vendor": "QEMU"
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:         }
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]:     }
Dec 06 08:31:24 np0005548788.localdomain angry_kapitsa[71312]: ]
Dec 06 08:31:25 np0005548788.localdomain systemd[1]: libpod-6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388.scope: Deactivated successfully.
Dec 06 08:31:25 np0005548788.localdomain systemd[1]: libpod-6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388.scope: Consumed 1.037s CPU time.
Dec 06 08:31:25 np0005548788.localdomain podman[71296]: 2025-12-06 08:31:25.018256351 +0000 UTC m=+1.122968192 container died 6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_kapitsa, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, release=1763362218, version=7, architecture=x86_64, RELEASE=main, vcs-type=git, build-date=2025-11-26T19:44:28Z)
Dec 06 08:31:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a9e04f074d3a714c0a74cd7ce686838e95fcb00ba03091cfe62dda8030432539-merged.mount: Deactivated successfully.
Dec 06 08:31:25 np0005548788.localdomain podman[73249]: 2025-12-06 08:31:25.125834707 +0000 UTC m=+0.096478395 container remove 6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_kapitsa, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, distribution-scope=public, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, version=7, vcs-type=git, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=)
Dec 06 08:31:25 np0005548788.localdomain systemd[1]: libpod-conmon-6756816ec96b63f221a6d74cb5bcb920273791fd800afbc86e105a583af79388.scope: Deactivated successfully.
Dec 06 08:31:25 np0005548788.localdomain sudo[71197]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:25 np0005548788.localdomain sudo[73262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:31:25 np0005548788.localdomain sudo[73262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:25 np0005548788.localdomain sudo[73262]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:31:32 np0005548788.localdomain podman[73278]: 2025-12-06 08:31:32.25828833 +0000 UTC m=+0.078766338 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, version=17.1.12, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:31:32 np0005548788.localdomain podman[73278]: 2025-12-06 08:31:32.294129375 +0000 UTC m=+0.114607413 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid)
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: tmp-crun.oYBc3h.mount: Deactivated successfully.
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:31:32 np0005548788.localdomain podman[73277]: 2025-12-06 08:31:32.313564674 +0000 UTC m=+0.132905308 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, container_name=collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 06 08:31:32 np0005548788.localdomain podman[73277]: 2025-12-06 08:31:32.323983055 +0000 UTC m=+0.143323729 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, vcs-type=git)
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:31:32 np0005548788.localdomain podman[73316]: 2025-12-06 08:31:32.405567559 +0000 UTC m=+0.069005757 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, architecture=x86_64, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:31:32 np0005548788.localdomain podman[73316]: 2025-12-06 08:31:32.442745836 +0000 UTC m=+0.106184074 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:31:32 np0005548788.localdomain podman[73347]: 2025-12-06 08:31:32.555331415 +0000 UTC m=+0.076710495 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, version=17.1.12)
Dec 06 08:31:32 np0005548788.localdomain podman[73336]: 2025-12-06 08:31:32.52045892 +0000 UTC m=+0.078313374 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4)
Dec 06 08:31:32 np0005548788.localdomain podman[73336]: 2025-12-06 08:31:32.603566592 +0000 UTC m=+0.161420976 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com)
Dec 06 08:31:32 np0005548788.localdomain podman[73347]: 2025-12-06 08:31:32.610099693 +0000 UTC m=+0.131478803 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:31:32 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:31:33 np0005548788.localdomain systemd[1]: tmp-crun.spZqW4.mount: Deactivated successfully.
Dec 06 08:31:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:31:34 np0005548788.localdomain podman[73390]: 2025-12-06 08:31:34.264747852 +0000 UTC m=+0.085576248 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:31:34 np0005548788.localdomain podman[73390]: 2025-12-06 08:31:34.653566496 +0000 UTC m=+0.474394902 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4)
Dec 06 08:31:34 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:31:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:31:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:31:37 np0005548788.localdomain systemd[1]: tmp-crun.DftuyR.mount: Deactivated successfully.
Dec 06 08:31:37 np0005548788.localdomain podman[73413]: 2025-12-06 08:31:37.260190835 +0000 UTC m=+0.086935461 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:31:37 np0005548788.localdomain podman[73412]: 2025-12-06 08:31:37.288133646 +0000 UTC m=+0.119247946 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Dec 06 08:31:37 np0005548788.localdomain podman[73413]: 2025-12-06 08:31:37.309610588 +0000 UTC m=+0.136355174 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, container_name=ovn_controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Dec 06 08:31:37 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:31:37 np0005548788.localdomain podman[73412]: 2025-12-06 08:31:37.329873593 +0000 UTC m=+0.160987883 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 06 08:31:37 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:31:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:31:39 np0005548788.localdomain systemd[1]: tmp-crun.s2UyhZ.mount: Deactivated successfully.
Dec 06 08:31:39 np0005548788.localdomain podman[73460]: 2025-12-06 08:31:39.239719707 +0000 UTC m=+0.065270473 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:31:39 np0005548788.localdomain podman[73460]: 2025-12-06 08:31:39.440882147 +0000 UTC m=+0.266432923 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Dec 06 08:31:39 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: tmp-crun.823BBo.mount: Deactivated successfully.
Dec 06 08:32:03 np0005548788.localdomain podman[73493]: 2025-12-06 08:32:03.298283744 +0000 UTC m=+0.092695968 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: tmp-crun.dFI5p7.mount: Deactivated successfully.
Dec 06 08:32:03 np0005548788.localdomain podman[73491]: 2025-12-06 08:32:03.304622659 +0000 UTC m=+0.098572159 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.expose-services=)
Dec 06 08:32:03 np0005548788.localdomain podman[73493]: 2025-12-06 08:32:03.304155075 +0000 UTC m=+0.098567399 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:32:03 np0005548788.localdomain podman[73491]: 2025-12-06 08:32:03.312263835 +0000 UTC m=+0.106213355 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:32:03 np0005548788.localdomain podman[73492]: 2025-12-06 08:32:03.380901071 +0000 UTC m=+0.175459079 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z)
Dec 06 08:32:03 np0005548788.localdomain podman[73490]: 2025-12-06 08:32:03.447933247 +0000 UTC m=+0.244666352 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:32:03 np0005548788.localdomain podman[73494]: 2025-12-06 08:32:03.493332446 +0000 UTC m=+0.282210259 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true)
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:32:03 np0005548788.localdomain podman[73492]: 2025-12-06 08:32:03.525974022 +0000 UTC m=+0.320532110 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red 
Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:32:03 np0005548788.localdomain podman[73494]: 2025-12-06 08:32:03.550646042 +0000 UTC m=+0.339523855 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_ipmi, release=1761123044, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:32:03 np0005548788.localdomain podman[73490]: 2025-12-06 08:32:03.579865074 +0000 UTC m=+0.376598139 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, container_name=logrotate_crond, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, version=17.1.12)
Dec 06 08:32:03 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:32:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:32:05 np0005548788.localdomain systemd[1]: tmp-crun.tu9GhP.mount: Deactivated successfully.
Dec 06 08:32:05 np0005548788.localdomain podman[73598]: 2025-12-06 08:32:05.238101593 +0000 UTC m=+0.070579877 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:32:05 np0005548788.localdomain podman[73598]: 2025-12-06 08:32:05.573567822 +0000 UTC m=+0.406046106 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:32:05 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:32:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:32:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:32:08 np0005548788.localdomain systemd[1]: tmp-crun.EfZG4R.mount: Deactivated successfully.
Dec 06 08:32:08 np0005548788.localdomain podman[73621]: 2025-12-06 08:32:08.276515241 +0000 UTC m=+0.106667869 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 06 08:32:08 np0005548788.localdomain podman[73622]: 2025-12-06 08:32:08.331269939 +0000 UTC m=+0.154611808 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:32:08 np0005548788.localdomain podman[73621]: 2025-12-06 08:32:08.341297787 +0000 UTC m=+0.171450415 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 08:32:08 np0005548788.localdomain podman[73622]: 2025-12-06 08:32:08.35208943 +0000 UTC m=+0.175431289 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, architecture=x86_64, distribution-scope=public, container_name=ovn_controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible)
Dec 06 08:32:08 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:32:08 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:32:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:32:10 np0005548788.localdomain podman[73664]: 2025-12-06 08:32:10.258799377 +0000 UTC m=+0.089644774 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, version=17.1.12)
Dec 06 08:32:10 np0005548788.localdomain podman[73664]: 2025-12-06 08:32:10.484743861 +0000 UTC m=+0.315589248 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:32:10 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:32:19 np0005548788.localdomain sshd[73693]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:32:20 np0005548788.localdomain sshd[73693]: Received disconnect from 152.32.172.117 port 56526:11: Bye Bye [preauth]
Dec 06 08:32:20 np0005548788.localdomain sshd[73693]: Disconnected from authenticating user root 152.32.172.117 port 56526 [preauth]
Dec 06 08:32:25 np0005548788.localdomain sudo[73695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:32:25 np0005548788.localdomain sudo[73695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:25 np0005548788.localdomain sudo[73695]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:25 np0005548788.localdomain sudo[73710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:32:25 np0005548788.localdomain sudo[73710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:26 np0005548788.localdomain podman[73796]: 2025-12-06 08:32:26.726783014 +0000 UTC m=+0.093946726 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container)
Dec 06 08:32:26 np0005548788.localdomain podman[73796]: 2025-12-06 08:32:26.837614291 +0000 UTC m=+0.204777973 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, version=7, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z)
Dec 06 08:32:27 np0005548788.localdomain sudo[73710]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:27 np0005548788.localdomain sudo[73864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:32:27 np0005548788.localdomain sudo[73864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:27 np0005548788.localdomain sudo[73864]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:27 np0005548788.localdomain sudo[73879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:32:27 np0005548788.localdomain sudo[73879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:28 np0005548788.localdomain sudo[73879]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:28 np0005548788.localdomain sudo[73926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:32:28 np0005548788.localdomain sudo[73926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:28 np0005548788.localdomain sudo[73926]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:32:34 np0005548788.localdomain podman[73943]: 2025-12-06 08:32:34.283127722 +0000 UTC m=+0.102622704 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: tmp-crun.9lqPTw.mount: Deactivated successfully.
Dec 06 08:32:34 np0005548788.localdomain podman[73943]: 2025-12-06 08:32:34.344110031 +0000 UTC m=+0.163605063 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., 
build-date=2025-11-19T00:11:48Z, vcs-type=git, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:32:34 np0005548788.localdomain podman[73944]: 2025-12-06 08:32:34.436481688 +0000 UTC m=+0.251533633 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Dec 06 08:32:34 np0005548788.localdomain podman[73944]: 2025-12-06 08:32:34.444699242 +0000 UTC m=+0.259751197 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, version=17.1.12, 
vcs-type=git, container_name=iscsid, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:32:34 np0005548788.localdomain podman[73945]: 2025-12-06 08:32:34.52151001 +0000 UTC m=+0.334039298 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, release=1761123044)
Dec 06 08:32:34 np0005548788.localdomain podman[73945]: 2025-12-06 08:32:34.548690567 +0000 UTC m=+0.361219855 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:32:34 np0005548788.localdomain podman[73942]: 2025-12-06 08:32:34.347452095 +0000 UTC m=+0.166808533 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, container_name=collectd, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:32:34 np0005548788.localdomain podman[73941]: 2025-12-06 08:32:34.636682049 +0000 UTC m=+0.456844281 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:32:34 np0005548788.localdomain podman[73941]: 2025-12-06 08:32:34.674697101 +0000 UTC m=+0.494859333 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044)
Dec 06 08:32:34 np0005548788.localdomain podman[73942]: 2025-12-06 08:32:34.683668128 +0000 UTC m=+0.503024556 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:32:34 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:32:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:32:36 np0005548788.localdomain podman[74049]: 2025-12-06 08:32:36.246600029 +0000 UTC m=+0.080690888 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target)
Dec 06 08:32:36 np0005548788.localdomain podman[74049]: 2025-12-06 08:32:36.63985958 +0000 UTC m=+0.473950439 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044)
Dec 06 08:32:36 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:32:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:32:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:32:39 np0005548788.localdomain systemd[1]: tmp-crun.YHFP5x.mount: Deactivated successfully.
Dec 06 08:32:39 np0005548788.localdomain podman[74074]: 2025-12-06 08:32:39.27117418 +0000 UTC m=+0.093897854 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 06 08:32:39 np0005548788.localdomain podman[74074]: 2025-12-06 08:32:39.325755503 +0000 UTC m=+0.148479237 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4)
Dec 06 08:32:39 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:32:39 np0005548788.localdomain podman[74073]: 2025-12-06 08:32:39.327276689 +0000 UTC m=+0.151634824 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:32:39 np0005548788.localdomain podman[74073]: 2025-12-06 08:32:39.411103243 +0000 UTC m=+0.235461338 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true)
Dec 06 08:32:39 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:32:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:32:41 np0005548788.localdomain podman[74120]: 2025-12-06 08:32:41.269414759 +0000 UTC m=+0.093625296 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:32:41 np0005548788.localdomain podman[74120]: 2025-12-06 08:32:41.495798097 +0000 UTC m=+0.320008604 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:32:41 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:33:05 np0005548788.localdomain podman[74148]: 2025-12-06 08:33:05.260824212 +0000 UTC m=+0.079949081 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, distribution-scope=public, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-cron)
Dec 06 08:33:05 np0005548788.localdomain podman[74150]: 2025-12-06 08:33:05.313321799 +0000 UTC m=+0.132313474 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12)
Dec 06 08:33:05 np0005548788.localdomain podman[74149]: 2025-12-06 08:33:05.383137542 +0000 UTC m=+0.198650392 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4)
Dec 06 08:33:05 np0005548788.localdomain podman[74149]: 2025-12-06 08:33:05.393468935 +0000 UTC m=+0.208981775 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 08:33:05 np0005548788.localdomain podman[74148]: 2025-12-06 08:33:05.398758225 +0000 UTC m=+0.217883104 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:33:05 np0005548788.localdomain podman[74155]: 2025-12-06 08:33:05.343031519 +0000 UTC m=+0.149728702 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 06 08:33:05 np0005548788.localdomain podman[74155]: 2025-12-06 08:33:05.477500707 +0000 UTC m=+0.284197780 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:12:45Z, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:33:05 np0005548788.localdomain podman[74151]: 2025-12-06 08:33:05.532796051 +0000 UTC m=+0.341287118 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container)
Dec 06 08:33:05 np0005548788.localdomain podman[74151]: 2025-12-06 08:33:05.543552985 +0000 UTC m=+0.352044082 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z)
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:33:05 np0005548788.localdomain podman[74150]: 2025-12-06 08:33:05.598877309 +0000 UTC m=+0.417869044 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:33:05 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:33:06 np0005548788.localdomain systemd[1]: tmp-crun.MgTufK.mount: Deactivated successfully.
Dec 06 08:33:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:33:07 np0005548788.localdomain podman[74252]: 2025-12-06 08:33:07.249351175 +0000 UTC m=+0.078028961 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, container_name=nova_migration_target)
Dec 06 08:33:07 np0005548788.localdomain podman[74252]: 2025-12-06 08:33:07.626555058 +0000 UTC m=+0.455232824 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:33:07 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:33:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:33:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:33:10 np0005548788.localdomain podman[74275]: 2025-12-06 08:33:10.25552534 +0000 UTC m=+0.083998032 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:33:10 np0005548788.localdomain systemd[1]: tmp-crun.HgU6Ue.mount: Deactivated successfully.
Dec 06 08:33:10 np0005548788.localdomain podman[74275]: 2025-12-06 08:33:10.31862996 +0000 UTC m=+0.147102672 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:33:10 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:33:10 np0005548788.localdomain podman[74276]: 2025-12-06 08:33:10.321912319 +0000 UTC m=+0.146911716 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller)
Dec 06 08:33:10 np0005548788.localdomain podman[74276]: 2025-12-06 08:33:10.405668013 +0000 UTC m=+0.230667350 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:33:10 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:33:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:33:12 np0005548788.localdomain systemd[1]: tmp-crun.E48sjq.mount: Deactivated successfully.
Dec 06 08:33:12 np0005548788.localdomain podman[74324]: 2025-12-06 08:33:12.241317722 +0000 UTC m=+0.075162585 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=metrics_qdr, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Dec 06 08:33:12 np0005548788.localdomain podman[74324]: 2025-12-06 08:33:12.428237368 +0000 UTC m=+0.262082141 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:33:12 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:33:21 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:33:21 np0005548788.localdomain recover_tripleo_nova_virtqemud[74354]: 62021
Dec 06 08:33:21 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:33:21 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:33:28 np0005548788.localdomain sudo[74355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:33:28 np0005548788.localdomain sudo[74355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:33:28 np0005548788.localdomain sudo[74355]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:28 np0005548788.localdomain sudo[74370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:33:28 np0005548788.localdomain sudo[74370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:33:29 np0005548788.localdomain sudo[74370]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:30 np0005548788.localdomain sudo[74417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:33:30 np0005548788.localdomain sudo[74417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:33:30 np0005548788.localdomain sudo[74417]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:33:36 np0005548788.localdomain podman[74434]: 2025-12-06 08:33:36.279496181 +0000 UTC m=+0.097814541 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:33:36 np0005548788.localdomain podman[74434]: 2025-12-06 08:33:36.312593672 +0000 UTC m=+0.130912022 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git)
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:33:36 np0005548788.localdomain podman[74435]: 2025-12-06 08:33:36.332813174 +0000 UTC m=+0.149772412 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12)
Dec 06 08:33:36 np0005548788.localdomain podman[74433]: 2025-12-06 08:33:36.375916918 +0000 UTC m=+0.198471276 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, container_name=collectd, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 06 08:33:36 np0005548788.localdomain podman[74433]: 2025-12-06 08:33:36.387698415 +0000 UTC m=+0.210252783 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container)
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:33:36 np0005548788.localdomain podman[74436]: 2025-12-06 08:33:36.465694125 +0000 UTC m=+0.280870320 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc.)
Dec 06 08:33:36 np0005548788.localdomain podman[74436]: 2025-12-06 08:33:36.517583454 +0000 UTC m=+0.332759649 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, 
distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible)
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:33:36 np0005548788.localdomain podman[74432]: 2025-12-06 08:33:36.527718801 +0000 UTC m=+0.350284589 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:33:36 np0005548788.localdomain podman[74432]: 2025-12-06 08:33:36.538800497 +0000 UTC m=+0.361366255 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step4, version=17.1.12)
Dec 06 08:33:36 np0005548788.localdomain podman[74435]: 2025-12-06 08:33:36.548037466 +0000 UTC m=+0.364996694 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, container_name=iscsid)
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:33:36 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:33:36 np0005548788.localdomain sshd[74538]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:33:38 np0005548788.localdomain sshd[74538]: Received disconnect from 152.32.172.117 port 42048:11: Bye Bye [preauth]
Dec 06 08:33:38 np0005548788.localdomain sshd[74538]: Disconnected from authenticating user root 152.32.172.117 port 42048 [preauth]
Dec 06 08:33:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:33:38 np0005548788.localdomain podman[74540]: 2025-12-06 08:33:38.228495039 +0000 UTC m=+0.084204999 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, version=17.1.12, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 08:33:38 np0005548788.localdomain podman[74540]: 2025-12-06 08:33:38.595761031 +0000 UTC m=+0.451470941 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:33:38 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:33:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:33:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:33:41 np0005548788.localdomain podman[74564]: 2025-12-06 08:33:41.264533747 +0000 UTC m=+0.091252823 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, 
architecture=x86_64, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:33:41 np0005548788.localdomain podman[74564]: 2025-12-06 08:33:41.311815396 +0000 UTC m=+0.138534412 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true)
Dec 06 08:33:41 np0005548788.localdomain podman[74563]: 2025-12-06 08:33:41.311305511 +0000 UTC m=+0.140853492 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:33:41 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:33:41 np0005548788.localdomain podman[74563]: 2025-12-06 08:33:41.391817747 +0000 UTC m=+0.221365659 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z)
Dec 06 08:33:41 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:33:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:33:43 np0005548788.localdomain podman[74611]: 2025-12-06 08:33:43.250350549 +0000 UTC m=+0.084853738 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd)
Dec 06 08:33:43 np0005548788.localdomain podman[74611]: 2025-12-06 08:33:43.449928418 +0000 UTC m=+0.284431557 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:33:43 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:33:47 np0005548788.localdomain sudo[74686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lglisyqpanynubilafpkxlvucsxtsmcl ; /usr/bin/python3
Dec 06 08:33:47 np0005548788.localdomain sudo[74686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:47 np0005548788.localdomain python3[74688]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:33:48 np0005548788.localdomain sudo[74686]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:48 np0005548788.localdomain sudo[74731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tntbrtgmvbmlpbgzucuhyxqdaodlgrme ; /usr/bin/python3
Dec 06 08:33:48 np0005548788.localdomain sudo[74731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:48 np0005548788.localdomain python3[74733]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010027.678537-113368-137974659413020/source _original_basename=tmp485kajby follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:33:48 np0005548788.localdomain sudo[74731]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:49 np0005548788.localdomain sudo[74761]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-outbvfozcndxqoqletuxhcjhyydqzjsz ; /usr/bin/python3
Dec 06 08:33:49 np0005548788.localdomain sudo[74761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:49 np0005548788.localdomain python3[74763]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:33:49 np0005548788.localdomain sudo[74761]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:49 np0005548788.localdomain sudo[74811]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zofsjdvfjocexmyxzlknvqprfpzrofuq ; /usr/bin/python3
Dec 06 08:33:49 np0005548788.localdomain sudo[74811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:50 np0005548788.localdomain sudo[74811]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:50 np0005548788.localdomain sudo[74829]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkrucbhwzjiemueuszevtlmuprmqcvbg ; /usr/bin/python3
Dec 06 08:33:50 np0005548788.localdomain sudo[74829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:50 np0005548788.localdomain sudo[74829]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:51 np0005548788.localdomain sudo[74933]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjvhhuutzwprixzyxafvslkjxivwmzlz ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010030.6430557-113608-90099260215991/async_wrapper.py 940598834029 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010030.6430557-113608-90099260215991/AnsiballZ_command.py _
Dec 06 08:33:51 np0005548788.localdomain sudo[74933]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:33:51 np0005548788.localdomain ansible-async_wrapper.py[74935]: Invoked with 940598834029 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010030.6430557-113608-90099260215991/AnsiballZ_command.py _
Dec 06 08:33:51 np0005548788.localdomain ansible-async_wrapper.py[74938]: Starting module and watcher
Dec 06 08:33:51 np0005548788.localdomain ansible-async_wrapper.py[74938]: Start watching 74939 (3600)
Dec 06 08:33:51 np0005548788.localdomain ansible-async_wrapper.py[74939]: Start module (74939)
Dec 06 08:33:51 np0005548788.localdomain ansible-async_wrapper.py[74935]: Return async_wrapper task started.
Dec 06 08:33:51 np0005548788.localdomain sudo[74933]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:51 np0005548788.localdomain sudo[74955]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tixurxcxfwtczhbvhpqjryemuuwngkfr ; /usr/bin/python3
Dec 06 08:33:51 np0005548788.localdomain sudo[74955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:51 np0005548788.localdomain python3[74959]: ansible-ansible.legacy.async_status Invoked with jid=940598834029.74935 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:33:51 np0005548788.localdomain sudo[74955]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (file & line not available)
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (file & line not available)
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Notice: Compiled catalog for np0005548788.localdomain in environment production in 0.23 seconds
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Notice: Applied catalog in 0.36 seconds
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Application:
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    Initial environment: production
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    Converged environment: production
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:          Run mode: user
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Changes:
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Events:
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Resources:
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:             Total: 19
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Time:
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:          Schedule: 0.00
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:           Package: 0.00
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:            Augeas: 0.01
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:              Exec: 0.01
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:              File: 0.02
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:           Service: 0.08
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    Config retrieval: 0.30
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    Transaction evaluation: 0.35
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:    Catalog application: 0.36
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:          Last run: 1765010035
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:        Filebucket: 0.00
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:             Total: 0.37
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]: Version:
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:            Config: 1765010035
Dec 06 08:33:55 np0005548788.localdomain puppet-user[74957]:            Puppet: 7.10.0
Dec 06 08:33:55 np0005548788.localdomain ansible-async_wrapper.py[74939]: Module complete (74939)
Dec 06 08:33:56 np0005548788.localdomain ansible-async_wrapper.py[74938]: Done in kid B.
Dec 06 08:34:01 np0005548788.localdomain sudo[75096]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhokqtydjpwengtlpimnakwvsampcfzn ; /usr/bin/python3
Dec 06 08:34:01 np0005548788.localdomain sudo[75096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:01 np0005548788.localdomain python3[75098]: ansible-ansible.legacy.async_status Invoked with jid=940598834029.74935 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:34:01 np0005548788.localdomain sudo[75096]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:02 np0005548788.localdomain sudo[75112]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpyvflxtmzskaytiqtqwnqeaulwsjswi ; /usr/bin/python3
Dec 06 08:34:02 np0005548788.localdomain sudo[75112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:02 np0005548788.localdomain python3[75114]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:34:02 np0005548788.localdomain sudo[75112]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:02 np0005548788.localdomain sudo[75128]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibodgaqmthbionhtqtovkyflzybvwknk ; /usr/bin/python3
Dec 06 08:34:02 np0005548788.localdomain sudo[75128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:02 np0005548788.localdomain python3[75130]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:34:02 np0005548788.localdomain sudo[75128]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:03 np0005548788.localdomain sudo[75178]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysepclbjbwencriotaqazxbcbhrphkyu ; /usr/bin/python3
Dec 06 08:34:03 np0005548788.localdomain sudo[75178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:03 np0005548788.localdomain python3[75180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:03 np0005548788.localdomain sudo[75178]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:03 np0005548788.localdomain sudo[75196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbptapjtoogchijgmeyydxnbmqhswfhd ; /usr/bin/python3
Dec 06 08:34:03 np0005548788.localdomain sudo[75196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:03 np0005548788.localdomain python3[75198]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpro5y_5i1 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:34:03 np0005548788.localdomain sudo[75196]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:04 np0005548788.localdomain sudo[75226]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkiuygetbrplolcogmooxjamdshtmyfa ; /usr/bin/python3
Dec 06 08:34:04 np0005548788.localdomain sudo[75226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:04 np0005548788.localdomain python3[75228]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:04 np0005548788.localdomain sudo[75226]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:04 np0005548788.localdomain sudo[75242]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnyowbbuejhwysledeybelcnvxalcanh ; /usr/bin/python3
Dec 06 08:34:04 np0005548788.localdomain sudo[75242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:04 np0005548788.localdomain sudo[75242]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:05 np0005548788.localdomain sudo[75331]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzuwgkiryebckzfehiuvfmigrwsxsjmq ; /usr/bin/python3
Dec 06 08:34:05 np0005548788.localdomain sudo[75331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:05 np0005548788.localdomain python3[75333]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:34:05 np0005548788.localdomain sudo[75331]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:05 np0005548788.localdomain sudo[75350]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnusdhtljhlokqzbzbthkiqrhfokklcu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:05 np0005548788.localdomain sudo[75350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:06 np0005548788.localdomain python3[75352]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:06 np0005548788.localdomain sudo[75350]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:06 np0005548788.localdomain sudo[75366]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxmgnxgehsleifebztqzhpcsbtgdfpcx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:06 np0005548788.localdomain sudo[75366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:06 np0005548788.localdomain sudo[75366]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:06 np0005548788.localdomain sudo[75382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztksckuelrrjzrvxguslktdpclhtxfkz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:06 np0005548788.localdomain sudo[75382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:34:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:34:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:34:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:34:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:34:06 np0005548788.localdomain podman[75388]: 2025-12-06 08:34:06.82460033 +0000 UTC m=+0.077386903 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 08:34:06 np0005548788.localdomain systemd[1]: tmp-crun.QUwjF0.mount: Deactivated successfully.
Dec 06 08:34:06 np0005548788.localdomain python3[75384]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:34:06 np0005548788.localdomain podman[75389]: 2025-12-06 08:34:06.892323369 +0000 UTC m=+0.144847354 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Dec 06 08:34:06 np0005548788.localdomain sudo[75382]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:07 np0005548788.localdomain podman[75387]: 2025-12-06 08:34:07.002710328 +0000 UTC m=+0.257190763 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Dec 06 08:34:07 np0005548788.localdomain podman[75389]: 2025-12-06 08:34:07.012509095 +0000 UTC m=+0.265033090 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:34:07 np0005548788.localdomain podman[75385]: 2025-12-06 08:34:06.969927547 +0000 UTC m=+0.224399981 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-cron, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 08:34:07 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:34:07 np0005548788.localdomain podman[75385]: 2025-12-06 08:34:07.05002782 +0000 UTC m=+0.304500244 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:34:07 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:34:07 np0005548788.localdomain podman[75386]: 2025-12-06 08:34:07.092569287 +0000 UTC m=+0.348388552 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Dec 06 08:34:07 np0005548788.localdomain podman[75386]: 2025-12-06 08:34:07.105657263 +0000 UTC m=+0.361476528 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, managed_by=tripleo_ansible)
Dec 06 08:34:07 np0005548788.localdomain podman[75388]: 2025-12-06 08:34:07.114782239 +0000 UTC m=+0.367568802 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:34:07 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:34:07 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:34:07 np0005548788.localdomain podman[75387]: 2025-12-06 08:34:07.161117241 +0000 UTC m=+0.415597656 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:34:07 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:34:07 np0005548788.localdomain sudo[75539]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwdcghacsmtgzdbqoqancotdutwunupv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:07 np0005548788.localdomain sudo[75539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:07 np0005548788.localdomain python3[75541]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:07 np0005548788.localdomain sudo[75539]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:07 np0005548788.localdomain sudo[75557]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfnonmjyaokawxfovjoxnxwnnyxtyuva ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:07 np0005548788.localdomain sudo[75557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:07 np0005548788.localdomain python3[75559]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:07 np0005548788.localdomain sudo[75557]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:07 np0005548788.localdomain systemd[1]: tmp-crun.Lq7SD4.mount: Deactivated successfully.
Dec 06 08:34:08 np0005548788.localdomain sudo[75619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxdngrscicfwbldmmwgsjaqnwhbvhpxj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:08 np0005548788.localdomain sudo[75619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:08 np0005548788.localdomain python3[75621]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:08 np0005548788.localdomain sudo[75619]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:08 np0005548788.localdomain sudo[75637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjohtqfnpohevkbnmsbfaqakmlwnzdwh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:08 np0005548788.localdomain sudo[75637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:08 np0005548788.localdomain python3[75639]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:08 np0005548788.localdomain sudo[75637]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:08 np0005548788.localdomain sudo[75699]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iagajxuqtxakmkfgvblsgazoggcarmzg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:08 np0005548788.localdomain sudo[75699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:34:09 np0005548788.localdomain podman[75702]: 2025-12-06 08:34:09.097861509 +0000 UTC m=+0.087617602 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:34:09 np0005548788.localdomain python3[75701]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:09 np0005548788.localdomain sudo[75699]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:09 np0005548788.localdomain sudo[75739]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgciwopmgyiemfnlbwtyljlqnttrajgf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:09 np0005548788.localdomain sudo[75739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:09 np0005548788.localdomain python3[75742]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:09 np0005548788.localdomain podman[75702]: 2025-12-06 08:34:09.471254436 +0000 UTC m=+0.461010549 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Dec 06 08:34:09 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:34:09 np0005548788.localdomain sudo[75739]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:09 np0005548788.localdomain sudo[75802]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fssiqgohcecmvsyajctrxljifwtonsqh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:09 np0005548788.localdomain sudo[75802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:10 np0005548788.localdomain python3[75804]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:10 np0005548788.localdomain sudo[75802]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:10 np0005548788.localdomain sudo[75820]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsrzhemtfutmgnofaexzetmvpvajaukg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:10 np0005548788.localdomain sudo[75820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:10 np0005548788.localdomain python3[75822]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:10 np0005548788.localdomain sudo[75820]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:10 np0005548788.localdomain sudo[75850]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsfarojekdhqrhejlvbcmbwiuoiahaeh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:10 np0005548788.localdomain sudo[75850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:10 np0005548788.localdomain python3[75852]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:34:10 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:34:10 np0005548788.localdomain systemd-sysv-generator[75882]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:34:10 np0005548788.localdomain systemd-rc-local-generator[75877]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:34:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:34:11 np0005548788.localdomain sudo[75850]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:11 np0005548788.localdomain sudo[75936]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blppoalktylntzkegwewcfwennigomfy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:11 np0005548788.localdomain sudo[75936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:34:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:34:11 np0005548788.localdomain podman[75938]: 2025-12-06 08:34:11.668104314 +0000 UTC m=+0.092265322 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, 
vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1)
Dec 06 08:34:11 np0005548788.localdomain podman[75940]: 2025-12-06 08:34:11.714512348 +0000 UTC m=+0.138752409 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Dec 06 08:34:11 np0005548788.localdomain podman[75938]: 2025-12-06 08:34:11.719032945 +0000 UTC m=+0.143193863 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:34:11 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:34:11 np0005548788.localdomain python3[75939]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:11 np0005548788.localdomain podman[75940]: 2025-12-06 08:34:11.74761034 +0000 UTC m=+0.171850401 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 06 08:34:11 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:34:11 np0005548788.localdomain sudo[75936]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:11 np0005548788.localdomain sudo[75999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aocoshrofohbvsajkhnfxtcjpthuabuf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:11 np0005548788.localdomain sudo[75999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:12 np0005548788.localdomain python3[76001]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:12 np0005548788.localdomain sudo[75999]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:12 np0005548788.localdomain sudo[76061]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itnibckevapamoggrotqztpfdqyxkuac ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:12 np0005548788.localdomain sudo[76061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:12 np0005548788.localdomain python3[76063]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:12 np0005548788.localdomain sudo[76061]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:12 np0005548788.localdomain sudo[76079]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vausfpcotrutfavspoeraqpfmljfmabs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:12 np0005548788.localdomain sudo[76079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:12 np0005548788.localdomain python3[76081]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:12 np0005548788.localdomain sudo[76079]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:13 np0005548788.localdomain sudo[76109]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxlqybouoxaguxqfdmkglmlrphymtsnc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:13 np0005548788.localdomain sudo[76109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:13 np0005548788.localdomain python3[76111]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:34:13 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:34:13 np0005548788.localdomain systemd-rc-local-generator[76134]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:34:13 np0005548788.localdomain systemd-sysv-generator[76140]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:34:13 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:34:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:34:13 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:34:13 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:34:13 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:34:13 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:34:13 np0005548788.localdomain podman[76148]: 2025-12-06 08:34:13.875362547 +0000 UTC m=+0.105841453 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 06 08:34:13 np0005548788.localdomain sudo[76109]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:14 np0005548788.localdomain podman[76148]: 2025-12-06 08:34:14.082771793 +0000 UTC m=+0.313250749 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:34:14 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:34:14 np0005548788.localdomain sudo[76194]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wilzmevacjoeqameruljcjlmjqtoapex ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:14 np0005548788.localdomain sudo[76194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:14 np0005548788.localdomain python3[76196]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:34:14 np0005548788.localdomain sudo[76194]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:14 np0005548788.localdomain sudo[76210]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peniqesbymdjudadhorxgdafxjutrsbu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:14 np0005548788.localdomain sudo[76210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:15 np0005548788.localdomain sudo[76210]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:16 np0005548788.localdomain sudo[76252]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atmeyhwknysczhhysinuhmxkbctyamcx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:16 np0005548788.localdomain sudo[76252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:16 np0005548788.localdomain python3[76254]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:34:16 np0005548788.localdomain podman[76292]: 2025-12-06 08:34:16.63426411 +0000 UTC m=+0.068725650 container create 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:34:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.scope.
Dec 06 08:34:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:34:16 np0005548788.localdomain podman[76292]: 2025-12-06 08:34:16.597024183 +0000 UTC m=+0.031485743 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:34:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c45bd41b522a65233004a5964b2c23041fe2bca744b4069684cef8b44acf2fd/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c45bd41b522a65233004a5964b2c23041fe2bca744b4069684cef8b44acf2fd/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c45bd41b522a65233004a5964b2c23041fe2bca744b4069684cef8b44acf2fd/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c45bd41b522a65233004a5964b2c23041fe2bca744b4069684cef8b44acf2fd/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c45bd41b522a65233004a5964b2c23041fe2bca744b4069684cef8b44acf2fd/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:34:16 np0005548788.localdomain podman[76292]: 2025-12-06 08:34:16.730368118 +0000 UTC m=+0.164829688 container init 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, version=17.1.12, release=1761123044, container_name=nova_compute, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 06 08:34:16 np0005548788.localdomain sudo[76312]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:34:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:34:16 np0005548788.localdomain podman[76292]: 2025-12-06 08:34:16.774116981 +0000 UTC m=+0.208578521 container start 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-type=git)
Dec 06 08:34:16 np0005548788.localdomain systemd-logind[765]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:34:16 np0005548788.localdomain python3[76254]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:34:16 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 08:34:16 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 08:34:16 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 08:34:16 np0005548788.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 08:34:16 np0005548788.localdomain systemd[76335]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:34:16 np0005548788.localdomain podman[76313]: 2025-12-06 08:34:16.945442825 +0000 UTC m=+0.157381373 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Queued start job for default target Main User Target.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Created slice User Application Slice.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Reached target Paths.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Reached target Timers.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Starting D-Bus User Message Bus Socket...
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Starting Create User's Volatile Files and Directories...
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Reached target Sockets.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Finished Create User's Volatile Files and Directories.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Reached target Basic System.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Reached target Main User Target.
Dec 06 08:34:17 np0005548788.localdomain systemd[76335]: Startup finished in 167ms.
Dec 06 08:34:17 np0005548788.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 08:34:17 np0005548788.localdomain podman[76313]: 2025-12-06 08:34:17.048598086 +0000 UTC m=+0.260536664 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:34:17 np0005548788.localdomain podman[76313]: unhealthy
Dec 06 08:34:17 np0005548788.localdomain systemd[1]: Started Session c10 of User root.
Dec 06 08:34:17 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:34:17 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 08:34:17 np0005548788.localdomain sudo[76312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Dec 06 08:34:17 np0005548788.localdomain sudo[76312]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:17 np0005548788.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Dec 06 08:34:17 np0005548788.localdomain podman[76408]: 2025-12-06 08:34:17.337250049 +0000 UTC m=+0.079885187 container create bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_wait_for_compute_service, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:34:17 np0005548788.localdomain podman[76408]: 2025-12-06 08:34:17.292931948 +0000 UTC m=+0.035567076 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:34:17 np0005548788.localdomain systemd[1]: Started libpod-conmon-bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5.scope.
Dec 06 08:34:17 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 08:34:17 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b38e1b3ae4684200fe2ade3176809882413b165be2193c39c12f1ac0f693972/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:17 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b38e1b3ae4684200fe2ade3176809882413b165be2193c39c12f1ac0f693972/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:17 np0005548788.localdomain podman[76408]: 2025-12-06 08:34:17.426745007 +0000 UTC m=+0.169380115 container init bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_wait_for_compute_service, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, 
distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 08:34:17 np0005548788.localdomain podman[76408]: 2025-12-06 08:34:17.438317987 +0000 UTC m=+0.180953085 container start bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, build-date=2025-11-19T00:36:58Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', 
'/var/lib/container-config-scripts:/container-config-scripts']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, container_name=nova_wait_for_compute_service, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 08:34:17 np0005548788.localdomain podman[76408]: 2025-12-06 08:34:17.438631697 +0000 UTC m=+0.181266845 container attach bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_wait_for_compute_service, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:34:17 np0005548788.localdomain sudo[76428]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:34:17 np0005548788.localdomain sudo[76428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Dec 06 08:34:17 np0005548788.localdomain sudo[76428]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:27 np0005548788.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Activating special unit Exit the Session...
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Stopped target Main User Target.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Stopped target Basic System.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Stopped target Paths.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Stopped target Sockets.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Stopped target Timers.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Closed D-Bus User Message Bus Socket.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Stopped Create User's Volatile Files and Directories.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Removed slice User Application Slice.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Reached target Shutdown.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Finished Exit the Session.
Dec 06 08:34:27 np0005548788.localdomain systemd[76335]: Reached target Exit the Session.
Dec 06 08:34:27 np0005548788.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 08:34:27 np0005548788.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 08:34:27 np0005548788.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 08:34:27 np0005548788.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 08:34:27 np0005548788.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 08:34:27 np0005548788.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 08:34:27 np0005548788.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 08:34:30 np0005548788.localdomain sudo[76434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:34:30 np0005548788.localdomain sudo[76434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:34:30 np0005548788.localdomain sudo[76434]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:30 np0005548788.localdomain sudo[76449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:34:30 np0005548788.localdomain sudo[76449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:34:31 np0005548788.localdomain sudo[76449]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:31 np0005548788.localdomain sudo[76495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:34:31 np0005548788.localdomain sudo[76495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:34:31 np0005548788.localdomain sudo[76495]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:34:37 np0005548788.localdomain podman[76510]: 2025-12-06 08:34:37.28444084 +0000 UTC m=+0.099210823 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Dec 06 08:34:37 np0005548788.localdomain podman[76513]: 2025-12-06 08:34:37.330154483 +0000 UTC m=+0.142113781 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 06 08:34:37 np0005548788.localdomain podman[76514]: 2025-12-06 08:34:37.397960634 +0000 UTC m=+0.202356123 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:11:48Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1)
Dec 06 08:34:37 np0005548788.localdomain podman[76513]: 2025-12-06 08:34:37.416787734 +0000 UTC m=+0.228747072 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., architecture=x86_64)
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:34:37 np0005548788.localdomain podman[76514]: 2025-12-06 08:34:37.439685187 +0000 UTC m=+0.244080676 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, architecture=x86_64, tcib_managed=true, 
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:34:37 np0005548788.localdomain podman[76510]: 2025-12-06 08:34:37.450381561 +0000 UTC m=+0.265151594 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, 
build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:34:37 np0005548788.localdomain podman[76511]: 2025-12-06 08:34:37.537366552 +0000 UTC m=+0.352108514 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:34:37 np0005548788.localdomain podman[76512]: 2025-12-06 08:34:37.598397629 +0000 UTC m=+0.412980176 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Dec 06 08:34:37 np0005548788.localdomain podman[76511]: 2025-12-06 08:34:37.603025009 +0000 UTC m=+0.417766991 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:34:37 np0005548788.localdomain podman[76512]: 2025-12-06 08:34:37.614997792 +0000 UTC m=+0.429580279 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, distribution-scope=public, 
config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:34:37 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:34:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:34:40 np0005548788.localdomain podman[76623]: 2025-12-06 08:34:40.276002693 +0000 UTC m=+0.101437961 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12)
Dec 06 08:34:40 np0005548788.localdomain podman[76623]: 2025-12-06 08:34:40.638581822 +0000 UTC m=+0.464017070 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-type=git, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:34:40 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:34:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:34:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:34:42 np0005548788.localdomain podman[76646]: 2025-12-06 08:34:42.253432622 +0000 UTC m=+0.077306661 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, version=17.1.12)
Dec 06 08:34:42 np0005548788.localdomain podman[76645]: 2025-12-06 08:34:42.311530349 +0000 UTC m=+0.135862811 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, architecture=x86_64, io.openshift.expose-services=)
Dec 06 08:34:42 np0005548788.localdomain podman[76646]: 2025-12-06 08:34:42.330203904 +0000 UTC m=+0.154077873 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:34:42 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:34:42 np0005548788.localdomain podman[76645]: 2025-12-06 08:34:42.386085325 +0000 UTC m=+0.210417767 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:34:42 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:34:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:34:44 np0005548788.localdomain podman[76693]: 2025-12-06 08:34:44.276864922 +0000 UTC m=+0.101713278 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:34:44 np0005548788.localdomain podman[76693]: 2025-12-06 08:34:44.491617599 +0000 UTC m=+0.316466015 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, container_name=metrics_qdr)
Dec 06 08:34:44 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:34:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:34:47 np0005548788.localdomain podman[76722]: 2025-12-06 08:34:47.260581267 +0000 UTC m=+0.081897538 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step5)
Dec 06 08:34:47 np0005548788.localdomain podman[76722]: 2025-12-06 08:34:47.324896543 +0000 UTC m=+0.146212844 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:34:47 np0005548788.localdomain podman[76722]: unhealthy
Dec 06 08:34:47 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:34:47 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 08:34:54 np0005548788.localdomain sshd[76745]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:34:55 np0005548788.localdomain sshd[76745]: Received disconnect from 152.32.172.117 port 47464:11: Bye Bye [preauth]
Dec 06 08:34:55 np0005548788.localdomain sshd[76745]: Disconnected from authenticating user root 152.32.172.117 port 47464 [preauth]
Dec 06 08:34:55 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:34:55 np0005548788.localdomain recover_tripleo_nova_virtqemud[76748]: 62021
Dec 06 08:34:55 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:34:55 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:35:08 np0005548788.localdomain podman[76750]: 2025-12-06 08:35:08.290603649 +0000 UTC m=+0.110689491 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:35:08 np0005548788.localdomain podman[76749]: 2025-12-06 08:35:08.270117129 +0000 UTC m=+0.092023475 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=logrotate_crond)
Dec 06 08:35:08 np0005548788.localdomain podman[76751]: 2025-12-06 08:35:08.334250329 +0000 UTC m=+0.151492725 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:35:08 np0005548788.localdomain podman[76759]: 2025-12-06 08:35:08.375953631 +0000 UTC m=+0.183446831 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Dec 06 08:35:08 np0005548788.localdomain podman[76751]: 2025-12-06 08:35:08.387012685 +0000 UTC m=+0.204255121 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:35:08 np0005548788.localdomain podman[76752]: 2025-12-06 08:35:08.432371378 +0000 UTC m=+0.243933602 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.openshift.expose-services=, architecture=x86_64)
Dec 06 08:35:08 np0005548788.localdomain podman[76752]: 2025-12-06 08:35:08.44366259 +0000 UTC m=+0.255224854 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:35:08 np0005548788.localdomain podman[76749]: 2025-12-06 08:35:08.455377785 +0000 UTC m=+0.277284131 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc.)
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:35:08 np0005548788.localdomain podman[76750]: 2025-12-06 08:35:08.506683597 +0000 UTC m=+0.326769429 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:35:08 np0005548788.localdomain podman[76759]: 2025-12-06 08:35:08.558153394 +0000 UTC m=+0.365646554 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true)
Dec 06 08:35:08 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:35:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:35:11 np0005548788.localdomain podman[76850]: 2025-12-06 08:35:11.256578028 +0000 UTC m=+0.087214651 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, release=1761123044)
Dec 06 08:35:11 np0005548788.localdomain podman[76850]: 2025-12-06 08:35:11.636890964 +0000 UTC m=+0.467527567 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:35:11 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:35:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:35:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:35:13 np0005548788.localdomain podman[76873]: 2025-12-06 08:35:13.241020559 +0000 UTC m=+0.073696340 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 08:35:13 np0005548788.localdomain podman[76873]: 2025-12-06 08:35:13.273368038 +0000 UTC m=+0.106043789 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:35:13 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:35:13 np0005548788.localdomain podman[76874]: 2025-12-06 08:35:13.353800271 +0000 UTC m=+0.179867913 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T23:34:05Z, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:35:13 np0005548788.localdomain podman[76874]: 2025-12-06 08:35:13.401594487 +0000 UTC m=+0.227662129 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, url=https://www.redhat.com, container_name=ovn_controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git)
Dec 06 08:35:13 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:35:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:35:15 np0005548788.localdomain systemd[1]: tmp-crun.2enRom.mount: Deactivated successfully.
Dec 06 08:35:15 np0005548788.localdomain podman[76919]: 2025-12-06 08:35:15.264340636 +0000 UTC m=+0.091082667 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:35:15 np0005548788.localdomain podman[76919]: 2025-12-06 08:35:15.468088831 +0000 UTC m=+0.294830882 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com)
Dec 06 08:35:15 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:35:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:35:18 np0005548788.localdomain podman[76949]: 2025-12-06 08:35:18.265742427 +0000 UTC m=+0.093282494 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:35:18 np0005548788.localdomain podman[76949]: 2025-12-06 08:35:18.325535765 +0000 UTC m=+0.153075852 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:35:18 np0005548788.localdomain podman[76949]: unhealthy
Dec 06 08:35:18 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:35:18 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 08:35:19 np0005548788.localdomain sshd[35853]: Received disconnect from 192.168.122.100 port 60464:11: disconnected by user
Dec 06 08:35:19 np0005548788.localdomain sshd[35853]: Disconnected from user zuul 192.168.122.100 port 60464
Dec 06 08:35:19 np0005548788.localdomain sshd[35850]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:35:19 np0005548788.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Dec 06 08:35:19 np0005548788.localdomain systemd[1]: session-27.scope: Consumed 3.128s CPU time.
Dec 06 08:35:19 np0005548788.localdomain systemd-logind[765]: Session 27 logged out. Waiting for processes to exit.
Dec 06 08:35:19 np0005548788.localdomain systemd-logind[765]: Removed session 27.
Dec 06 08:35:31 np0005548788.localdomain sudo[76971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:35:31 np0005548788.localdomain sudo[76971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:35:31 np0005548788.localdomain sudo[76971]: pam_unix(sudo:session): session closed for user root
Dec 06 08:35:32 np0005548788.localdomain sudo[76986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:35:32 np0005548788.localdomain sudo[76986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:35:33 np0005548788.localdomain sudo[76986]: pam_unix(sudo:session): session closed for user root
Dec 06 08:35:33 np0005548788.localdomain sudo[77033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:35:33 np0005548788.localdomain sudo[77033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:35:33 np0005548788.localdomain sudo[77033]: pam_unix(sudo:session): session closed for user root
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: tmp-crun.RPW6PU.mount: Deactivated successfully.
Dec 06 08:35:39 np0005548788.localdomain podman[77051]: 2025-12-06 08:35:39.249770877 +0000 UTC m=+0.068644968 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12)
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: tmp-crun.MuCZaG.mount: Deactivated successfully.
Dec 06 08:35:39 np0005548788.localdomain podman[77052]: 2025-12-06 08:35:39.271392381 +0000 UTC m=+0.084156957 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:35:39 np0005548788.localdomain podman[77050]: 2025-12-06 08:35:39.317995581 +0000 UTC m=+0.137746079 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., 
batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:35:39 np0005548788.localdomain podman[77049]: 2025-12-06 08:35:39.291800518 +0000 UTC m=+0.114027840 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vcs-type=git, config_id=tripleo_step3, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack 
Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:35:39 np0005548788.localdomain podman[77051]: 2025-12-06 08:35:39.34539799 +0000 UTC m=+0.164272081 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, container_name=iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 06 08:35:39 np0005548788.localdomain podman[77052]: 2025-12-06 08:35:39.353164465 +0000 UTC m=+0.165929091 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:35:39 np0005548788.localdomain podman[77050]: 2025-12-06 08:35:39.374535352 +0000 UTC m=+0.194285810 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:35:39 np0005548788.localdomain podman[77048]: 2025-12-06 08:35:39.358486186 +0000 UTC m=+0.179806061 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, 
container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, version=17.1.12)
Dec 06 08:35:39 np0005548788.localdomain podman[77049]: 2025-12-06 08:35:39.427755222 +0000 UTC m=+0.249982614 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, release=1761123044, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd)
Dec 06 08:35:39 np0005548788.localdomain podman[77048]: 2025-12-06 08:35:39.442756236 +0000 UTC m=+0.264076131 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:35:39 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:35:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:35:42 np0005548788.localdomain podman[77154]: 2025-12-06 08:35:42.273974577 +0000 UTC m=+0.094400188 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:35:42 np0005548788.localdomain podman[77154]: 2025-12-06 08:35:42.652715646 +0000 UTC m=+0.473141217 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_migration_target)
Dec 06 08:35:42 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:35:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:35:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:35:44 np0005548788.localdomain systemd[1]: tmp-crun.GP9zgH.mount: Deactivated successfully.
Dec 06 08:35:44 np0005548788.localdomain podman[77178]: 2025-12-06 08:35:44.270602726 +0000 UTC m=+0.099682946 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git)
Dec 06 08:35:44 np0005548788.localdomain podman[77179]: 2025-12-06 08:35:44.367730545 +0000 UTC m=+0.193127804 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, vcs-type=git, container_name=ovn_controller, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team)
Dec 06 08:35:44 np0005548788.localdomain podman[77179]: 2025-12-06 08:35:44.392519215 +0000 UTC m=+0.217916444 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, architecture=x86_64)
Dec 06 08:35:44 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:35:44 np0005548788.localdomain podman[77178]: 2025-12-06 08:35:44.446465077 +0000 UTC m=+0.275545307 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com)
Dec 06 08:35:44 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:35:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:35:46 np0005548788.localdomain podman[77227]: 2025-12-06 08:35:46.265178895 +0000 UTC m=+0.090272302 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:35:46 np0005548788.localdomain podman[77227]: 2025-12-06 08:35:46.494879494 +0000 UTC m=+0.319972881 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:35:46 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:35:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:35:49 np0005548788.localdomain podman[77256]: 2025-12-06 08:35:49.265134751 +0000 UTC m=+0.089842579 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 08:35:49 np0005548788.localdomain podman[77256]: 2025-12-06 08:35:49.29912841 +0000 UTC m=+0.123836198 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute)
Dec 06 08:35:49 np0005548788.localdomain podman[77256]: unhealthy
Dec 06 08:35:49 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:35:49 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 08:36:01 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:36:01 np0005548788.localdomain recover_tripleo_nova_virtqemud[77279]: 62021
Dec 06 08:36:01 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:36:01 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:36:10 np0005548788.localdomain podman[77281]: 2025-12-06 08:36:10.265864068 +0000 UTC m=+0.088134888 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Dec 06 08:36:10 np0005548788.localdomain podman[77282]: 2025-12-06 08:36:10.329551405 +0000 UTC m=+0.150522086 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git)
Dec 06 08:36:10 np0005548788.localdomain podman[77282]: 2025-12-06 08:36:10.362690257 +0000 UTC m=+0.183660978 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1761123044, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:11:48Z, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:36:10 np0005548788.localdomain podman[77280]: 2025-12-06 08:36:10.36443494 +0000 UTC m=+0.192131105 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, container_name=logrotate_crond, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 08:36:10 np0005548788.localdomain podman[77283]: 2025-12-06 08:36:10.42658397 +0000 UTC m=+0.243520359 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-type=git)
Dec 06 08:36:10 np0005548788.localdomain podman[77280]: 2025-12-06 08:36:10.445501683 +0000 UTC m=+0.273197848 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, 
url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible)
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:36:10 np0005548788.localdomain podman[77283]: 2025-12-06 08:36:10.459467645 +0000 UTC m=+0.276404014 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=iscsid, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:36:10 np0005548788.localdomain podman[77289]: 2025-12-06 08:36:10.293348949 +0000 UTC m=+0.103411219 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 08:36:10 np0005548788.localdomain podman[77281]: 2025-12-06 08:36:10.497390423 +0000 UTC m=+0.319661293 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:36:10 np0005548788.localdomain podman[77289]: 2025-12-06 08:36:10.528449303 +0000 UTC m=+0.338511583 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:36:10 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:36:11 np0005548788.localdomain sshd[77393]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:36:12 np0005548788.localdomain sshd[77393]: Received disconnect from 152.32.172.117 port 40320:11: Bye Bye [preauth]
Dec 06 08:36:12 np0005548788.localdomain sshd[77393]: Disconnected from authenticating user root 152.32.172.117 port 40320 [preauth]
Dec 06 08:36:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:36:13 np0005548788.localdomain podman[77395]: 2025-12-06 08:36:13.248345046 +0000 UTC m=+0.078379533 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.buildah.version=1.41.4, container_name=nova_migration_target, version=17.1.12, architecture=x86_64)
Dec 06 08:36:13 np0005548788.localdomain podman[77395]: 2025-12-06 08:36:13.624490676 +0000 UTC m=+0.454525143 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.12, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:36:13 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:36:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:36:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:36:15 np0005548788.localdomain podman[77419]: 2025-12-06 08:36:15.261423052 +0000 UTC m=+0.083468416 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:36:15 np0005548788.localdomain podman[77419]: 2025-12-06 08:36:15.312358073 +0000 UTC m=+0.134403397 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, container_name=ovn_controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:36:15 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:36:15 np0005548788.localdomain podman[77418]: 2025-12-06 08:36:15.312069305 +0000 UTC m=+0.137116150 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:36:15 np0005548788.localdomain podman[77418]: 2025-12-06 08:36:15.401678906 +0000 UTC m=+0.226725801 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, container_name=ovn_metadata_agent, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:36:15 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:36:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:36:17 np0005548788.localdomain podman[77467]: 2025-12-06 08:36:17.2639592 +0000 UTC m=+0.091545821 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12)
Dec 06 08:36:17 np0005548788.localdomain podman[77467]: 2025-12-06 08:36:17.513593213 +0000 UTC m=+0.341179834 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:36:17 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:36:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:36:20 np0005548788.localdomain podman[77495]: 2025-12-06 08:36:20.259468363 +0000 UTC m=+0.082039523 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step5, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:36:20 np0005548788.localdomain podman[77495]: 2025-12-06 08:36:20.347946411 +0000 UTC m=+0.170517531 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute)
Dec 06 08:36:20 np0005548788.localdomain podman[77495]: unhealthy
Dec 06 08:36:20 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:36:20 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 08:36:33 np0005548788.localdomain sudo[77517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:36:33 np0005548788.localdomain sudo[77517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:36:33 np0005548788.localdomain sudo[77517]: pam_unix(sudo:session): session closed for user root
Dec 06 08:36:33 np0005548788.localdomain sudo[77532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:36:33 np0005548788.localdomain sudo[77532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:36:34 np0005548788.localdomain sudo[77532]: pam_unix(sudo:session): session closed for user root
Dec 06 08:36:35 np0005548788.localdomain sudo[77579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:36:35 np0005548788.localdomain sudo[77579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:36:35 np0005548788.localdomain sudo[77579]: pam_unix(sudo:session): session closed for user root
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:36:41 np0005548788.localdomain podman[77596]: 2025-12-06 08:36:41.290751695 +0000 UTC m=+0.106654788 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:36:41 np0005548788.localdomain podman[77596]: 2025-12-06 08:36:41.375647303 +0000 UTC m=+0.191550376 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: tmp-crun.0EXwJU.mount: Deactivated successfully.
Dec 06 08:36:41 np0005548788.localdomain podman[77597]: 2025-12-06 08:36:41.390068429 +0000 UTC m=+0.202968871 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container)
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:36:41 np0005548788.localdomain podman[77597]: 2025-12-06 08:36:41.429006078 +0000 UTC m=+0.241906440 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, version=17.1.12)
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:36:41 np0005548788.localdomain podman[77599]: 2025-12-06 08:36:41.447035063 +0000 UTC m=+0.252919593 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git)
Dec 06 08:36:41 np0005548788.localdomain podman[77595]: 2025-12-06 08:36:41.354187074 +0000 UTC m=+0.171305584 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 06 08:36:41 np0005548788.localdomain podman[77595]: 2025-12-06 08:36:41.484880988 +0000 UTC m=+0.301999508 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, tcib_managed=true, release=1761123044, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vcs-type=git, container_name=collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:36:41 np0005548788.localdomain podman[77599]: 2025-12-06 08:36:41.506796221 +0000 UTC m=+0.312680801 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., 
batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:36:41 np0005548788.localdomain podman[77594]: 2025-12-06 08:36:41.486429376 +0000 UTC m=+0.306188916 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, tcib_managed=true, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Dec 06 08:36:41 np0005548788.localdomain podman[77594]: 2025-12-06 08:36:41.566518048 +0000 UTC m=+0.386277538 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 08:36:41 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:36:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:36:44 np0005548788.localdomain systemd[1]: tmp-crun.67WeWX.mount: Deactivated successfully.
Dec 06 08:36:44 np0005548788.localdomain podman[77703]: 2025-12-06 08:36:44.263716985 +0000 UTC m=+0.089968363 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, container_name=nova_migration_target, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:36:44 np0005548788.localdomain podman[77703]: 2025-12-06 08:36:44.594025869 +0000 UTC m=+0.420277277 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 
nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044)
Dec 06 08:36:44 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:36:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:36:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:36:46 np0005548788.localdomain podman[77726]: 2025-12-06 08:36:46.259629603 +0000 UTC m=+0.088923971 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, tcib_managed=true)
Dec 06 08:36:46 np0005548788.localdomain podman[77727]: 2025-12-06 08:36:46.315854214 +0000 UTC m=+0.141705218 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, vendor=Red Hat, Inc., 
container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Dec 06 08:36:46 np0005548788.localdomain podman[77726]: 2025-12-06 08:36:46.332652112 +0000 UTC m=+0.161946500 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z)
Dec 06 08:36:46 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:36:46 np0005548788.localdomain podman[77727]: 2025-12-06 08:36:46.375834789 +0000 UTC m=+0.201685763 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, release=1761123044, architecture=x86_64, vcs-type=git)
Dec 06 08:36:46 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:36:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:36:48 np0005548788.localdomain systemd[1]: tmp-crun.nldd4F.mount: Deactivated successfully.
Dec 06 08:36:48 np0005548788.localdomain podman[77772]: 2025-12-06 08:36:48.267960116 +0000 UTC m=+0.095255173 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 06 08:36:48 np0005548788.localdomain podman[77772]: 2025-12-06 08:36:48.477685452 +0000 UTC m=+0.304980499 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, distribution-scope=public, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:36:48 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:36:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:36:51 np0005548788.localdomain systemd[1]: tmp-crun.g6xpFh.mount: Deactivated successfully.
Dec 06 08:36:51 np0005548788.localdomain podman[77802]: 2025-12-06 08:36:51.253352253 +0000 UTC m=+0.078997012 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z)
Dec 06 08:36:51 np0005548788.localdomain podman[77802]: 2025-12-06 08:36:51.315278736 +0000 UTC m=+0.140923495 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, container_name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:36:51 np0005548788.localdomain podman[77802]: unhealthy
Dec 06 08:36:51 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:36:51 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:37:12 np0005548788.localdomain podman[77825]: 2025-12-06 08:37:12.266931374 +0000 UTC m=+0.086594361 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, release=1761123044, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:37:12 np0005548788.localdomain podman[77825]: 2025-12-06 08:37:12.285503287 +0000 UTC m=+0.105166264 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com)
Dec 06 08:37:12 np0005548788.localdomain podman[77826]: 2025-12-06 08:37:12.321187076 +0000 UTC m=+0.138441210 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public)
Dec 06 08:37:12 np0005548788.localdomain podman[77824]: 2025-12-06 08:37:12.2843178 +0000 UTC m=+0.105689148 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public)
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:37:12 np0005548788.localdomain podman[77824]: 2025-12-06 08:37:12.369627052 +0000 UTC m=+0.190998330 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:37:12 np0005548788.localdomain podman[77826]: 2025-12-06 08:37:12.380521962 +0000 UTC m=+0.197776056 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, release=1761123044, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:37:12 np0005548788.localdomain podman[77828]: 2025-12-06 08:37:12.419336456 +0000 UTC m=+0.232194506 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:37:12 np0005548788.localdomain podman[77827]: 2025-12-06 08:37:12.384926854 +0000 UTC m=+0.199084214 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, container_name=iscsid)
Dec 06 08:37:12 np0005548788.localdomain podman[77828]: 2025-12-06 08:37:12.467635957 +0000 UTC m=+0.280493997 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64)
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:37:12 np0005548788.localdomain podman[77827]: 2025-12-06 08:37:12.51863413 +0000 UTC m=+0.332791480 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container)
Dec 06 08:37:12 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:37:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:37:15 np0005548788.localdomain podman[77934]: 2025-12-06 08:37:15.256327641 +0000 UTC m=+0.088180339 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:37:15 np0005548788.localdomain podman[77934]: 2025-12-06 08:37:15.634718069 +0000 UTC m=+0.466570807 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64)
Dec 06 08:37:15 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:37:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:37:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:37:17 np0005548788.localdomain podman[77959]: 2025-12-06 08:37:17.259375064 +0000 UTC m=+0.089045876 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, build-date=2025-11-19T00:14:25Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 08:37:17 np0005548788.localdomain podman[77960]: 2025-12-06 08:37:17.308086219 +0000 UTC m=+0.135311557 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Dec 06 08:37:17 np0005548788.localdomain podman[77960]: 2025-12-06 08:37:17.328754854 +0000 UTC m=+0.155980202 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Dec 06 08:37:17 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:37:17 np0005548788.localdomain podman[77959]: 2025-12-06 08:37:17.360304168 +0000 UTC m=+0.189975020 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, distribution-scope=public, container_name=ovn_metadata_agent, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Dec 06 08:37:17 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:37:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:37:19 np0005548788.localdomain podman[78008]: 2025-12-06 08:37:19.257290523 +0000 UTC m=+0.084606221 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:37:19 np0005548788.localdomain podman[78008]: 2025-12-06 08:37:19.486658043 +0000 UTC m=+0.313973691 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Dec 06 08:37:19 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:37:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:37:22 np0005548788.localdomain podman[78124]: 2025-12-06 08:37:22.236119967 +0000 UTC m=+0.066121754 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:37:22 np0005548788.localdomain podman[78124]: 2025-12-06 08:37:22.297828834 +0000 UTC m=+0.127830641 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=)
Dec 06 08:37:22 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:37:24 np0005548788.localdomain sshd[78152]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:37:25 np0005548788.localdomain sshd[78154]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:37:25 np0005548788.localdomain sshd[78154]: error: kex_exchange_identification: client sent invalid protocol identifier "MGLNDD_38.102.83.97_22"
Dec 06 08:37:25 np0005548788.localdomain sshd[78154]: banner exchange: Connection from 20.163.1.211 port 48576: invalid format
Dec 06 08:37:27 np0005548788.localdomain sshd[78156]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:37:29 np0005548788.localdomain sshd[78156]: Received disconnect from 152.32.172.117 port 57020:11: Bye Bye [preauth]
Dec 06 08:37:29 np0005548788.localdomain sshd[78156]: Disconnected from authenticating user root 152.32.172.117 port 57020 [preauth]
Dec 06 08:37:29 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:37:29 np0005548788.localdomain recover_tripleo_nova_virtqemud[78159]: 62021
Dec 06 08:37:29 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:37:29 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:37:29 np0005548788.localdomain systemd[1]: libpod-bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5.scope: Deactivated successfully.
Dec 06 08:37:29 np0005548788.localdomain podman[78160]: 2025-12-06 08:37:29.767810009 +0000 UTC m=+0.062416689 container died bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, url=https://www.redhat.com, container_name=nova_wait_for_compute_service, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, release=1761123044, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z)
Dec 06 08:37:29 np0005548788.localdomain systemd[1]: tmp-crun.4cHsjG.mount: Deactivated successfully.
Dec 06 08:37:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5-userdata-shm.mount: Deactivated successfully.
Dec 06 08:37:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3b38e1b3ae4684200fe2ade3176809882413b165be2193c39c12f1ac0f693972-merged.mount: Deactivated successfully.
Dec 06 08:37:29 np0005548788.localdomain podman[78160]: 2025-12-06 08:37:29.814711638 +0000 UTC m=+0.109318278 container cleanup bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_wait_for_compute_service, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, url=https://www.redhat.com, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:37:29 np0005548788.localdomain systemd[1]: libpod-conmon-bcfb4c5814087c01b7e8e077822766089f0500538f1cc7fb018f65b243f11ee5.scope: Deactivated successfully.
Dec 06 08:37:29 np0005548788.localdomain python3[76254]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=558ed7a6d0c1bb3d92c212dc57d9717b --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:37:30 np0005548788.localdomain sudo[76252]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:30 np0005548788.localdomain sudo[78211]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idbmxpyjucmjinzwthmepczyhjbpslbx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:30 np0005548788.localdomain sudo[78211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:30 np0005548788.localdomain python3[78213]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:37:30 np0005548788.localdomain sudo[78211]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:30 np0005548788.localdomain sudo[78227]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmtttilavtgbdjlmdhfrluslqysnhlhe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:30 np0005548788.localdomain sudo[78227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:30 np0005548788.localdomain python3[78229]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:37:30 np0005548788.localdomain sudo[78227]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:31 np0005548788.localdomain sudo[78288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkeqniqjadepbdovhsbjlrkblzhmvmvi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:31 np0005548788.localdomain sudo[78288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:31 np0005548788.localdomain python3[78290]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010250.7848084-118345-17806656720580/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:37:31 np0005548788.localdomain sudo[78288]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:31 np0005548788.localdomain sudo[78304]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubvkrlzpkbwoaopgbujvrearsvvngcnv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:31 np0005548788.localdomain sudo[78304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:31 np0005548788.localdomain python3[78306]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 08:37:31 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:37:31 np0005548788.localdomain systemd-sysv-generator[78337]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:37:31 np0005548788.localdomain systemd-rc-local-generator[78332]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:37:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:37:32 np0005548788.localdomain sudo[78304]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:32 np0005548788.localdomain sudo[78356]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvamfabokhlqwkpqhgvnhouqvrcysbtk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:32 np0005548788.localdomain sudo[78356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:32 np0005548788.localdomain python3[78358]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:37:32 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:37:32 np0005548788.localdomain systemd-sysv-generator[78386]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:37:32 np0005548788.localdomain systemd-rc-local-generator[78382]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:37:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:37:33 np0005548788.localdomain systemd[1]: Starting nova_compute container...
Dec 06 08:37:33 np0005548788.localdomain tripleo-start-podman-container[78398]: Creating additional drop-in dependency for "nova_compute" (56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76)
Dec 06 08:37:33 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 08:37:33 np0005548788.localdomain systemd-sysv-generator[78459]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:37:33 np0005548788.localdomain systemd-rc-local-generator[78454]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:37:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:37:34 np0005548788.localdomain systemd[1]: Started nova_compute container.
Dec 06 08:37:34 np0005548788.localdomain sudo[78356]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:34 np0005548788.localdomain sudo[78495]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fludumkvmtzessriuhlluyrymsbtrsrs ; /usr/bin/python3
Dec 06 08:37:34 np0005548788.localdomain sudo[78495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:34 np0005548788.localdomain python3[78497]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:37:34 np0005548788.localdomain sudo[78495]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:34 np0005548788.localdomain sshd[78152]: Connection closed by 20.163.1.211 port 48568 [preauth]
Dec 06 08:37:35 np0005548788.localdomain sudo[78543]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atmxesfeidyargcvqfpnohdlbfwqyqat ; /usr/bin/python3
Dec 06 08:37:35 np0005548788.localdomain sudo[78543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:35 np0005548788.localdomain sudo[78543]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:35 np0005548788.localdomain sudo[78546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:37:35 np0005548788.localdomain sudo[78546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:37:35 np0005548788.localdomain sudo[78546]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:35 np0005548788.localdomain sudo[78574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:37:35 np0005548788.localdomain sudo[78574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:37:35 np0005548788.localdomain sudo[78616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjevcwppwibbhldnmyhrrtewbrrgznlw ; /usr/bin/python3
Dec 06 08:37:35 np0005548788.localdomain sudo[78616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:35 np0005548788.localdomain sudo[78616]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:35 np0005548788.localdomain sudo[78664]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsvehuocupwbfwuugnxmftoteywbjwlk ; /usr/bin/python3
Dec 06 08:37:35 np0005548788.localdomain sudo[78664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:36 np0005548788.localdomain sudo[78574]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:36 np0005548788.localdomain python3[78668]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005548788 step=5 update_config_hash_only=False
Dec 06 08:37:36 np0005548788.localdomain sudo[78664]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:36 np0005548788.localdomain sudo[78694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaiyxkbjhnwfyajaazizyaoqwqfmpopk ; /usr/bin/python3
Dec 06 08:37:36 np0005548788.localdomain sudo[78694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:36 np0005548788.localdomain sudo[78697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:37:36 np0005548788.localdomain sudo[78697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:37:36 np0005548788.localdomain sudo[78697]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:36 np0005548788.localdomain python3[78696]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:37:36 np0005548788.localdomain sudo[78694]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:36 np0005548788.localdomain sudo[78725]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvnwwdsazzyxamqebgoqwejbvomezadt ; /usr/bin/python3
Dec 06 08:37:36 np0005548788.localdomain sudo[78725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:37 np0005548788.localdomain python3[78727]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:37:37 np0005548788.localdomain sudo[78725]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:37:43 np0005548788.localdomain podman[78730]: 2025-12-06 08:37:43.283400242 +0000 UTC m=+0.099121084 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:37:43 np0005548788.localdomain podman[78729]: 2025-12-06 08:37:43.331427565 +0000 UTC m=+0.152894775 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-collectd-container, release=1761123044, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12)
Dec 06 08:37:43 np0005548788.localdomain podman[78730]: 2025-12-06 08:37:43.342679243 +0000 UTC m=+0.158400075 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12)
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:37:43 np0005548788.localdomain podman[78729]: 2025-12-06 08:37:43.394484024 +0000 UTC m=+0.215951164 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:37:43 np0005548788.localdomain podman[78728]: 2025-12-06 08:37:43.485858857 +0000 UTC m=+0.307234564 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond)
Dec 06 08:37:43 np0005548788.localdomain podman[78728]: 2025-12-06 08:37:43.523617154 +0000 UTC m=+0.344992871 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4)
Dec 06 08:37:43 np0005548788.localdomain podman[78742]: 2025-12-06 08:37:43.534423128 +0000 UTC m=+0.345837047 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ceilometer_agent_ipmi, architecture=x86_64, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:37:43 np0005548788.localdomain podman[78742]: 2025-12-06 08:37:43.568879652 +0000 UTC m=+0.380293581 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:37:43 np0005548788.localdomain podman[78731]: 2025-12-06 08:37:43.579391668 +0000 UTC m=+0.391913941 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12)
Dec 06 08:37:43 np0005548788.localdomain podman[78731]: 2025-12-06 08:37:43.663270409 +0000 UTC m=+0.475792662 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp17/openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Dec 06 08:37:43 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:37:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:37:46 np0005548788.localdomain podman[78840]: 2025-12-06 08:37:46.26769776 +0000 UTC m=+0.087791462 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, 
architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:37:46 np0005548788.localdomain podman[78840]: 2025-12-06 08:37:46.640860129 +0000 UTC m=+0.460953861 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:37:46 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:37:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:37:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:37:48 np0005548788.localdomain podman[78865]: 2025-12-06 08:37:48.258697327 +0000 UTC m=+0.087571876 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, container_name=ovn_metadata_agent, config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:37:48 np0005548788.localdomain systemd[1]: tmp-crun.yNjnVp.mount: Deactivated successfully.
Dec 06 08:37:48 np0005548788.localdomain podman[78865]: 2025-12-06 08:37:48.319480796 +0000 UTC m=+0.148355385 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 06 08:37:48 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:37:48 np0005548788.localdomain podman[78866]: 2025-12-06 08:37:48.326385168 +0000 UTC m=+0.152164562 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:37:48 np0005548788.localdomain podman[78866]: 2025-12-06 08:37:48.410973962 +0000 UTC m=+0.236753326 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:37:48 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:37:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:37:50 np0005548788.localdomain podman[78911]: 2025-12-06 08:37:50.265312627 +0000 UTC m=+0.087728422 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=metrics_qdr, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z)
Dec 06 08:37:50 np0005548788.localdomain podman[78911]: 2025-12-06 08:37:50.484259141 +0000 UTC m=+0.306674996 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1)
Dec 06 08:37:50 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:37:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:37:53 np0005548788.localdomain podman[78942]: 2025-12-06 08:37:53.267555839 +0000 UTC m=+0.095364487 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, release=1761123044, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:37:53 np0005548788.localdomain podman[78942]: 2025-12-06 08:37:53.301854299 +0000 UTC m=+0.129662967 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=nova_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_id=tripleo_step5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z)
Dec 06 08:37:53 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:38:01 np0005548788.localdomain sshd[78968]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:38:02 np0005548788.localdomain sshd[78968]: Accepted publickey for zuul from 192.168.122.100 port 51540 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:38:02 np0005548788.localdomain systemd-logind[765]: New session 33 of user zuul.
Dec 06 08:38:02 np0005548788.localdomain systemd[1]: Started Session 33 of User zuul.
Dec 06 08:38:02 np0005548788.localdomain sshd[78968]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 08:38:02 np0005548788.localdomain sudo[79075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uszckzjribyyauyfsrocrqhgdkqutydx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765010282.2122521-40280-128088198204547/AnsiballZ_setup.py
Dec 06 08:38:02 np0005548788.localdomain sudo[79075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 08:38:03 np0005548788.localdomain python3[79077]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 08:38:05 np0005548788.localdomain sudo[79075]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:10 np0005548788.localdomain sudo[79338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myagcjpotazdnlxfpxctpkntldhucllz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765010290.0668154-40368-76894295556975/AnsiballZ_dnf.py
Dec 06 08:38:10 np0005548788.localdomain sudo[79338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 08:38:10 np0005548788.localdomain python3[79340]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Dec 06 08:38:13 np0005548788.localdomain sudo[79338]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:38:14 np0005548788.localdomain podman[79359]: 2025-12-06 08:38:14.273318211 +0000 UTC m=+0.089619589 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:38:14 np0005548788.localdomain podman[79372]: 2025-12-06 08:38:14.336063161 +0000 UTC m=+0.146062505 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:38:14 np0005548788.localdomain podman[79372]: 2025-12-06 08:38:14.368688468 +0000 UTC m=+0.178687822 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4)
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:38:14 np0005548788.localdomain podman[79358]: 2025-12-06 08:38:14.3810455 +0000 UTC m=+0.201470105 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step3, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:38:14 np0005548788.localdomain podman[79358]: 2025-12-06 08:38:14.391595916 +0000 UTC m=+0.212020571 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044)
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:38:14 np0005548788.localdomain podman[79357]: 2025-12-06 08:38:14.434795901 +0000 UTC m=+0.255681490 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:38:14 np0005548788.localdomain podman[79357]: 2025-12-06 08:38:14.472692521 +0000 UTC m=+0.293578080 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, managed_by=tripleo_ansible)
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:38:14 np0005548788.localdomain podman[79360]: 2025-12-06 08:38:14.491888565 +0000 UTC m=+0.304220250 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., 
name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Dec 06 08:38:14 np0005548788.localdomain podman[79360]: 2025-12-06 08:38:14.505758683 +0000 UTC m=+0.318090368 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, description=Red 
Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64)
Dec 06 08:38:14 np0005548788.localdomain podman[79359]: 2025-12-06 08:38:14.506171846 +0000 UTC m=+0.322473264 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, version=17.1.12, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public)
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:38:14 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:38:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:38:17 np0005548788.localdomain podman[79469]: 2025-12-06 08:38:17.244728491 +0000 UTC m=+0.070156548 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 06 08:38:17 np0005548788.localdomain podman[79469]: 2025-12-06 08:38:17.618093667 +0000 UTC m=+0.443521714 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z)
Dec 06 08:38:17 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:38:17 np0005548788.localdomain sudo[79566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqhlyhdduwhtqatbzvvpdskgnbenaqme ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765010297.3424554-40424-241872026144050/AnsiballZ_iptables.py
Dec 06 08:38:17 np0005548788.localdomain sudo[79566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 08:38:17 np0005548788.localdomain python3[79568]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Dec 06 08:38:17 np0005548788.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 06 08:38:17 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Dec 06 08:38:17 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 08:38:17 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:38:17 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:38:17 np0005548788.localdomain sudo[79566]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:38:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:38:19 np0005548788.localdomain podman[79592]: 2025-12-06 08:38:19.236242175 +0000 UTC m=+0.063225545 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:38:19 np0005548788.localdomain systemd[1]: tmp-crun.Y6aq3n.mount: Deactivated successfully.
Dec 06 08:38:19 np0005548788.localdomain podman[79591]: 2025-12-06 08:38:19.259494593 +0000 UTC m=+0.086512224 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:38:19 np0005548788.localdomain podman[79592]: 2025-12-06 08:38:19.294566877 +0000 UTC m=+0.121550227 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 06 08:38:19 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:38:19 np0005548788.localdomain podman[79591]: 2025-12-06 08:38:19.31314009 +0000 UTC m=+0.140157691 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc.)
Dec 06 08:38:19 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:38:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:38:21 np0005548788.localdomain podman[79639]: 2025-12-06 08:38:21.259381275 +0000 UTC m=+0.086871075 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:38:21 np0005548788.localdomain podman[79639]: 2025-12-06 08:38:21.486006127 +0000 UTC m=+0.313495937 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:38:21 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:38:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:38:24 np0005548788.localdomain systemd[1]: tmp-crun.UjmppL.mount: Deactivated successfully.
Dec 06 08:38:24 np0005548788.localdomain podman[79715]: 2025-12-06 08:38:24.255737855 +0000 UTC m=+0.089873277 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:38:24 np0005548788.localdomain podman[79715]: 2025-12-06 08:38:24.282173252 +0000 UTC m=+0.116308654 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, architecture=x86_64, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:38:24 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:38:36 np0005548788.localdomain sudo[79740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:38:36 np0005548788.localdomain sudo[79740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:38:36 np0005548788.localdomain sudo[79740]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:36 np0005548788.localdomain sudo[79755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:38:36 np0005548788.localdomain sudo[79755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:38:37 np0005548788.localdomain sudo[79755]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:38 np0005548788.localdomain sudo[79801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:38:38 np0005548788.localdomain sudo[79801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:38:38 np0005548788.localdomain sudo[79801]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:41 np0005548788.localdomain sshd[79816]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:38:43 np0005548788.localdomain sshd[79816]: Received disconnect from 152.32.172.117 port 34658:11: Bye Bye [preauth]
Dec 06 08:38:43 np0005548788.localdomain sshd[79816]: Disconnected from authenticating user root 152.32.172.117 port 34658 [preauth]
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:38:45 np0005548788.localdomain podman[79819]: 2025-12-06 08:38:45.285121388 +0000 UTC m=+0.098730941 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:38:45 np0005548788.localdomain podman[79819]: 2025-12-06 08:38:45.334224156 +0000 UTC m=+0.147833769 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 06 08:38:45 np0005548788.localdomain podman[79818]: 2025-12-06 08:38:45.330913044 +0000 UTC m=+0.144693592 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, build-date=2025-11-18T22:49:32Z, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, 
url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044)
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:38:45 np0005548788.localdomain podman[79820]: 2025-12-06 08:38:45.395596852 +0000 UTC m=+0.203477488 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 08:38:45 np0005548788.localdomain podman[79820]: 2025-12-06 08:38:45.43275023 +0000 UTC m=+0.240630906 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, 
com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Dec 06 08:38:45 np0005548788.localdomain podman[79832]: 2025-12-06 08:38:45.451663725 +0000 UTC m=+0.253716051 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:38:45 np0005548788.localdomain podman[79821]: 2025-12-06 08:38:45.501598068 +0000 UTC m=+0.308579467 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red 
Hat, Inc., name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:38:45 np0005548788.localdomain podman[79832]: 2025-12-06 08:38:45.511112911 +0000 UTC m=+0.313165207 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git)
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:38:45 np0005548788.localdomain podman[79818]: 2025-12-06 08:38:45.523296397 +0000 UTC m=+0.337076905 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., vcs-type=git)
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:38:45 np0005548788.localdomain podman[79821]: 2025-12-06 08:38:45.540623384 +0000 UTC m=+0.347604763 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1)
Dec 06 08:38:45 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:38:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:38:48 np0005548788.localdomain systemd[1]: tmp-crun.R49WlW.mount: Deactivated successfully.
Dec 06 08:38:48 np0005548788.localdomain podman[79929]: 2025-12-06 08:38:48.272640257 +0000 UTC m=+0.096893765 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:38:48 np0005548788.localdomain podman[79929]: 2025-12-06 08:38:48.619790113 +0000 UTC m=+0.444043621 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 08:38:48 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:38:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:38:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:38:50 np0005548788.localdomain podman[79953]: 2025-12-06 08:38:50.260715603 +0000 UTC m=+0.085774060 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z)
Dec 06 08:38:50 np0005548788.localdomain podman[79953]: 2025-12-06 08:38:50.291755223 +0000 UTC m=+0.116813680 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:38:50 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:38:50 np0005548788.localdomain podman[79952]: 2025-12-06 08:38:50.313055611 +0000 UTC m=+0.141743980 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vcs-type=git)
Dec 06 08:38:50 np0005548788.localdomain podman[79952]: 2025-12-06 08:38:50.393716803 +0000 UTC m=+0.222405112 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 08:38:50 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:38:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:38:52 np0005548788.localdomain systemd[1]: tmp-crun.ZUlDEI.mount: Deactivated successfully.
Dec 06 08:38:52 np0005548788.localdomain podman[80000]: 2025-12-06 08:38:52.26591644 +0000 UTC m=+0.090416814 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:38:52 np0005548788.localdomain podman[80000]: 2025-12-06 08:38:52.470181492 +0000 UTC m=+0.294681836 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, 
io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:38:52 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:38:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:38:55 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:38:55 np0005548788.localdomain recover_tripleo_nova_virtqemud[80029]: 62021
Dec 06 08:38:55 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:38:55 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:38:55 np0005548788.localdomain podman[80027]: 2025-12-06 08:38:55.253265324 +0000 UTC m=+0.084266445 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:38:55 np0005548788.localdomain podman[80027]: 2025-12-06 08:38:55.284671454 +0000 UTC m=+0.115672555 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:38:55 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:39:16 np0005548788.localdomain podman[80056]: 2025-12-06 08:39:16.261959288 +0000 UTC m=+0.087174694 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_id=tripleo_step3, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:51:28Z, container_name=collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:39:16 np0005548788.localdomain podman[80056]: 2025-12-06 08:39:16.275534988 +0000 UTC m=+0.100750404 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:39:16 np0005548788.localdomain podman[80055]: 2025-12-06 08:39:16.374463174 +0000 UTC m=+0.201704213 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, container_name=logrotate_crond, batch=17.1_20251118.1)
Dec 06 08:39:16 np0005548788.localdomain podman[80055]: 2025-12-06 08:39:16.381557024 +0000 UTC m=+0.208798143 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:39:16 np0005548788.localdomain podman[80064]: 2025-12-06 08:39:16.423963164 +0000 UTC m=+0.239271245 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 08:39:16 np0005548788.localdomain podman[80058]: 2025-12-06 08:39:16.335731167 +0000 UTC m=+0.154057871 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Dec 06 08:39:16 np0005548788.localdomain podman[80058]: 2025-12-06 08:39:16.46947081 +0000 UTC m=+0.287797454 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=)
Dec 06 08:39:16 np0005548788.localdomain podman[80064]: 2025-12-06 08:39:16.477731205 +0000 UTC m=+0.293039226 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, architecture=x86_64)
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:39:16 np0005548788.localdomain podman[80057]: 2025-12-06 08:39:16.471168102 +0000 UTC m=+0.292359394 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:39:16 np0005548788.localdomain podman[80057]: 2025-12-06 08:39:16.554686782 +0000 UTC m=+0.375878104 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:39:16 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:39:17 np0005548788.localdomain sshd[78971]: Received disconnect from 192.168.122.100 port 51540:11: disconnected by user
Dec 06 08:39:17 np0005548788.localdomain sshd[78971]: Disconnected from user zuul 192.168.122.100 port 51540
Dec 06 08:39:17 np0005548788.localdomain sshd[78968]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:39:17 np0005548788.localdomain systemd[1]: session-33.scope: Deactivated successfully.
Dec 06 08:39:17 np0005548788.localdomain systemd[1]: session-33.scope: Consumed 5.817s CPU time.
Dec 06 08:39:17 np0005548788.localdomain systemd-logind[765]: Session 33 logged out. Waiting for processes to exit.
Dec 06 08:39:17 np0005548788.localdomain systemd-logind[765]: Removed session 33.
Dec 06 08:39:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:39:19 np0005548788.localdomain podman[80169]: 2025-12-06 08:39:19.245622236 +0000 UTC m=+0.076503094 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git)
Dec 06 08:39:19 np0005548788.localdomain podman[80169]: 2025-12-06 08:39:19.620682195 +0000 UTC m=+0.451563013 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 
17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z)
Dec 06 08:39:19 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:39:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:39:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:39:21 np0005548788.localdomain systemd[1]: tmp-crun.c4pJvw.mount: Deactivated successfully.
Dec 06 08:39:21 np0005548788.localdomain podman[80193]: 2025-12-06 08:39:21.27504227 +0000 UTC m=+0.098772862 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:39:21 np0005548788.localdomain podman[80193]: 2025-12-06 08:39:21.326067058 +0000 UTC m=+0.149797660 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, build-date=2025-11-18T23:34:05Z, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:39:21 np0005548788.localdomain podman[80192]: 2025-12-06 08:39:21.372071249 +0000 UTC m=+0.196395869 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 06 08:39:21 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:39:21 np0005548788.localdomain podman[80192]: 2025-12-06 08:39:21.4258271 +0000 UTC m=+0.250151680 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, release=1761123044, vcs-type=git)
Dec 06 08:39:21 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:39:22 np0005548788.localdomain systemd[1]: tmp-crun.KCYASa.mount: Deactivated successfully.
Dec 06 08:39:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:39:23 np0005548788.localdomain podman[80283]: 2025-12-06 08:39:23.26545091 +0000 UTC m=+0.092935602 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:39:23 np0005548788.localdomain podman[80283]: 2025-12-06 08:39:23.48974267 +0000 UTC m=+0.317227322 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, 
io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 08:39:23 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:39:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:39:26 np0005548788.localdomain podman[80312]: 2025-12-06 08:39:26.247577381 +0000 UTC m=+0.074890864 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=)
Dec 06 08:39:26 np0005548788.localdomain podman[80312]: 2025-12-06 08:39:26.280099796 +0000 UTC m=+0.107413299 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:39:26 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:39:26 np0005548788.localdomain sshd[80338]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:39:26 np0005548788.localdomain sshd[80338]: Accepted publickey for zuul from 38.102.83.114 port 37688 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:39:26 np0005548788.localdomain systemd-logind[765]: New session 34 of user zuul.
Dec 06 08:39:26 np0005548788.localdomain systemd[1]: Started Session 34 of User zuul.
Dec 06 08:39:26 np0005548788.localdomain sshd[80338]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 08:39:26 np0005548788.localdomain sudo[80355]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpjbyqlabvtceiqvpjsbrtdqwmtikkql ; /usr/bin/python3
Dec 06 08:39:26 np0005548788.localdomain sudo[80355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:39:26 np0005548788.localdomain python3[80357]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:39:29 np0005548788.localdomain sudo[80355]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:38 np0005548788.localdomain sudo[80359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:39:38 np0005548788.localdomain sudo[80359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:39:38 np0005548788.localdomain sudo[80359]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:38 np0005548788.localdomain sudo[80374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:39:38 np0005548788.localdomain sudo[80374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:39:39 np0005548788.localdomain sudo[80374]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:41 np0005548788.localdomain sudo[80422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:39:41 np0005548788.localdomain sudo[80422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:39:41 np0005548788.localdomain sudo[80422]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:39:47 np0005548788.localdomain podman[80437]: 2025-12-06 08:39:47.271119483 +0000 UTC m=+0.094825341 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:39:47 np0005548788.localdomain podman[80437]: 2025-12-06 08:39:47.2826497 +0000 UTC m=+0.106355548 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:39:47 np0005548788.localdomain podman[80439]: 2025-12-06 08:39:47.33282411 +0000 UTC m=+0.151868003 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:11:48Z, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:39:47 np0005548788.localdomain podman[80441]: 2025-12-06 08:39:47.384360583 +0000 UTC m=+0.198266587 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z)
Dec 06 08:39:47 np0005548788.localdomain podman[80439]: 2025-12-06 08:39:47.389518581 +0000 UTC m=+0.208562534 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute)
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:39:47 np0005548788.localdomain podman[80441]: 2025-12-06 08:39:47.418606771 +0000 UTC m=+0.232512795 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:39:47 np0005548788.localdomain podman[80440]: 2025-12-06 08:39:47.476593353 +0000 UTC m=+0.294516682 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid)
Dec 06 08:39:47 np0005548788.localdomain podman[80438]: 2025-12-06 08:39:47.447280727 +0000 UTC m=+0.268926201 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:39:47 np0005548788.localdomain podman[80440]: 2025-12-06 08:39:47.511045597 +0000 UTC m=+0.328968956 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, container_name=iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:39:47 np0005548788.localdomain podman[80438]: 2025-12-06 08:39:47.526307718 +0000 UTC m=+0.347953242 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 06 08:39:47 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:39:50 np0005548788.localdomain sshd[80544]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:39:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:39:50 np0005548788.localdomain podman[80546]: 2025-12-06 08:39:50.275258845 +0000 UTC m=+0.101505498 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:39:50 np0005548788.localdomain podman[80546]: 2025-12-06 08:39:50.658532467 +0000 UTC m=+0.484779060 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 
nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:39:50 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:39:51 np0005548788.localdomain sshd[80544]: Received disconnect from 152.32.172.117 port 48082:11: Bye Bye [preauth]
Dec 06 08:39:51 np0005548788.localdomain sshd[80544]: Disconnected from authenticating user root 152.32.172.117 port 48082 [preauth]
Dec 06 08:39:51 np0005548788.localdomain sudo[80583]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gocmgnedmhtcfsdqttqoyezuhnvxvich ; /usr/bin/python3
Dec 06 08:39:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:39:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:39:51 np0005548788.localdomain sudo[80583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:39:51 np0005548788.localdomain podman[80585]: 2025-12-06 08:39:51.677353576 +0000 UTC m=+0.095056348 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent)
Dec 06 08:39:51 np0005548788.localdomain podman[80586]: 2025-12-06 08:39:51.728079473 +0000 UTC m=+0.143532055 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ovn_controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:39:51 np0005548788.localdomain podman[80586]: 2025-12-06 08:39:51.780603346 +0000 UTC m=+0.196055948 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, architecture=x86_64, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:39:51 np0005548788.localdomain podman[80585]: 2025-12-06 08:39:51.781012759 +0000 UTC m=+0.198715531 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, vendor=Red Hat, Inc.)
Dec 06 08:39:51 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:39:51 np0005548788.localdomain python3[80587]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:39:51 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:39:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:39:54 np0005548788.localdomain podman[80633]: 2025-12-06 08:39:54.271938713 +0000 UTC m=+0.093366096 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible)
Dec 06 08:39:54 np0005548788.localdomain podman[80633]: 2025-12-06 08:39:54.469822807 +0000 UTC m=+0.291250240 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:39:54 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:39:55 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:39:55 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:39:55 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:39:56 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:39:56 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:39:56 np0005548788.localdomain systemd[1]: run-r829304dfd05d4d2a8f2ebd047b90f8af.service: Deactivated successfully.
Dec 06 08:39:56 np0005548788.localdomain systemd[1]: run-re4647d0a50b44f549bf1e8e275e8078c.service: Deactivated successfully.
Dec 06 08:39:56 np0005548788.localdomain sudo[80583]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:39:57 np0005548788.localdomain podman[80813]: 2025-12-06 08:39:57.241086373 +0000 UTC m=+0.072001575 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_compute, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:39:57 np0005548788.localdomain podman[80813]: 2025-12-06 08:39:57.274731273 +0000 UTC m=+0.105646425 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12)
Dec 06 08:39:57 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:40:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:40:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4516 writes, 20K keys, 4516 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4516 writes, 510 syncs, 8.85 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:40:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:40:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.2 total, 600.0 interval
                                                          Cumulative writes: 5111 writes, 22K keys, 5111 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5111 writes, 587 syncs, 8.71 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: tmp-crun.ZOr71U.mount: Deactivated successfully.
Dec 06 08:40:18 np0005548788.localdomain podman[80848]: 2025-12-06 08:40:18.302614338 +0000 UTC m=+0.120292428 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:40:18 np0005548788.localdomain podman[80841]: 2025-12-06 08:40:18.349398914 +0000 UTC m=+0.173719079 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:40:18 np0005548788.localdomain podman[80848]: 2025-12-06 08:40:18.359729773 +0000 UTC m=+0.177407833 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:40:18 np0005548788.localdomain podman[80840]: 2025-12-06 08:40:18.264940434 +0000 UTC m=+0.092307963 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, release=1761123044, container_name=collectd)
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:40:18 np0005548788.localdomain podman[80840]: 2025-12-06 08:40:18.402694551 +0000 UTC m=+0.230062120 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, version=17.1.12, name=rhosp17/openstack-collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:40:18 np0005548788.localdomain podman[80839]: 2025-12-06 08:40:18.419625524 +0000 UTC m=+0.243841625 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:40:18 np0005548788.localdomain podman[80841]: 2025-12-06 08:40:18.457498294 +0000 UTC m=+0.281818449 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:40:18 np0005548788.localdomain podman[80842]: 2025-12-06 08:40:18.264208902 +0000 UTC m=+0.081187009 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_id=tripleo_step3)
Dec 06 08:40:18 np0005548788.localdomain podman[80842]: 2025-12-06 08:40:18.500560235 +0000 UTC m=+0.317538352 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, release=1761123044, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=)
Dec 06 08:40:18 np0005548788.localdomain podman[80839]: 2025-12-06 08:40:18.514692491 +0000 UTC m=+0.338908562 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:40:18 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:40:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:40:21 np0005548788.localdomain podman[80945]: 2025-12-06 08:40:21.255584978 +0000 UTC m=+0.083320084 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:40:21 np0005548788.localdomain podman[80945]: 2025-12-06 08:40:21.62264213 +0000 UTC m=+0.450377226 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, 
io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:40:21 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:40:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:40:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:40:22 np0005548788.localdomain podman[80969]: 2025-12-06 08:40:22.253650647 +0000 UTC m=+0.081139938 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:40:22 np0005548788.localdomain podman[80969]: 2025-12-06 08:40:22.279410683 +0000 UTC m=+0.106900004 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ovn-controller-container, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:40:22 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:40:22 np0005548788.localdomain systemd[1]: tmp-crun.UIisVZ.mount: Deactivated successfully.
Dec 06 08:40:22 np0005548788.localdomain podman[80968]: 2025-12-06 08:40:22.378060031 +0000 UTC m=+0.203161418 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 08:40:22 np0005548788.localdomain podman[80968]: 2025-12-06 08:40:22.427707185 +0000 UTC m=+0.252808502 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:40:22 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:40:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:40:25 np0005548788.localdomain podman[81062]: 2025-12-06 08:40:25.263162594 +0000 UTC m=+0.090578010 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Dec 06 08:40:25 np0005548788.localdomain podman[81062]: 2025-12-06 08:40:25.454717653 +0000 UTC m=+0.282133068 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:40:25 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:40:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:40:28 np0005548788.localdomain podman[81091]: 2025-12-06 08:40:28.262295992 +0000 UTC m=+0.088434324 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 06 08:40:28 np0005548788.localdomain podman[81091]: 2025-12-06 08:40:28.292623158 +0000 UTC m=+0.118761500 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64)
Dec 06 08:40:28 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:40:39 np0005548788.localdomain sudo[81130]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuuscigvyffiaijaotnxaehnasruqyde ; /usr/bin/python3
Dec 06 08:40:39 np0005548788.localdomain sudo[81130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:40:39 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:40:39 np0005548788.localdomain recover_tripleo_nova_virtqemud[81134]: 62021
Dec 06 08:40:39 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:40:39 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:40:39 np0005548788.localdomain python3[81132]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:40:42 np0005548788.localdomain sudo[81256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:40:42 np0005548788.localdomain sudo[81256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:42 np0005548788.localdomain sudo[81256]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:42 np0005548788.localdomain sudo[81271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:40:42 np0005548788.localdomain sudo[81271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:42 np0005548788.localdomain sudo[81271]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:42 np0005548788.localdomain sudo[81307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:40:42 np0005548788.localdomain sudo[81307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:42 np0005548788.localdomain sudo[81307]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:42 np0005548788.localdomain rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 08:40:42 np0005548788.localdomain sudo[81322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:40:42 np0005548788.localdomain sudo[81322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:42 np0005548788.localdomain rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 08:40:43 np0005548788.localdomain sudo[81322]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:43 np0005548788.localdomain sudo[81375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:40:43 np0005548788.localdomain sudo[81375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:43 np0005548788.localdomain sudo[81375]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:40:49 np0005548788.localdomain podman[81451]: 2025-12-06 08:40:49.289705023 +0000 UTC m=+0.104521971 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4)
Dec 06 08:40:49 np0005548788.localdomain podman[81451]: 2025-12-06 08:40:49.326108988 +0000 UTC m=+0.140925916 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: tmp-crun.5XTBCC.mount: Deactivated successfully.
Dec 06 08:40:49 np0005548788.localdomain podman[81452]: 2025-12-06 08:40:49.337155269 +0000 UTC m=+0.152159853 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:40:49 np0005548788.localdomain podman[81452]: 2025-12-06 08:40:49.375704329 +0000 UTC m=+0.190708893 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, url=https://www.redhat.com)
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:40:49 np0005548788.localdomain podman[81455]: 2025-12-06 08:40:49.380279391 +0000 UTC m=+0.186989609 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, managed_by=tripleo_ansible)
Dec 06 08:40:49 np0005548788.localdomain podman[81453]: 2025-12-06 08:40:49.434568518 +0000 UTC m=+0.246137136 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:40:49 np0005548788.localdomain podman[81453]: 2025-12-06 08:40:49.468503587 +0000 UTC m=+0.280072215 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:40:49 np0005548788.localdomain podman[81454]: 2025-12-06 08:40:49.48023458 +0000 UTC m=+0.289476886 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid)
Dec 06 08:40:49 np0005548788.localdomain podman[81454]: 2025-12-06 08:40:49.493484628 +0000 UTC m=+0.302726954 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid)
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:40:49 np0005548788.localdomain podman[81455]: 2025-12-06 08:40:49.510925658 +0000 UTC m=+0.317635846 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, release=1761123044, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:40:49 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:40:51 np0005548788.localdomain sudo[81130]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:40:52 np0005548788.localdomain podman[81563]: 2025-12-06 08:40:52.242364433 +0000 UTC m=+0.070605993 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:40:52 np0005548788.localdomain podman[81563]: 2025-12-06 08:40:52.62159429 +0000 UTC m=+0.449835860 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true)
Dec 06 08:40:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:40:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:40:52 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:40:52 np0005548788.localdomain podman[81586]: 2025-12-06 08:40:52.737753348 +0000 UTC m=+0.080145166 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Dec 06 08:40:52 np0005548788.localdomain systemd[1]: tmp-crun.lUK8IU.mount: Deactivated successfully.
Dec 06 08:40:52 np0005548788.localdomain podman[81587]: 2025-12-06 08:40:52.762603566 +0000 UTC m=+0.100563498 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, 
name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:40:52 np0005548788.localdomain podman[81587]: 2025-12-06 08:40:52.808747243 +0000 UTC m=+0.146707185 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4)
Dec 06 08:40:52 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:40:52 np0005548788.localdomain podman[81586]: 2025-12-06 08:40:52.865487846 +0000 UTC m=+0.207879714 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:40:52 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:40:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:40:56 np0005548788.localdomain podman[81633]: 2025-12-06 08:40:56.259235575 +0000 UTC m=+0.085244075 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, version=17.1.12)
Dec 06 08:40:56 np0005548788.localdomain podman[81633]: 2025-12-06 08:40:56.442542859 +0000 UTC m=+0.268551359 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:40:56 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:40:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:40:59 np0005548788.localdomain systemd[1]: tmp-crun.y7OzkN.mount: Deactivated successfully.
Dec 06 08:40:59 np0005548788.localdomain podman[81662]: 2025-12-06 08:40:59.259182166 +0000 UTC m=+0.088131513 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:40:59 np0005548788.localdomain podman[81662]: 2025-12-06 08:40:59.312270427 +0000 UTC m=+0.141219784 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:40:59 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:40:59 np0005548788.localdomain sshd[81688]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:41:01 np0005548788.localdomain sshd[81688]: Received disconnect from 152.32.172.117 port 58510:11: Bye Bye [preauth]
Dec 06 08:41:01 np0005548788.localdomain sshd[81688]: Disconnected from authenticating user root 152.32.172.117 port 58510 [preauth]
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: tmp-crun.4ibaCb.mount: Deactivated successfully.
Dec 06 08:41:20 np0005548788.localdomain podman[81690]: 2025-12-06 08:41:20.28497635 +0000 UTC m=+0.105740138 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64)
Dec 06 08:41:20 np0005548788.localdomain podman[81690]: 2025-12-06 08:41:20.291855052 +0000 UTC m=+0.112618820 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Dec 06 08:41:20 np0005548788.localdomain podman[81693]: 2025-12-06 08:41:20.303150851 +0000 UTC m=+0.114544130 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, summary=Red Hat 
OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:41:20 np0005548788.localdomain podman[81693]: 2025-12-06 08:41:20.30956859 +0000 UTC m=+0.120961879 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, 
build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, tcib_managed=true, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:41:20 np0005548788.localdomain podman[81692]: 2025-12-06 08:41:20.411144708 +0000 UTC m=+0.226307663 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:41:20 np0005548788.localdomain podman[81699]: 2025-12-06 08:41:20.452491896 +0000 UTC m=+0.258933132 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4)
Dec 06 08:41:20 np0005548788.localdomain podman[81692]: 2025-12-06 08:41:20.470971446 +0000 UTC m=+0.286134411 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:41:20 np0005548788.localdomain podman[81699]: 2025-12-06 08:41:20.504124631 +0000 UTC m=+0.310565847 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:41:20 np0005548788.localdomain podman[81691]: 2025-12-06 08:41:20.427609757 +0000 UTC m=+0.245228208 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 08:41:20 np0005548788.localdomain podman[81691]: 2025-12-06 08:41:20.562518545 +0000 UTC m=+0.380136946 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z)
Dec 06 08:41:20 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:41:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:41:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:41:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:41:23 np0005548788.localdomain podman[81800]: 2025-12-06 08:41:23.253801611 +0000 UTC m=+0.080113637 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vcs-type=git, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:41:23 np0005548788.localdomain podman[81800]: 2025-12-06 08:41:23.302164495 +0000 UTC m=+0.128476591 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12)
Dec 06 08:41:23 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:41:23 np0005548788.localdomain podman[81801]: 2025-12-06 08:41:23.30916623 +0000 UTC m=+0.130422270 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, 
vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:41:23 np0005548788.localdomain podman[81802]: 2025-12-06 08:41:23.367310957 +0000 UTC m=+0.184667157 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, container_name=ovn_controller, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:41:23 np0005548788.localdomain podman[81802]: 2025-12-06 08:41:23.387733418 +0000 UTC m=+0.205089608 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, release=1761123044, container_name=ovn_controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.)
Dec 06 08:41:23 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:41:23 np0005548788.localdomain podman[81801]: 2025-12-06 08:41:23.702114332 +0000 UTC m=+0.523370422 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 06 08:41:23 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:41:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:41:27 np0005548788.localdomain podman[81915]: 2025-12-06 08:41:27.27574376 +0000 UTC m=+0.093256892 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible)
Dec 06 08:41:27 np0005548788.localdomain podman[81915]: 2025-12-06 08:41:27.476784742 +0000 UTC m=+0.294297874 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:41:27 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:41:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:41:30 np0005548788.localdomain systemd[1]: tmp-crun.BxDqLa.mount: Deactivated successfully.
Dec 06 08:41:30 np0005548788.localdomain podman[81946]: 2025-12-06 08:41:30.256913831 +0000 UTC m=+0.085456760 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_compute, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step5, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true)
Dec 06 08:41:30 np0005548788.localdomain podman[81946]: 2025-12-06 08:41:30.31738053 +0000 UTC m=+0.145923489 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 08:41:30 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:41:44 np0005548788.localdomain sudo[81972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:41:44 np0005548788.localdomain sudo[81972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:41:44 np0005548788.localdomain sudo[81972]: pam_unix(sudo:session): session closed for user root
Dec 06 08:41:44 np0005548788.localdomain sudo[81987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:41:44 np0005548788.localdomain sudo[81987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:41:44 np0005548788.localdomain sudo[81987]: pam_unix(sudo:session): session closed for user root
Dec 06 08:41:45 np0005548788.localdomain sudo[82035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:41:45 np0005548788.localdomain sudo[82035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:41:45 np0005548788.localdomain sudo[82035]: pam_unix(sudo:session): session closed for user root
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: tmp-crun.dPtIb2.mount: Deactivated successfully.
Dec 06 08:41:51 np0005548788.localdomain podman[82051]: 2025-12-06 08:41:51.275022446 +0000 UTC m=+0.098892472 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, container_name=collectd, config_id=tripleo_step3, release=1761123044, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: tmp-crun.0PLj2b.mount: Deactivated successfully.
Dec 06 08:41:51 np0005548788.localdomain podman[82051]: 2025-12-06 08:41:51.314559314 +0000 UTC m=+0.138429290 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, container_name=collectd, maintainer=OpenStack TripleO Team)
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:41:51 np0005548788.localdomain podman[82053]: 2025-12-06 08:41:51.36977623 +0000 UTC m=+0.186327728 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, container_name=iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.expose-services=)
Dec 06 08:41:51 np0005548788.localdomain podman[82050]: 2025-12-06 08:41:51.32212876 +0000 UTC m=+0.145447809 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:41:51 np0005548788.localdomain podman[82052]: 2025-12-06 08:41:51.421065372 +0000 UTC m=+0.239153859 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-type=git, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Dec 06 08:41:51 np0005548788.localdomain podman[82052]: 2025-12-06 08:41:51.449558128 +0000 UTC m=+0.267646555 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:41:51 np0005548788.localdomain podman[82066]: 2025-12-06 08:41:51.298742304 +0000 UTC m=+0.108701398 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:41:51 np0005548788.localdomain podman[82053]: 2025-12-06 08:41:51.475992739 +0000 UTC m=+0.292544217 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:41:51 np0005548788.localdomain podman[82066]: 2025-12-06 08:41:51.529003925 +0000 UTC m=+0.338962959 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:41:51 np0005548788.localdomain podman[82050]: 2025-12-06 08:41:51.553250129 +0000 UTC m=+0.376569218 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=logrotate_crond, io.buildah.version=1.41.4)
Dec 06 08:41:51 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:41:52 np0005548788.localdomain sshd[80341]: Received disconnect from 38.102.83.114 port 37688:11: disconnected by user
Dec 06 08:41:52 np0005548788.localdomain sshd[80341]: Disconnected from user zuul 38.102.83.114 port 37688
Dec 06 08:41:52 np0005548788.localdomain sshd[80338]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:41:52 np0005548788.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Dec 06 08:41:52 np0005548788.localdomain systemd[1]: session-34.scope: Consumed 13.978s CPU time.
Dec 06 08:41:52 np0005548788.localdomain systemd-logind[765]: Session 34 logged out. Waiting for processes to exit.
Dec 06 08:41:52 np0005548788.localdomain systemd-logind[765]: Removed session 34.
Dec 06 08:41:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:41:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:41:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:41:54 np0005548788.localdomain podman[82163]: 2025-12-06 08:41:54.263338526 +0000 UTC m=+0.085174176 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:41:54 np0005548788.localdomain podman[82162]: 2025-12-06 08:41:54.310011096 +0000 UTC m=+0.134648833 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc.)
Dec 06 08:41:54 np0005548788.localdomain podman[82164]: 2025-12-06 08:41:54.366535802 +0000 UTC m=+0.184221503 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ovn_controller, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-ovn-controller, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:41:54 np0005548788.localdomain podman[82162]: 2025-12-06 08:41:54.387684418 +0000 UTC m=+0.212322185 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12)
Dec 06 08:41:54 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:41:54 np0005548788.localdomain podman[82164]: 2025-12-06 08:41:54.418627929 +0000 UTC m=+0.236313600 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12)
Dec 06 08:41:54 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:41:54 np0005548788.localdomain podman[82163]: 2025-12-06 08:41:54.638424497 +0000 UTC m=+0.460260127 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:41:54 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:41:55 np0005548788.localdomain sshd[82231]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:41:55 np0005548788.localdomain sshd[82231]: Accepted publickey for zuul from 38.102.83.114 port 49840 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:41:55 np0005548788.localdomain systemd-logind[765]: New session 35 of user zuul.
Dec 06 08:41:55 np0005548788.localdomain systemd[1]: Started Session 35 of User zuul.
Dec 06 08:41:55 np0005548788.localdomain sshd[82231]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 08:41:55 np0005548788.localdomain sudo[82248]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmhyqdzmbduncthnrmvbfbtomslljiwp ; /usr/bin/python3
Dec 06 08:41:55 np0005548788.localdomain sudo[82248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:41:55 np0005548788.localdomain python3[82250]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:41:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:41:58 np0005548788.localdomain systemd[1]: tmp-crun.faGzbw.mount: Deactivated successfully.
Dec 06 08:41:58 np0005548788.localdomain podman[82254]: 2025-12-06 08:41:58.288574043 +0000 UTC m=+0.112775163 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:41:58 np0005548788.localdomain podman[82254]: 2025-12-06 08:41:58.455616511 +0000 UTC m=+0.279817641 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:41:58 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:41:59 np0005548788.localdomain rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 08:42:00 np0005548788.localdomain rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 08:42:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:42:01 np0005548788.localdomain systemd[1]: tmp-crun.S3htDE.mount: Deactivated successfully.
Dec 06 08:42:01 np0005548788.localdomain podman[82409]: 2025-12-06 08:42:01.252786713 +0000 UTC m=+0.081233824 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:42:01 np0005548788.localdomain podman[82409]: 2025-12-06 08:42:01.28133426 +0000 UTC m=+0.109781311 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:42:01 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:42:04 np0005548788.localdomain sudo[82248]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:11 np0005548788.localdomain sshd[82494]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:42:12 np0005548788.localdomain sshd[82494]: Received disconnect from 152.32.172.117 port 60160:11: Bye Bye [preauth]
Dec 06 08:42:12 np0005548788.localdomain sshd[82494]: Disconnected from authenticating user root 152.32.172.117 port 60160 [preauth]
Dec 06 08:42:21 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:42:21 np0005548788.localdomain recover_tripleo_nova_virtqemud[82497]: 62021
Dec 06 08:42:21 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:42:21 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: tmp-crun.qh6mLh.mount: Deactivated successfully.
Dec 06 08:42:22 np0005548788.localdomain podman[82501]: 2025-12-06 08:42:22.302833802 +0000 UTC m=+0.110936447 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=iscsid, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: tmp-crun.z5Vj39.mount: Deactivated successfully.
Dec 06 08:42:22 np0005548788.localdomain podman[82501]: 2025-12-06 08:42:22.341013349 +0000 UTC m=+0.149115994 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:42:22 np0005548788.localdomain podman[82500]: 2025-12-06 08:42:22.385934064 +0000 UTC m=+0.202929715 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, batch=17.1_20251118.1)
Dec 06 08:42:22 np0005548788.localdomain podman[82498]: 2025-12-06 08:42:22.34204296 +0000 UTC m=+0.160775484 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, 
summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-cron-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12)
Dec 06 08:42:22 np0005548788.localdomain podman[82512]: 2025-12-06 08:42:22.436877056 +0000 UTC m=+0.242946417 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Dec 06 08:42:22 np0005548788.localdomain podman[82500]: 2025-12-06 08:42:22.443641686 +0000 UTC m=+0.260637317 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, 
container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:42:22 np0005548788.localdomain podman[82498]: 2025-12-06 08:42:22.472914305 +0000 UTC m=+0.291646829 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z)
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:42:22 np0005548788.localdomain podman[82499]: 2025-12-06 08:42:22.48337629 +0000 UTC m=+0.301618649 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=collectd, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Dec 06 08:42:22 np0005548788.localdomain podman[82499]: 2025-12-06 08:42:22.515613101 +0000 UTC m=+0.333855480 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:42:22 np0005548788.localdomain podman[82512]: 2025-12-06 08:42:22.53586192 +0000 UTC m=+0.341931311 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Dec 06 08:42:22 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:42:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:42:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:42:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:42:25 np0005548788.localdomain systemd[1]: tmp-crun.4oycZQ.mount: Deactivated successfully.
Dec 06 08:42:25 np0005548788.localdomain podman[82633]: 2025-12-06 08:42:25.269970344 +0000 UTC m=+0.096367234 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z)
Dec 06 08:42:25 np0005548788.localdomain podman[82632]: 2025-12-06 08:42:25.371692764 +0000 UTC m=+0.198383603 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:42:25 np0005548788.localdomain podman[82634]: 2025-12-06 08:42:25.337642186 +0000 UTC m=+0.160760544 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, url=https://www.redhat.com, tcib_managed=true, 
name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:42:25 np0005548788.localdomain podman[82632]: 2025-12-06 08:42:25.409756176 +0000 UTC m=+0.236447005 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=)
Dec 06 08:42:25 np0005548788.localdomain podman[82634]: 2025-12-06 08:42:25.416905968 +0000 UTC m=+0.240024296 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Dec 06 08:42:25 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:42:25 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:42:25 np0005548788.localdomain podman[82633]: 2025-12-06 08:42:25.612461762 +0000 UTC m=+0.438858652 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=)
Dec 06 08:42:25 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:42:26 np0005548788.localdomain systemd[1]: tmp-crun.2a8EIP.mount: Deactivated successfully.
Dec 06 08:42:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:42:29 np0005548788.localdomain podman[82723]: 2025-12-06 08:42:29.284956443 +0000 UTC m=+0.101774402 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 08:42:29 np0005548788.localdomain podman[82723]: 2025-12-06 08:42:29.476759241 +0000 UTC m=+0.293577240 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, 
name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git)
Dec 06 08:42:29 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:42:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:42:32 np0005548788.localdomain podman[82752]: 2025-12-06 08:42:32.256190642 +0000 UTC m=+0.083808224 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044)
Dec 06 08:42:32 np0005548788.localdomain podman[82752]: 2025-12-06 08:42:32.314690149 +0000 UTC m=+0.142307711 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute)
Dec 06 08:42:32 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:42:33 np0005548788.localdomain python3[82791]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Dec 06 08:42:45 np0005548788.localdomain sudo[82792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:42:45 np0005548788.localdomain sudo[82792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:45 np0005548788.localdomain sudo[82792]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:45 np0005548788.localdomain sudo[82807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:42:45 np0005548788.localdomain sudo[82807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:46 np0005548788.localdomain podman[82894]: 2025-12-06 08:42:46.778670411 +0000 UTC m=+0.096237590 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, release=1763362218, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 06 08:42:46 np0005548788.localdomain podman[82894]: 2025-12-06 08:42:46.886645615 +0000 UTC m=+0.204212834 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_BRANCH=main, distribution-scope=public, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:42:47 np0005548788.localdomain sudo[82807]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:47 np0005548788.localdomain sudo[82960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:42:47 np0005548788.localdomain sudo[82960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:47 np0005548788.localdomain sudo[82960]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:47 np0005548788.localdomain sudo[82975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:42:47 np0005548788.localdomain sudo[82975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:48 np0005548788.localdomain sudo[82975]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:48 np0005548788.localdomain sudo[83023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:42:48 np0005548788.localdomain sudo[83023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:48 np0005548788.localdomain sudo[83023]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:42:53 np0005548788.localdomain podman[83040]: 2025-12-06 08:42:53.289280475 +0000 UTC m=+0.101414791 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:42:53 np0005548788.localdomain podman[83039]: 2025-12-06 08:42:53.348838275 +0000 UTC m=+0.163304333 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, maintainer=OpenStack 
TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1)
Dec 06 08:42:53 np0005548788.localdomain podman[83040]: 2025-12-06 08:42:53.4017956 +0000 UTC m=+0.213929956 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64)
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: tmp-crun.vcNo9V.mount: Deactivated successfully.
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:42:53 np0005548788.localdomain podman[83041]: 2025-12-06 08:42:53.431568435 +0000 UTC m=+0.240175571 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 
17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:42:53 np0005548788.localdomain podman[83041]: 2025-12-06 08:42:53.438754038 +0000 UTC m=+0.247361194 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1)
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:42:53 np0005548788.localdomain podman[83042]: 2025-12-06 08:42:53.408149478 +0000 UTC m=+0.212663697 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=)
Dec 06 08:42:53 np0005548788.localdomain podman[83038]: 2025-12-06 08:42:53.486870043 +0000 UTC m=+0.301880588 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, 
name=rhosp17/openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 06 08:42:53 np0005548788.localdomain podman[83038]: 2025-12-06 08:42:53.498453283 +0000 UTC m=+0.313463858 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:42:53 np0005548788.localdomain podman[83039]: 2025-12-06 08:42:53.513749167 +0000 UTC m=+0.328215215 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack 
TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, architecture=x86_64)
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:42:53 np0005548788.localdomain podman[83042]: 2025-12-06 08:42:53.541806549 +0000 UTC m=+0.346320728 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:42:53 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:42:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:42:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:42:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:42:56 np0005548788.localdomain podman[83147]: 2025-12-06 08:42:56.248731189 +0000 UTC m=+0.078866381 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public)
Dec 06 08:42:56 np0005548788.localdomain systemd[1]: tmp-crun.ypMNOE.mount: Deactivated successfully.
Dec 06 08:42:56 np0005548788.localdomain podman[83148]: 2025-12-06 08:42:56.318617049 +0000 UTC m=+0.140621169 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=)
Dec 06 08:42:56 np0005548788.localdomain podman[83147]: 2025-12-06 08:42:56.372806173 +0000 UTC m=+0.202941425 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044)
Dec 06 08:42:56 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:42:56 np0005548788.localdomain podman[83149]: 2025-12-06 08:42:56.376555769 +0000 UTC m=+0.195429482 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:42:56 np0005548788.localdomain podman[83149]: 2025-12-06 08:42:56.459589888 +0000 UTC m=+0.278463601 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044)
Dec 06 08:42:56 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:42:56 np0005548788.localdomain podman[83148]: 2025-12-06 08:42:56.725522518 +0000 UTC m=+0.547526598 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, architecture=x86_64, container_name=nova_migration_target, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044)
Dec 06 08:42:56 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:42:57 np0005548788.localdomain systemd[1]: tmp-crun.5Y0hMi.mount: Deactivated successfully.
Dec 06 08:43:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:43:00 np0005548788.localdomain podman[83218]: 2025-12-06 08:43:00.262599592 +0000 UTC m=+0.090176612 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:43:00 np0005548788.localdomain podman[83218]: 2025-12-06 08:43:00.48463859 +0000 UTC m=+0.312215610 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=metrics_qdr, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:43:00 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:43:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:43:03 np0005548788.localdomain podman[83248]: 2025-12-06 08:43:03.272823172 +0000 UTC m=+0.099959126 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Dec 06 08:43:03 np0005548788.localdomain podman[83248]: 2025-12-06 08:43:03.328745649 +0000 UTC m=+0.155881603 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute)
Dec 06 08:43:03 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:43:22 np0005548788.localdomain sshd[83274]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:43:23 np0005548788.localdomain sshd[83274]: Received disconnect from 152.32.172.117 port 56796:11: Bye Bye [preauth]
Dec 06 08:43:23 np0005548788.localdomain sshd[83274]: Disconnected from authenticating user root 152.32.172.117 port 56796 [preauth]
Dec 06 08:43:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:43:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:43:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:43:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:43:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:43:23 np0005548788.localdomain systemd[1]: tmp-crun.9B7Wyw.mount: Deactivated successfully.
Dec 06 08:43:23 np0005548788.localdomain podman[83276]: 2025-12-06 08:43:23.971434786 +0000 UTC m=+0.097563941 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:43:24 np0005548788.localdomain systemd[1]: tmp-crun.XHWI0v.mount: Deactivated successfully.
Dec 06 08:43:24 np0005548788.localdomain podman[83277]: 2025-12-06 08:43:24.032266225 +0000 UTC m=+0.154992175 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 06 08:43:24 np0005548788.localdomain podman[83279]: 2025-12-06 08:43:24.077432978 +0000 UTC m=+0.193147060 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step3)
Dec 06 08:43:24 np0005548788.localdomain podman[83279]: 2025-12-06 08:43:24.115686356 +0000 UTC m=+0.231400428 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid)
Dec 06 08:43:24 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:43:24 np0005548788.localdomain podman[83278]: 2025-12-06 08:43:24.137184884 +0000 UTC m=+0.255298100 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:43:24 np0005548788.localdomain podman[83277]: 2025-12-06 08:43:24.145591176 +0000 UTC m=+0.268317166 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_id=tripleo_step3, name=rhosp17/openstack-collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:43:24 np0005548788.localdomain podman[83276]: 2025-12-06 08:43:24.155770271 +0000 UTC m=+0.281899456 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=logrotate_crond, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, io.openshift.expose-services=)
Dec 06 08:43:24 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:43:24 np0005548788.localdomain podman[83278]: 2025-12-06 08:43:24.171747498 +0000 UTC m=+0.289860674 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Dec 06 08:43:24 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:43:24 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:43:24 np0005548788.localdomain podman[83285]: 2025-12-06 08:43:24.239330847 +0000 UTC m=+0.352876572 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Dec 06 08:43:24 np0005548788.localdomain podman[83285]: 2025-12-06 08:43:24.273490848 +0000 UTC m=+0.387036503 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Dec 06 08:43:24 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:43:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:43:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:43:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:43:27 np0005548788.localdomain podman[83414]: 2025-12-06 08:43:27.280834648 +0000 UTC m=+0.098650375 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Dec 06 08:43:27 np0005548788.localdomain podman[83416]: 2025-12-06 08:43:27.327875419 +0000 UTC m=+0.139502324 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc.)
Dec 06 08:43:27 np0005548788.localdomain podman[83414]: 2025-12-06 08:43:27.337594961 +0000 UTC m=+0.155410718 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 08:43:27 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:43:27 np0005548788.localdomain podman[83416]: 2025-12-06 08:43:27.383048413 +0000 UTC m=+0.194675298 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Dec 06 08:43:27 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:43:27 np0005548788.localdomain podman[83415]: 2025-12-06 08:43:27.398270476 +0000 UTC m=+0.214170694 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:43:27 np0005548788.localdomain podman[83415]: 2025-12-06 08:43:27.78346994 +0000 UTC m=+0.599370188 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4)
Dec 06 08:43:27 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:43:28 np0005548788.localdomain systemd[1]: tmp-crun.nH8uPU.mount: Deactivated successfully.
Dec 06 08:43:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:43:31 np0005548788.localdomain podman[83504]: 2025-12-06 08:43:31.268648 +0000 UTC m=+0.098354775 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible)
Dec 06 08:43:31 np0005548788.localdomain podman[83504]: 2025-12-06 08:43:31.491530104 +0000 UTC m=+0.321236879 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, release=1761123044, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, container_name=metrics_qdr, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:43:31 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:43:33 np0005548788.localdomain sshd[82234]: Received disconnect from 38.102.83.114 port 49840:11: disconnected by user
Dec 06 08:43:33 np0005548788.localdomain sshd[82234]: Disconnected from user zuul 38.102.83.114 port 49840
Dec 06 08:43:33 np0005548788.localdomain sshd[82231]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:43:33 np0005548788.localdomain systemd[1]: session-35.scope: Deactivated successfully.
Dec 06 08:43:33 np0005548788.localdomain systemd[1]: session-35.scope: Consumed 6.521s CPU time.
Dec 06 08:43:33 np0005548788.localdomain systemd-logind[765]: Session 35 logged out. Waiting for processes to exit.
Dec 06 08:43:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:43:33 np0005548788.localdomain systemd-logind[765]: Removed session 35.
Dec 06 08:43:33 np0005548788.localdomain podman[83533]: 2025-12-06 08:43:33.864715697 +0000 UTC m=+0.085118185 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, architecture=x86_64, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12)
Dec 06 08:43:33 np0005548788.localdomain podman[83533]: 2025-12-06 08:43:33.896614818 +0000 UTC m=+0.117017296 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team)
Dec 06 08:43:33 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:43:48 np0005548788.localdomain sudo[83559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:43:48 np0005548788.localdomain sudo[83559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:43:48 np0005548788.localdomain sudo[83559]: pam_unix(sudo:session): session closed for user root
Dec 06 08:43:48 np0005548788.localdomain sudo[83574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:43:48 np0005548788.localdomain sudo[83574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:43:49 np0005548788.localdomain sudo[83574]: pam_unix(sudo:session): session closed for user root
Dec 06 08:43:49 np0005548788.localdomain sudo[83620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:43:49 np0005548788.localdomain sudo[83620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:43:49 np0005548788.localdomain sudo[83620]: pam_unix(sudo:session): session closed for user root
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:43:54 np0005548788.localdomain podman[83636]: 2025-12-06 08:43:54.27000823 +0000 UTC m=+0.098230862 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, version=17.1.12, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:43:54 np0005548788.localdomain podman[83637]: 2025-12-06 08:43:54.324256185 +0000 UTC m=+0.148852215 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:43:54 np0005548788.localdomain podman[83637]: 2025-12-06 08:43:54.334705499 +0000 UTC m=+0.159301599 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:43:54 np0005548788.localdomain podman[83636]: 2025-12-06 08:43:54.395696244 +0000 UTC m=+0.223918916 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044)
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:43:54 np0005548788.localdomain podman[83667]: 2025-12-06 08:43:54.39945246 +0000 UTC m=+0.092264496 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 08:43:54 np0005548788.localdomain podman[83666]: 2025-12-06 08:43:54.454271953 +0000 UTC m=+0.151115144 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, container_name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, distribution-scope=public, name=rhosp17/openstack-cron)
Dec 06 08:43:54 np0005548788.localdomain podman[83669]: 2025-12-06 08:43:54.516450995 +0000 UTC m=+0.201040876 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, architecture=x86_64, vcs-type=git, url=https://www.redhat.com)
Dec 06 08:43:54 np0005548788.localdomain podman[83667]: 2025-12-06 08:43:54.531989148 +0000 UTC m=+0.224801144 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4)
Dec 06 08:43:54 np0005548788.localdomain podman[83666]: 2025-12-06 08:43:54.540664137 +0000 UTC m=+0.237507328 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public)
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:43:54 np0005548788.localdomain podman[83669]: 2025-12-06 08:43:54.57778953 +0000 UTC m=+0.262379371 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Dec 06 08:43:54 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:43:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:43:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:43:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:43:58 np0005548788.localdomain systemd[1]: tmp-crun.cj65R3.mount: Deactivated successfully.
Dec 06 08:43:58 np0005548788.localdomain podman[83747]: 2025-12-06 08:43:58.239790654 +0000 UTC m=+0.070086388 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:43:58 np0005548788.localdomain podman[83746]: 2025-12-06 08:43:58.303486132 +0000 UTC m=+0.133983472 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Dec 06 08:43:58 np0005548788.localdomain podman[83748]: 2025-12-06 08:43:58.271309042 +0000 UTC m=+0.095880399 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:43:58 np0005548788.localdomain podman[83748]: 2025-12-06 08:43:58.355040613 +0000 UTC m=+0.179611940 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:43:58 np0005548788.localdomain podman[83746]: 2025-12-06 08:43:58.369567764 +0000 UTC m=+0.200065054 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, release=1761123044, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 06 08:43:58 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:43:58 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:43:58 np0005548788.localdomain podman[83747]: 2025-12-06 08:43:58.628701384 +0000 UTC m=+0.458997128 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:43:58 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:44:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:44:02 np0005548788.localdomain systemd[1]: tmp-crun.WcFmKj.mount: Deactivated successfully.
Dec 06 08:44:02 np0005548788.localdomain podman[83811]: 2025-12-06 08:44:02.271716109 +0000 UTC m=+0.097994925 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, config_id=tripleo_step1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team)
Dec 06 08:44:02 np0005548788.localdomain podman[83811]: 2025-12-06 08:44:02.505884333 +0000 UTC m=+0.332163139 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Dec 06 08:44:02 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:44:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:44:04 np0005548788.localdomain podman[83840]: 2025-12-06 08:44:04.25010271 +0000 UTC m=+0.081911017 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:44:04 np0005548788.localdomain podman[83840]: 2025-12-06 08:44:04.286724577 +0000 UTC m=+0.118532894 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-type=git, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Dec 06 08:44:04 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:44:21 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:44:21 np0005548788.localdomain recover_tripleo_nova_virtqemud[83868]: 62021
Dec 06 08:44:21 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:44:21 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: tmp-crun.7VhZqt.mount: Deactivated successfully.
Dec 06 08:44:25 np0005548788.localdomain podman[83873]: 2025-12-06 08:44:25.300654306 +0000 UTC m=+0.112696491 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:44:25 np0005548788.localdomain podman[83870]: 2025-12-06 08:44:25.350426522 +0000 UTC m=+0.168127083 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:44:25 np0005548788.localdomain podman[83869]: 2025-12-06 08:44:25.393326404 +0000 UTC m=+0.215908157 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, tcib_managed=true, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:44:25 np0005548788.localdomain podman[83869]: 2025-12-06 08:44:25.402581271 +0000 UTC m=+0.225162994 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:44:25 np0005548788.localdomain podman[83870]: 2025-12-06 08:44:25.41025573 +0000 UTC m=+0.227956241 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, container_name=collectd, vcs-type=git, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:44:25 np0005548788.localdomain podman[83873]: 2025-12-06 08:44:25.482100962 +0000 UTC m=+0.294143107 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:44:25 np0005548788.localdomain podman[83878]: 2025-12-06 08:44:25.499411659 +0000 UTC m=+0.309339359 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, architecture=x86_64, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:44:25 np0005548788.localdomain podman[83871]: 2025-12-06 08:44:25.559061202 +0000 UTC m=+0.375258497 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:44:25 np0005548788.localdomain podman[83878]: 2025-12-06 08:44:25.613569525 +0000 UTC m=+0.423497215 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, 
build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true)
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:44:25 np0005548788.localdomain podman[83871]: 2025-12-06 08:44:25.666332484 +0000 UTC m=+0.482529829 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:44:25 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:44:26 np0005548788.localdomain systemd[1]: tmp-crun.2c8Hff.mount: Deactivated successfully.
Dec 06 08:44:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:44:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:44:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:44:29 np0005548788.localdomain systemd[1]: tmp-crun.ftu8Dh.mount: Deactivated successfully.
Dec 06 08:44:29 np0005548788.localdomain podman[84024]: 2025-12-06 08:44:29.262880205 +0000 UTC m=+0.088877101 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:44:29 np0005548788.localdomain podman[84026]: 2025-12-06 08:44:29.309136572 +0000 UTC m=+0.127881233 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:44:29 np0005548788.localdomain podman[84026]: 2025-12-06 08:44:29.335071258 +0000 UTC m=+0.153815979 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, architecture=x86_64)
Dec 06 08:44:29 np0005548788.localdomain podman[84025]: 2025-12-06 08:44:29.37087776 +0000 UTC m=+0.192044516 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target)
Dec 06 08:44:29 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:44:29 np0005548788.localdomain podman[84024]: 2025-12-06 08:44:29.439859752 +0000 UTC m=+0.265856728 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z)
Dec 06 08:44:29 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:44:29 np0005548788.localdomain podman[84025]: 2025-12-06 08:44:29.75174614 +0000 UTC m=+0.572912976 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 08:44:29 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:44:31 np0005548788.localdomain sshd[84093]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:44:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:44:33 np0005548788.localdomain podman[84095]: 2025-12-06 08:44:33.264401385 +0000 UTC m=+0.088463199 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 06 08:44:33 np0005548788.localdomain podman[84095]: 2025-12-06 08:44:33.51149522 +0000 UTC m=+0.335557044 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:44:33 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:44:33 np0005548788.localdomain sshd[84093]: Received disconnect from 152.32.172.117 port 36162:11: Bye Bye [preauth]
Dec 06 08:44:33 np0005548788.localdomain sshd[84093]: Disconnected from authenticating user root 152.32.172.117 port 36162 [preauth]
Dec 06 08:44:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:44:35 np0005548788.localdomain podman[84125]: 2025-12-06 08:44:35.265676066 +0000 UTC m=+0.083709301 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-nova-compute-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:44:35 np0005548788.localdomain podman[84125]: 2025-12-06 08:44:35.298230098 +0000 UTC m=+0.116263363 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute)
Dec 06 08:44:35 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:44:50 np0005548788.localdomain sudo[84151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:44:50 np0005548788.localdomain sudo[84151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:44:50 np0005548788.localdomain sudo[84151]: pam_unix(sudo:session): session closed for user root
Dec 06 08:44:50 np0005548788.localdomain sudo[84166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:44:50 np0005548788.localdomain sudo[84166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:44:50 np0005548788.localdomain sudo[84166]: pam_unix(sudo:session): session closed for user root
Dec 06 08:44:51 np0005548788.localdomain sudo[84214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:44:51 np0005548788.localdomain sudo[84214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:44:51 np0005548788.localdomain sudo[84214]: pam_unix(sudo:session): session closed for user root
Dec 06 08:44:53 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: tmp-crun.E8JG6Z.mount: Deactivated successfully.
Dec 06 08:44:56 np0005548788.localdomain podman[84230]: 2025-12-06 08:44:56.263755373 +0000 UTC m=+0.092271237 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:44:56 np0005548788.localdomain podman[84230]: 2025-12-06 08:44:56.279471411 +0000 UTC m=+0.107987235 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:44:56 np0005548788.localdomain podman[84231]: 2025-12-06 08:44:56.337909736 +0000 UTC m=+0.162658283 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, container_name=collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-collectd, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:44:56 np0005548788.localdomain podman[84232]: 2025-12-06 08:44:56.375492534 +0000 UTC m=+0.196752793 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, 
vendor=Red Hat, Inc., version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z)
Dec 06 08:44:56 np0005548788.localdomain podman[84232]: 2025-12-06 08:44:56.411485511 +0000 UTC m=+0.232745770 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, architecture=x86_64, vcs-type=git, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4)
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:44:56 np0005548788.localdomain podman[84244]: 2025-12-06 08:44:56.427328034 +0000 UTC m=+0.241277495 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, architecture=x86_64, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:44:56 np0005548788.localdomain podman[84238]: 2025-12-06 08:44:56.388191688 +0000 UTC m=+0.205757921 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vcs-type=git, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 08:44:56 np0005548788.localdomain podman[84244]: 2025-12-06 08:44:56.461601278 +0000 UTC m=+0.275550809 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, config_id=tripleo_step4, version=17.1.12)
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:44:56 np0005548788.localdomain podman[84238]: 2025-12-06 08:44:56.518855567 +0000 UTC m=+0.336421790 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12)
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:44:56 np0005548788.localdomain podman[84231]: 2025-12-06 08:44:56.57402266 +0000 UTC m=+0.398771167 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-collectd-container, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 06 08:44:56 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:44:57 np0005548788.localdomain systemd[1]: tmp-crun.nYsx78.mount: Deactivated successfully.
Dec 06 08:45:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:45:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:45:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:45:00 np0005548788.localdomain podman[84339]: 2025-12-06 08:45:00.27251939 +0000 UTC m=+0.095867819 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:45:00 np0005548788.localdomain podman[84339]: 2025-12-06 08:45:00.333618977 +0000 UTC m=+0.156967416 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 08:45:00 np0005548788.localdomain systemd[1]: tmp-crun.jos3n6.mount: Deactivated successfully.
Dec 06 08:45:00 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:45:00 np0005548788.localdomain podman[84340]: 2025-12-06 08:45:00.378680948 +0000 UTC m=+0.197186046 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, container_name=nova_migration_target, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git)
Dec 06 08:45:00 np0005548788.localdomain podman[84341]: 2025-12-06 08:45:00.347795069 +0000 UTC m=+0.164425499 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true)
Dec 06 08:45:00 np0005548788.localdomain podman[84341]: 2025-12-06 08:45:00.434824612 +0000 UTC m=+0.251455082 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, 
container_name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 06 08:45:00 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:45:00 np0005548788.localdomain podman[84340]: 2025-12-06 08:45:00.767353139 +0000 UTC m=+0.585858287 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 
17.1_20251118.1, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4)
Dec 06 08:45:00 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:45:01 np0005548788.localdomain systemd[1]: tmp-crun.V325rL.mount: Deactivated successfully.
Dec 06 08:45:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:45:04 np0005548788.localdomain podman[84410]: 2025-12-06 08:45:04.261510501 +0000 UTC m=+0.088467279 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:45:04 np0005548788.localdomain podman[84410]: 2025-12-06 08:45:04.498647196 +0000 UTC m=+0.325603964 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, version=17.1.12, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 06 08:45:04 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:45:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:45:06 np0005548788.localdomain systemd[1]: tmp-crun.Vc9bWv.mount: Deactivated successfully.
Dec 06 08:45:06 np0005548788.localdomain podman[84439]: 2025-12-06 08:45:06.250690036 +0000 UTC m=+0.079727397 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:45:06 np0005548788.localdomain podman[84439]: 2025-12-06 08:45:06.30587268 +0000 UTC m=+0.134909971 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step5)
Dec 06 08:45:06 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:45:27 np0005548788.localdomain podman[84467]: 2025-12-06 08:45:27.268769583 +0000 UTC m=+0.092500794 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 06 08:45:27 np0005548788.localdomain podman[84467]: 2025-12-06 08:45:27.306543937 +0000 UTC m=+0.130275158 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: tmp-crun.DyLsnI.mount: Deactivated successfully.
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:45:27 np0005548788.localdomain podman[84466]: 2025-12-06 08:45:27.322388539 +0000 UTC m=+0.149122694 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-cron-container)
Dec 06 08:45:27 np0005548788.localdomain podman[84468]: 2025-12-06 08:45:27.379121681 +0000 UTC m=+0.199544169 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, architecture=x86_64, 
managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 08:45:27 np0005548788.localdomain podman[84468]: 2025-12-06 08:45:27.408768791 +0000 UTC m=+0.229191269 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:45:27 np0005548788.localdomain podman[84475]: 2025-12-06 08:45:27.485684511 +0000 UTC m=+0.300057191 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:45:27 np0005548788.localdomain podman[84466]: 2025-12-06 08:45:27.506355533 +0000 UTC m=+0.333089678 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Dec 06 08:45:27 np0005548788.localdomain podman[84475]: 2025-12-06 08:45:27.516958112 +0000 UTC m=+0.331330752 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:45:27 np0005548788.localdomain podman[84469]: 2025-12-06 08:45:27.590001871 +0000 UTC m=+0.407334513 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, 
architecture=x86_64, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4)
Dec 06 08:45:27 np0005548788.localdomain podman[84469]: 2025-12-06 08:45:27.627741894 +0000 UTC m=+0.445074526 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:45:27 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:45:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:45:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:45:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:45:31 np0005548788.localdomain podman[84618]: 2025-12-06 08:45:31.245004579 +0000 UTC m=+0.070636305 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:45:31 np0005548788.localdomain podman[84617]: 2025-12-06 08:45:31.295363403 +0000 UTC m=+0.118232173 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, distribution-scope=public)
Dec 06 08:45:31 np0005548788.localdomain podman[84617]: 2025-12-06 08:45:31.326324394 +0000 UTC m=+0.149193184 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc.)
Dec 06 08:45:31 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:45:31 np0005548788.localdomain systemd[1]: tmp-crun.LiXjia.mount: Deactivated successfully.
Dec 06 08:45:31 np0005548788.localdomain podman[84619]: 2025-12-06 08:45:31.406405041 +0000 UTC m=+0.228616831 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, version=17.1.12, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:45:31 np0005548788.localdomain podman[84619]: 2025-12-06 08:45:31.431641665 +0000 UTC m=+0.253853435 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:45:31 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:45:31 np0005548788.localdomain podman[84618]: 2025-12-06 08:45:31.603592627 +0000 UTC m=+0.429224323 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vendor=Red Hat, Inc., container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:45:31 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:45:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:45:35 np0005548788.localdomain podman[84686]: 2025-12-06 08:45:35.249545042 +0000 UTC m=+0.082083741 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git)
Dec 06 08:45:35 np0005548788.localdomain podman[84686]: 2025-12-06 08:45:35.451552676 +0000 UTC m=+0.284091365 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1)
Dec 06 08:45:35 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:45:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:45:37 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:45:37 np0005548788.localdomain recover_tripleo_nova_virtqemud[84722]: 62021
Dec 06 08:45:37 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:45:37 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:45:37 np0005548788.localdomain podman[84715]: 2025-12-06 08:45:37.266446808 +0000 UTC m=+0.096172718 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 08:45:37 np0005548788.localdomain podman[84715]: 2025-12-06 08:45:37.298619528 +0000 UTC m=+0.128345478 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1761123044, tcib_managed=true)
Dec 06 08:45:37 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:45:42 np0005548788.localdomain sshd[84743]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:45:44 np0005548788.localdomain sshd[84743]: Received disconnect from 152.32.172.117 port 53898:11: Bye Bye [preauth]
Dec 06 08:45:44 np0005548788.localdomain sshd[84743]: Disconnected from authenticating user root 152.32.172.117 port 53898 [preauth]
Dec 06 08:45:51 np0005548788.localdomain systemd[1]: Starting dnf makecache...
Dec 06 08:45:51 np0005548788.localdomain dnf[84745]: Updating Subscription Management repositories.
Dec 06 08:45:51 np0005548788.localdomain sudo[84746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:45:51 np0005548788.localdomain sudo[84746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:45:51 np0005548788.localdomain sudo[84746]: pam_unix(sudo:session): session closed for user root
Dec 06 08:45:51 np0005548788.localdomain sudo[84761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:45:51 np0005548788.localdomain sudo[84761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:45:52 np0005548788.localdomain sudo[84761]: pam_unix(sudo:session): session closed for user root
Dec 06 08:45:53 np0005548788.localdomain sudo[84809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:45:53 np0005548788.localdomain sudo[84809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:45:53 np0005548788.localdomain sudo[84809]: pam_unix(sudo:session): session closed for user root
Dec 06 08:45:53 np0005548788.localdomain dnf[84745]: Metadata cache refreshed recently.
Dec 06 08:45:53 np0005548788.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 06 08:45:53 np0005548788.localdomain systemd[1]: Finished dnf makecache.
Dec 06 08:45:53 np0005548788.localdomain systemd[1]: dnf-makecache.service: Consumed 2.228s CPU time.
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:45:58 np0005548788.localdomain podman[84826]: 2025-12-06 08:45:58.293161935 +0000 UTC m=+0.105725166 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git)
Dec 06 08:45:58 np0005548788.localdomain podman[84835]: 2025-12-06 08:45:58.354641244 +0000 UTC m=+0.158405652 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:12:45Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:45:58 np0005548788.localdomain podman[84824]: 2025-12-06 08:45:58.268005453 +0000 UTC m=+0.085569139 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
distribution-scope=public, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4)
Dec 06 08:45:58 np0005548788.localdomain podman[84824]: 2025-12-06 08:45:58.401589543 +0000 UTC m=+0.219153219 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-cron, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:45:58 np0005548788.localdomain podman[84835]: 2025-12-06 08:45:58.411336595 +0000 UTC m=+0.215100993 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:45:58 np0005548788.localdomain podman[84827]: 2025-12-06 08:45:58.415843255 +0000 UTC m=+0.224791433 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, build-date=2025-11-18T23:44:13Z)
Dec 06 08:45:58 np0005548788.localdomain podman[84826]: 2025-12-06 08:45:58.426706003 +0000 UTC m=+0.239269254 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:45:58 np0005548788.localdomain podman[84825]: 2025-12-06 08:45:58.330943928 +0000 UTC m=+0.147275765 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible)
Dec 06 08:45:58 np0005548788.localdomain podman[84827]: 2025-12-06 08:45:58.498639506 +0000 UTC m=+0.307587684 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:45:58 np0005548788.localdomain podman[84825]: 2025-12-06 08:45:58.517928426 +0000 UTC m=+0.334260263 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container)
Dec 06 08:45:58 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:45:59 np0005548788.localdomain systemd[1]: tmp-crun.4cwyWt.mount: Deactivated successfully.
Dec 06 08:46:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:46:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:46:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:46:02 np0005548788.localdomain systemd[1]: tmp-crun.zHu6ld.mount: Deactivated successfully.
Dec 06 08:46:02 np0005548788.localdomain podman[84931]: 2025-12-06 08:46:02.261030044 +0000 UTC m=+0.089218239 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 06 08:46:02 np0005548788.localdomain podman[84933]: 2025-12-06 08:46:02.320262767 +0000 UTC m=+0.139194306 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:46:02 np0005548788.localdomain podman[84931]: 2025-12-06 08:46:02.374795802 +0000 UTC m=+0.202983997 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git)
Dec 06 08:46:02 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:46:02 np0005548788.localdomain podman[84932]: 2025-12-06 08:46:02.393535432 +0000 UTC m=+0.213730290 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Dec 06 08:46:02 np0005548788.localdomain podman[84933]: 2025-12-06 08:46:02.445851429 +0000 UTC m=+0.264782928 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:46:02 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:46:02 np0005548788.localdomain podman[84932]: 2025-12-06 08:46:02.734812045 +0000 UTC m=+0.555006893 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:46:02 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:46:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:46:06 np0005548788.localdomain systemd[1]: tmp-crun.ZK2km0.mount: Deactivated successfully.
Dec 06 08:46:06 np0005548788.localdomain podman[85002]: 2025-12-06 08:46:06.271349816 +0000 UTC m=+0.099025792 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Dec 06 08:46:06 np0005548788.localdomain podman[85002]: 2025-12-06 08:46:06.446849754 +0000 UTC m=+0.274525700 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64)
Dec 06 08:46:06 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:46:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:46:08 np0005548788.localdomain podman[85031]: 2025-12-06 08:46:08.412383548 +0000 UTC m=+0.239802815 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com)
Dec 06 08:46:08 np0005548788.localdomain podman[85031]: 2025-12-06 08:46:08.469770033 +0000 UTC m=+0.297189280 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:46:08 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:46:27 np0005548788.localdomain sshd[85057]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:46:29 np0005548788.localdomain podman[85059]: 2025-12-06 08:46:29.279359731 +0000 UTC m=+0.103741858 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4)
Dec 06 08:46:29 np0005548788.localdomain podman[85059]: 2025-12-06 08:46:29.285988946 +0000 UTC m=+0.110371073 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, container_name=logrotate_crond, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 06 08:46:29 np0005548788.localdomain sshd[85057]: Received disconnect from 36.50.177.119 port 37404:11: Bye Bye [preauth]
Dec 06 08:46:29 np0005548788.localdomain sshd[85057]: Disconnected from authenticating user root 36.50.177.119 port 37404 [preauth]
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: tmp-crun.TkrxoJ.mount: Deactivated successfully.
Dec 06 08:46:29 np0005548788.localdomain podman[85061]: 2025-12-06 08:46:29.385184124 +0000 UTC m=+0.202241014 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Dec 06 08:46:29 np0005548788.localdomain podman[85068]: 2025-12-06 08:46:29.34302664 +0000 UTC m=+0.155004374 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com)
Dec 06 08:46:29 np0005548788.localdomain podman[85068]: 2025-12-06 08:46:29.431045331 +0000 UTC m=+0.243023105 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, vcs-type=git)
Dec 06 08:46:29 np0005548788.localdomain podman[85062]: 2025-12-06 08:46:29.44067265 +0000 UTC m=+0.255940926 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:46:29 np0005548788.localdomain podman[85062]: 2025-12-06 08:46:29.475812236 +0000 UTC m=+0.291080562 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, version=17.1.12, managed_by=tripleo_ansible, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:46:29 np0005548788.localdomain podman[85060]: 2025-12-06 08:46:29.490875121 +0000 UTC m=+0.310942595 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:46:29 np0005548788.localdomain podman[85061]: 2025-12-06 08:46:29.497823487 +0000 UTC m=+0.314880317 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, 
config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:46:29 np0005548788.localdomain podman[85060]: 2025-12-06 08:46:29.551793585 +0000 UTC m=+0.371861009 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Dec 06 08:46:29 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:46:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:46:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:46:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:46:33 np0005548788.localdomain podman[85217]: 2025-12-06 08:46:33.924619916 +0000 UTC m=+0.078682304 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, 
batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:46:33 np0005548788.localdomain systemd[1]: tmp-crun.JrPaHh.mount: Deactivated successfully.
Dec 06 08:46:33 np0005548788.localdomain podman[85218]: 2025-12-06 08:46:33.976825811 +0000 UTC m=+0.129754504 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:36:58Z, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:46:33 np0005548788.localdomain podman[85217]: 2025-12-06 08:46:33.985261402 +0000 UTC m=+0.139323760 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:46:33 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:46:34 np0005548788.localdomain podman[85219]: 2025-12-06 08:46:34.035551687 +0000 UTC m=+0.184767615 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:46:34 np0005548788.localdomain podman[85219]: 2025-12-06 08:46:34.081419755 +0000 UTC m=+0.230635703 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:46:34 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:46:34 np0005548788.localdomain podman[85218]: 2025-12-06 08:46:34.324296535 +0000 UTC m=+0.477225238 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:46:34 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:46:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:46:37 np0005548788.localdomain podman[85287]: 2025-12-06 08:46:37.288413726 +0000 UTC m=+0.116605265 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z)
Dec 06 08:46:37 np0005548788.localdomain podman[85287]: 2025-12-06 08:46:37.484744207 +0000 UTC m=+0.312935786 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc.)
Dec 06 08:46:37 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:46:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:46:39 np0005548788.localdomain podman[85316]: 2025-12-06 08:46:39.248621758 +0000 UTC m=+0.075404653 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, architecture=x86_64)
Dec 06 08:46:39 np0005548788.localdomain podman[85316]: 2025-12-06 08:46:39.277259823 +0000 UTC m=+0.104042718 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:46:39 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:46:51 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:46:51 np0005548788.localdomain recover_tripleo_nova_virtqemud[85341]: 62021
Dec 06 08:46:51 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:46:51 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:46:53 np0005548788.localdomain sudo[85342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:46:53 np0005548788.localdomain sudo[85342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:46:53 np0005548788.localdomain sudo[85342]: pam_unix(sudo:session): session closed for user root
Dec 06 08:46:53 np0005548788.localdomain sudo[85357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:46:53 np0005548788.localdomain sudo[85357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:46:54 np0005548788.localdomain sudo[85357]: pam_unix(sudo:session): session closed for user root
Dec 06 08:46:54 np0005548788.localdomain sudo[85404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:46:54 np0005548788.localdomain sudo[85404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:46:54 np0005548788.localdomain sudo[85404]: pam_unix(sudo:session): session closed for user root
Dec 06 08:46:57 np0005548788.localdomain sshd[85419]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:46:58 np0005548788.localdomain sshd[85419]: Received disconnect from 152.32.172.117 port 58002:11: Bye Bye [preauth]
Dec 06 08:46:58 np0005548788.localdomain sshd[85419]: Disconnected from authenticating user root 152.32.172.117 port 58002 [preauth]
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:47:00 np0005548788.localdomain podman[85423]: 2025-12-06 08:47:00.266095929 +0000 UTC m=+0.089036275 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:11:48Z)
Dec 06 08:47:00 np0005548788.localdomain podman[85421]: 2025-12-06 08:47:00.311946457 +0000 UTC m=+0.134499660 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, name=rhosp17/openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:47:00 np0005548788.localdomain podman[85421]: 2025-12-06 08:47:00.321729679 +0000 UTC m=+0.144282852 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Dec 06 08:47:00 np0005548788.localdomain podman[85434]: 2025-12-06 08:47:00.286241221 +0000 UTC m=+0.095182864 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi)
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:47:00 np0005548788.localdomain podman[85434]: 2025-12-06 08:47:00.36539269 +0000 UTC m=+0.174334283 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_ipmi)
Dec 06 08:47:00 np0005548788.localdomain podman[85423]: 2025-12-06 08:47:00.37288041 +0000 UTC m=+0.195820756 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:47:00 np0005548788.localdomain podman[85424]: 2025-12-06 08:47:00.382582021 +0000 UTC m=+0.196314171 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:47:00 np0005548788.localdomain podman[85424]: 2025-12-06 08:47:00.393368475 +0000 UTC m=+0.207100655 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:47:00 np0005548788.localdomain podman[85422]: 2025-12-06 08:47:00.472576093 +0000 UTC m=+0.294484407 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z)
Dec 06 08:47:00 np0005548788.localdomain podman[85422]: 2025-12-06 08:47:00.481797848 +0000 UTC m=+0.303706171 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd)
Dec 06 08:47:00 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:47:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:47:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:47:04 np0005548788.localdomain podman[85534]: 2025-12-06 08:47:04.261909853 +0000 UTC m=+0.085742523 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.buildah.version=1.41.4)
Dec 06 08:47:04 np0005548788.localdomain podman[85533]: 2025-12-06 08:47:04.327640975 +0000 UTC m=+0.152921400 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, batch=17.1_20251118.1)
Dec 06 08:47:04 np0005548788.localdomain podman[85534]: 2025-12-06 08:47:04.341886185 +0000 UTC m=+0.165718805 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, konflux.additional-tags=17.1.12 
17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 06 08:47:04 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:47:04 np0005548788.localdomain podman[85533]: 2025-12-06 08:47:04.377831227 +0000 UTC m=+0.203111592 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Dec 06 08:47:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:47:04 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:47:04 np0005548788.localdomain podman[85578]: 2025-12-06 08:47:04.477710146 +0000 UTC m=+0.086689632 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 
nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:47:04 np0005548788.localdomain podman[85578]: 2025-12-06 08:47:04.854783894 +0000 UTC m=+0.463763470 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1)
Dec 06 08:47:04 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:47:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:47:08 np0005548788.localdomain podman[85601]: 2025-12-06 08:47:08.265408394 +0000 UTC m=+0.090754908 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Dec 06 08:47:08 np0005548788.localdomain podman[85601]: 2025-12-06 08:47:08.48099308 +0000 UTC m=+0.306339554 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
build-date=2025-11-18T22:49:46Z, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:47:08 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:47:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:47:10 np0005548788.localdomain podman[85629]: 2025-12-06 08:47:10.268705987 +0000 UTC m=+0.092071938 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Dec 06 08:47:10 np0005548788.localdomain podman[85629]: 2025-12-06 08:47:10.298348073 +0000 UTC m=+0.121714054 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, release=1761123044, config_id=tripleo_step5, tcib_managed=true, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4)
Dec 06 08:47:10 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:47:12 np0005548788.localdomain sshd[85655]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:47:13 np0005548788.localdomain sshd[85655]: Received disconnect from 45.78.219.195 port 40566:11: Bye Bye [preauth]
Dec 06 08:47:13 np0005548788.localdomain sshd[85655]: Disconnected from authenticating user root 45.78.219.195 port 40566 [preauth]
Dec 06 08:47:16 np0005548788.localdomain sshd[85658]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:47:17 np0005548788.localdomain sshd[85658]: Received disconnect from 179.43.189.36 port 41178:11: Bye Bye [preauth]
Dec 06 08:47:17 np0005548788.localdomain sshd[85658]: Disconnected from authenticating user root 179.43.189.36 port 41178 [preauth]
Dec 06 08:47:26 np0005548788.localdomain sshd[85660]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:47:28 np0005548788.localdomain sshd[85660]: Received disconnect from 101.47.142.76 port 34262:11: Bye Bye [preauth]
Dec 06 08:47:28 np0005548788.localdomain sshd[85660]: Disconnected from authenticating user root 101.47.142.76 port 34262 [preauth]
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:47:31 np0005548788.localdomain podman[85662]: 2025-12-06 08:47:31.28291819 +0000 UTC m=+0.103892044 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:47:31 np0005548788.localdomain podman[85662]: 2025-12-06 08:47:31.291236817 +0000 UTC m=+0.112210641 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, name=rhosp17/openstack-cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:47:31 np0005548788.localdomain podman[85676]: 2025-12-06 08:47:31.359973273 +0000 UTC m=+0.161657300 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, tcib_managed=true, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:47:31 np0005548788.localdomain podman[85664]: 2025-12-06 08:47:31.381424976 +0000 UTC m=+0.194846786 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4)
Dec 06 08:47:31 np0005548788.localdomain podman[85664]: 2025-12-06 08:47:31.416056267 +0000 UTC m=+0.229478127 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4)
Dec 06 08:47:31 np0005548788.localdomain podman[85663]: 2025-12-06 08:47:31.43006271 +0000 UTC m=+0.245622796 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:47:31 np0005548788.localdomain podman[85663]: 2025-12-06 08:47:31.468647122 +0000 UTC m=+0.284207228 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible)
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:47:31 np0005548788.localdomain podman[85676]: 2025-12-06 08:47:31.514356986 +0000 UTC m=+0.316041063 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, 
architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible)
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:47:31 np0005548788.localdomain podman[85670]: 2025-12-06 08:47:31.59824955 +0000 UTC m=+0.407876053 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4)
Dec 06 08:47:31 np0005548788.localdomain podman[85670]: 2025-12-06 08:47:31.638723661 +0000 UTC m=+0.448350134 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-iscsid)
Dec 06 08:47:31 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:47:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:47:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:47:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:47:35 np0005548788.localdomain podman[85821]: 2025-12-06 08:47:35.273663677 +0000 UTC m=+0.069562742 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc.)
Dec 06 08:47:35 np0005548788.localdomain podman[85820]: 2025-12-06 08:47:35.243800213 +0000 UTC m=+0.069192801 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc.)
Dec 06 08:47:35 np0005548788.localdomain podman[85819]: 2025-12-06 08:47:35.307564215 +0000 UTC m=+0.132078755 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible)
Dec 06 08:47:35 np0005548788.localdomain podman[85821]: 2025-12-06 08:47:35.324748366 +0000 UTC m=+0.120647491 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 06 08:47:35 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:47:35 np0005548788.localdomain podman[85819]: 2025-12-06 08:47:35.373784342 +0000 UTC m=+0.198298902 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:47:35 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:47:35 np0005548788.localdomain podman[85820]: 2025-12-06 08:47:35.600521703 +0000 UTC m=+0.425914381 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:47:35 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:47:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:47:39 np0005548788.localdomain podman[85888]: 2025-12-06 08:47:39.270773259 +0000 UTC m=+0.096998110 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=metrics_qdr, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:47:39 np0005548788.localdomain podman[85888]: 2025-12-06 08:47:39.480610777 +0000 UTC m=+0.306835578 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1)
Dec 06 08:47:39 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:47:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:47:41 np0005548788.localdomain podman[85918]: 2025-12-06 08:47:41.254588221 +0000 UTC m=+0.078067825 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_compute, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Dec 06 08:47:41 np0005548788.localdomain podman[85918]: 2025-12-06 08:47:41.290710298 +0000 UTC m=+0.114189862 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute)
Dec 06 08:47:41 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:47:49 np0005548788.localdomain sshd[85942]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:47:51 np0005548788.localdomain sshd[85942]: Received disconnect from 45.119.84.54 port 37578:11: Bye Bye [preauth]
Dec 06 08:47:51 np0005548788.localdomain sshd[85942]: Disconnected from authenticating user root 45.119.84.54 port 37578 [preauth]
Dec 06 08:47:54 np0005548788.localdomain sshd[85944]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:47:54 np0005548788.localdomain sudo[85946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:47:54 np0005548788.localdomain sudo[85946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:47:54 np0005548788.localdomain sudo[85946]: pam_unix(sudo:session): session closed for user root
Dec 06 08:47:54 np0005548788.localdomain sudo[85961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:47:54 np0005548788.localdomain sudo[85961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:47:55 np0005548788.localdomain sshd[85944]: Received disconnect from 148.227.3.232 port 53244:11: Bye Bye [preauth]
Dec 06 08:47:55 np0005548788.localdomain sshd[85944]: Disconnected from authenticating user root 148.227.3.232 port 53244 [preauth]
Dec 06 08:47:55 np0005548788.localdomain sudo[85961]: pam_unix(sudo:session): session closed for user root
Dec 06 08:47:56 np0005548788.localdomain sudo[86007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:47:56 np0005548788.localdomain sudo[86007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:47:56 np0005548788.localdomain sudo[86007]: pam_unix(sudo:session): session closed for user root
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:48:02 np0005548788.localdomain podman[86023]: 2025-12-06 08:48:02.279851327 +0000 UTC m=+0.097680752 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:48:02 np0005548788.localdomain podman[86023]: 2025-12-06 08:48:02.290323901 +0000 UTC m=+0.108153296 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:48:02 np0005548788.localdomain podman[86026]: 2025-12-06 08:48:02.334742563 +0000 UTC m=+0.145822319 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.expose-services=)
Dec 06 08:48:02 np0005548788.localdomain podman[86026]: 2025-12-06 08:48:02.36662174 +0000 UTC m=+0.177701576 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi)
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: tmp-crun.DyM6sm.mount: Deactivated successfully.
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:48:02 np0005548788.localdomain podman[86025]: 2025-12-06 08:48:02.432687192 +0000 UTC m=+0.249697832 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 
iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T23:44:13Z)
Dec 06 08:48:02 np0005548788.localdomain podman[86024]: 2025-12-06 08:48:02.395038458 +0000 UTC m=+0.212812561 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Dec 06 08:48:02 np0005548788.localdomain podman[86025]: 2025-12-06 08:48:02.449636297 +0000 UTC m=+0.266647007 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, architecture=x86_64, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:48:02 np0005548788.localdomain podman[86024]: 2025-12-06 08:48:02.478693965 +0000 UTC m=+0.296468158 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64)
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:48:02 np0005548788.localdomain podman[86022]: 2025-12-06 08:48:02.540455915 +0000 UTC m=+0.358318931 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, 
url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron)
Dec 06 08:48:02 np0005548788.localdomain podman[86022]: 2025-12-06 08:48:02.575598121 +0000 UTC m=+0.393461147 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 
cron, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:48:02 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:48:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:48:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:48:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:48:06 np0005548788.localdomain podman[86135]: 2025-12-06 08:48:06.247648414 +0000 UTC m=+0.070943824 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 
17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_controller, url=https://www.redhat.com, architecture=x86_64)
Dec 06 08:48:06 np0005548788.localdomain systemd[1]: tmp-crun.Oe9r64.mount: Deactivated successfully.
Dec 06 08:48:06 np0005548788.localdomain podman[86135]: 2025-12-06 08:48:06.302714787 +0000 UTC m=+0.126010157 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, architecture=x86_64, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:48:06 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:48:06 np0005548788.localdomain podman[86133]: 2025-12-06 08:48:06.274167434 +0000 UTC m=+0.100379664 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:48:06 np0005548788.localdomain podman[86134]: 2025-12-06 08:48:06.305415561 +0000 UTC m=+0.130896169 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2025-11-19T00:36:58Z, tcib_managed=true, managed_by=tripleo_ansible)
Dec 06 08:48:06 np0005548788.localdomain podman[86133]: 2025-12-06 08:48:06.409747136 +0000 UTC m=+0.235959346 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:48:06 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:48:06 np0005548788.localdomain podman[86134]: 2025-12-06 08:48:06.724811529 +0000 UTC m=+0.550292137 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:48:06 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:48:07 np0005548788.localdomain systemd[1]: tmp-crun.eKT5Zn.mount: Deactivated successfully.
Dec 06 08:48:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:48:10 np0005548788.localdomain podman[86206]: 2025-12-06 08:48:10.259921137 +0000 UTC m=+0.087668822 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Dec 06 08:48:10 np0005548788.localdomain podman[86206]: 2025-12-06 08:48:10.472624403 +0000 UTC m=+0.300372108 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:48:10 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:48:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:48:12 np0005548788.localdomain podman[86235]: 2025-12-06 08:48:12.235858584 +0000 UTC m=+0.067165128 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z)
Dec 06 08:48:12 np0005548788.localdomain podman[86235]: 2025-12-06 08:48:12.271571598 +0000 UTC m=+0.102878222 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:48:12 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:48:13 np0005548788.localdomain sshd[86261]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:48:15 np0005548788.localdomain sshd[86261]: Received disconnect from 152.32.172.117 port 36022:11: Bye Bye [preauth]
Dec 06 08:48:15 np0005548788.localdomain sshd[86261]: Disconnected from authenticating user root 152.32.172.117 port 36022 [preauth]
Dec 06 08:48:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:48:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:48:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:48:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:48:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:48:33 np0005548788.localdomain podman[86265]: 2025-12-06 08:48:33.277568437 +0000 UTC m=+0.092452240 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, container_name=ceilometer_agent_compute, config_id=tripleo_step4, managed_by=tripleo_ansible, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:48:33 np0005548788.localdomain podman[86265]: 2025-12-06 08:48:33.940812585 +0000 UTC m=+0.755696438 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, 
architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:48:33 np0005548788.localdomain podman[86263]: 2025-12-06 08:48:33.94810731 +0000 UTC m=+0.765174040 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:48:33 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:48:33 np0005548788.localdomain podman[86270]: 2025-12-06 08:48:33.995395243 +0000 UTC m=+0.801678800 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:48:34 np0005548788.localdomain podman[86270]: 2025-12-06 08:48:34.02149859 +0000 UTC m=+0.827782127 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:48:34 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:48:34 np0005548788.localdomain podman[86263]: 2025-12-06 08:48:34.034477841 +0000 UTC m=+0.851544601 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:48:34 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:48:34 np0005548788.localdomain podman[86264]: 2025-12-06 08:48:34.090769002 +0000 UTC m=+0.905805629 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:48:34 np0005548788.localdomain podman[86264]: 2025-12-06 08:48:34.124663659 +0000 UTC m=+0.939700276 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, 
name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=collectd, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:51:28Z, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:48:34 np0005548788.localdomain podman[86266]: 2025-12-06 08:48:34.13016752 +0000 UTC m=+0.938336655 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:48:34 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:48:34 np0005548788.localdomain podman[86266]: 2025-12-06 08:48:34.208676407 +0000 UTC m=+1.016845512 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:48:34 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:48:34 np0005548788.localdomain systemd[1]: tmp-crun.TtgL0x.mount: Deactivated successfully.
Dec 06 08:48:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:48:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:48:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:48:37 np0005548788.localdomain podman[86415]: 2025-12-06 08:48:37.26559058 +0000 UTC m=+0.087765806 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:48:37 np0005548788.localdomain podman[86415]: 2025-12-06 08:48:37.319121975 +0000 UTC m=+0.141297201 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=ovn_metadata_agent)
Dec 06 08:48:37 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:48:37 np0005548788.localdomain podman[86417]: 2025-12-06 08:48:37.321591411 +0000 UTC m=+0.139306308 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, container_name=ovn_controller)
Dec 06 08:48:37 np0005548788.localdomain systemd[1]: tmp-crun.DdTwX8.mount: Deactivated successfully.
Dec 06 08:48:37 np0005548788.localdomain podman[86416]: 2025-12-06 08:48:37.382250186 +0000 UTC m=+0.203144501 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 
17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1)
Dec 06 08:48:37 np0005548788.localdomain podman[86417]: 2025-12-06 08:48:37.404882536 +0000 UTC m=+0.222597443 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:48:37 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:48:37 np0005548788.localdomain podman[86416]: 2025-12-06 08:48:37.804469012 +0000 UTC m=+0.625363257 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:48:37 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:48:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:48:41 np0005548788.localdomain podman[86487]: 2025-12-06 08:48:41.262163406 +0000 UTC m=+0.091614333 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr)
Dec 06 08:48:41 np0005548788.localdomain podman[86487]: 2025-12-06 08:48:41.453910015 +0000 UTC m=+0.283360972 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 08:48:41 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:48:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:48:43 np0005548788.localdomain podman[86517]: 2025-12-06 08:48:43.26031725 +0000 UTC m=+0.084124432 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, config_id=tripleo_step5, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:48:43 np0005548788.localdomain podman[86517]: 2025-12-06 08:48:43.290557775 +0000 UTC m=+0.114364977 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, architecture=x86_64)
Dec 06 08:48:43 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:48:50 np0005548788.localdomain sshd[86543]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:48:51 np0005548788.localdomain sshd[86544]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:48:56 np0005548788.localdomain sudo[86545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:48:56 np0005548788.localdomain sudo[86545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:48:56 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:48:56 np0005548788.localdomain sudo[86545]: pam_unix(sudo:session): session closed for user root
Dec 06 08:48:56 np0005548788.localdomain recover_tripleo_nova_virtqemud[86561]: 62021
Dec 06 08:48:56 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:48:56 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:48:56 np0005548788.localdomain sudo[86562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:48:56 np0005548788.localdomain sudo[86562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:48:57 np0005548788.localdomain sudo[86562]: pam_unix(sudo:session): session closed for user root
Dec 06 08:48:57 np0005548788.localdomain sudo[86610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:48:57 np0005548788.localdomain sudo[86610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:48:57 np0005548788.localdomain sudo[86610]: pam_unix(sudo:session): session closed for user root
Dec 06 08:49:00 np0005548788.localdomain sshd[86543]: error: kex_exchange_identification: read: Connection timed out
Dec 06 08:49:00 np0005548788.localdomain sshd[86543]: banner exchange: Connection from 222.208.64.40 port 57178: Connection timed out
Dec 06 08:49:01 np0005548788.localdomain sshd[86544]: error: kex_exchange_identification: read: Connection timed out
Dec 06 08:49:01 np0005548788.localdomain sshd[86544]: banner exchange: Connection from 113.108.95.34 port 12068: Connection timed out
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: tmp-crun.F9cwdG.mount: Deactivated successfully.
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:49:04 np0005548788.localdomain podman[86627]: 2025-12-06 08:49:04.387042333 +0000 UTC m=+0.203481963 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Dec 06 08:49:04 np0005548788.localdomain podman[86625]: 2025-12-06 08:49:04.345977573 +0000 UTC m=+0.164101824 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z)
Dec 06 08:49:04 np0005548788.localdomain podman[86626]: 2025-12-06 08:49:04.299585189 +0000 UTC m=+0.116004448 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, distribution-scope=public, container_name=collectd, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:49:04 np0005548788.localdomain podman[86625]: 2025-12-06 08:49:04.425767641 +0000 UTC m=+0.243891862 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron)
Dec 06 08:49:04 np0005548788.localdomain podman[86626]: 2025-12-06 08:49:04.43771104 +0000 UTC m=+0.254130289 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:49:04 np0005548788.localdomain podman[86627]: 2025-12-06 08:49:04.445671626 +0000 UTC m=+0.262111216 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:49:04 np0005548788.localdomain podman[86674]: 2025-12-06 08:49:04.482127324 +0000 UTC m=+0.186325383 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, url=https://www.redhat.com)
Dec 06 08:49:04 np0005548788.localdomain podman[86674]: 2025-12-06 08:49:04.497612712 +0000 UTC m=+0.201810791 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, 
architecture=x86_64, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid)
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:49:04 np0005548788.localdomain podman[86628]: 2025-12-06 08:49:04.594720345 +0000 UTC m=+0.405049155 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, release=1761123044, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:49:04 np0005548788.localdomain podman[86628]: 2025-12-06 08:49:04.649807729 +0000 UTC m=+0.460136499 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044)
Dec 06 08:49:04 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:49:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:49:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:49:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:49:08 np0005548788.localdomain podman[86738]: 2025-12-06 08:49:08.266896872 +0000 UTC m=+0.090833020 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:49:08 np0005548788.localdomain systemd[1]: tmp-crun.63SATu.mount: Deactivated successfully.
Dec 06 08:49:08 np0005548788.localdomain podman[86738]: 2025-12-06 08:49:08.324950637 +0000 UTC m=+0.148886785 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent)
Dec 06 08:49:08 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:49:08 np0005548788.localdomain podman[86739]: 2025-12-06 08:49:08.331935612 +0000 UTC m=+0.151939018 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 08:49:08 np0005548788.localdomain podman[86740]: 2025-12-06 08:49:08.412999049 +0000 UTC m=+0.228097643 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, container_name=ovn_controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:49:08 np0005548788.localdomain podman[86740]: 2025-12-06 08:49:08.440580332 +0000 UTC m=+0.255678916 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, container_name=ovn_controller)
Dec 06 08:49:08 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:49:08 np0005548788.localdomain podman[86739]: 2025-12-06 08:49:08.702342535 +0000 UTC m=+0.522345941 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z)
Dec 06 08:49:08 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:49:09 np0005548788.localdomain systemd[1]: tmp-crun.W7ww5U.mount: Deactivated successfully.
Dec 06 08:49:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:49:12 np0005548788.localdomain podman[86810]: 2025-12-06 08:49:12.262314513 +0000 UTC m=+0.090007404 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, distribution-scope=public, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git)
Dec 06 08:49:12 np0005548788.localdomain podman[86810]: 2025-12-06 08:49:12.467675392 +0000 UTC m=+0.295368333 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Dec 06 08:49:12 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:49:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:49:14 np0005548788.localdomain podman[86839]: 2025-12-06 08:49:14.270508097 +0000 UTC m=+0.086664191 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Dec 06 08:49:14 np0005548788.localdomain podman[86839]: 2025-12-06 08:49:14.329255683 +0000 UTC m=+0.145411808 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step5, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 08:49:14 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:49:33 np0005548788.localdomain sshd[86884]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:49:35 np0005548788.localdomain sshd[86884]: Received disconnect from 152.32.172.117 port 49186:11: Bye Bye [preauth]
Dec 06 08:49:35 np0005548788.localdomain sshd[86884]: Disconnected from authenticating user root 152.32.172.117 port 49186 [preauth]
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: tmp-crun.sQ7NK9.mount: Deactivated successfully.
Dec 06 08:49:35 np0005548788.localdomain podman[86913]: 2025-12-06 08:49:35.211097952 +0000 UTC m=+0.156745177 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Dec 06 08:49:35 np0005548788.localdomain podman[86914]: 2025-12-06 08:49:35.21750838 +0000 UTC m=+0.155860549 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 08:49:35 np0005548788.localdomain podman[86914]: 2025-12-06 08:49:35.229562014 +0000 UTC m=+0.167914213 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Dec 06 08:49:35 np0005548788.localdomain podman[86911]: 2025-12-06 08:49:35.188582836 +0000 UTC m=+0.140645689 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, tcib_managed=true, distribution-scope=public)
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:49:35 np0005548788.localdomain podman[86913]: 2025-12-06 08:49:35.293262383 +0000 UTC m=+0.238909558 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:49:35 np0005548788.localdomain podman[86925]: 2025-12-06 08:49:35.3103005 +0000 UTC m=+0.244267394 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:49:35 np0005548788.localdomain podman[86912]: 2025-12-06 08:49:35.168758183 +0000 UTC m=+0.117418351 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, 
description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, release=1761123044, container_name=collectd, url=https://www.redhat.com, vcs-type=git)
Dec 06 08:49:35 np0005548788.localdomain podman[86925]: 2025-12-06 08:49:35.343646411 +0000 UTC m=+0.277613295 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 06 08:49:35 np0005548788.localdomain podman[86912]: 2025-12-06 08:49:35.353603669 +0000 UTC m=+0.302263827 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, distribution-scope=public, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:49:35 np0005548788.localdomain podman[86911]: 2025-12-06 08:49:35.373107792 +0000 UTC m=+0.325170685 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, 
url=https://www.redhat.com, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible)
Dec 06 08:49:35 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:49:37 np0005548788.localdomain sshd[87022]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:49:38 np0005548788.localdomain sshd[87022]: Received disconnect from 179.43.189.36 port 48024:11: Bye Bye [preauth]
Dec 06 08:49:38 np0005548788.localdomain sshd[87022]: Disconnected from authenticating user root 179.43.189.36 port 48024 [preauth]
Dec 06 08:49:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:49:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:49:38 np0005548788.localdomain podman[87024]: 2025-12-06 08:49:38.519160581 +0000 UTC m=+0.102886413 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:49:38 np0005548788.localdomain podman[87024]: 2025-12-06 08:49:38.571535101 +0000 UTC m=+0.155260973 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, vendor=Red Hat, Inc.)
Dec 06 08:49:38 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:49:38 np0005548788.localdomain podman[87042]: 2025-12-06 08:49:38.668660293 +0000 UTC m=+0.140679170 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, version=17.1.12, 
name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:49:38 np0005548788.localdomain podman[87042]: 2025-12-06 08:49:38.699014272 +0000 UTC m=+0.171033149 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:49:38 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:49:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:49:39 np0005548788.localdomain podman[87070]: 2025-12-06 08:49:39.239610427 +0000 UTC m=+0.070120819 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:49:39 np0005548788.localdomain podman[87070]: 2025-12-06 08:49:39.577540037 +0000 UTC m=+0.408050499 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, release=1761123044, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:49:39 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:49:40 np0005548788.localdomain sshd[87093]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:49:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:49:43 np0005548788.localdomain podman[87094]: 2025-12-06 08:49:43.266480981 +0000 UTC m=+0.092401738 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:49:43 np0005548788.localdomain podman[87094]: 2025-12-06 08:49:43.466580618 +0000 UTC m=+0.292501305 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z)
Dec 06 08:49:43 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:49:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:49:45 np0005548788.localdomain podman[87124]: 2025-12-06 08:49:45.270930459 +0000 UTC m=+0.096949219 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:49:45 np0005548788.localdomain podman[87124]: 2025-12-06 08:49:45.329078028 +0000 UTC m=+0.155096798 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 06 08:49:45 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:49:51 np0005548788.localdomain sshd[87093]: error: kex_exchange_identification: read: Connection timed out
Dec 06 08:49:51 np0005548788.localdomain sshd[87093]: banner exchange: Connection from 113.137.40.250 port 41158: Connection timed out
Dec 06 08:49:57 np0005548788.localdomain sudo[87151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:49:57 np0005548788.localdomain sudo[87151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:49:57 np0005548788.localdomain sudo[87151]: pam_unix(sudo:session): session closed for user root
Dec 06 08:49:58 np0005548788.localdomain sudo[87166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:49:58 np0005548788.localdomain sudo[87166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:49:58 np0005548788.localdomain sudo[87166]: pam_unix(sudo:session): session closed for user root
Dec 06 08:50:01 np0005548788.localdomain sudo[87212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:50:01 np0005548788.localdomain sudo[87212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:50:01 np0005548788.localdomain sudo[87212]: pam_unix(sudo:session): session closed for user root
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:50:06 np0005548788.localdomain podman[87227]: 2025-12-06 08:50:06.263322707 +0000 UTC m=+0.092466430 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 06 08:50:06 np0005548788.localdomain podman[87235]: 2025-12-06 08:50:06.315872761 +0000 UTC m=+0.131243078 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:50:06 np0005548788.localdomain podman[87228]: 2025-12-06 08:50:06.370219182 +0000 UTC m=+0.194301319 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 06 08:50:06 np0005548788.localdomain podman[87228]: 2025-12-06 08:50:06.378330123 +0000 UTC m=+0.202412230 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:51:28Z, 
name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:50:06 np0005548788.localdomain podman[87240]: 2025-12-06 08:50:06.285030668 +0000 UTC m=+0.095549056 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Dec 06 08:50:06 np0005548788.localdomain podman[87240]: 2025-12-06 08:50:06.469519382 +0000 UTC m=+0.280037810 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:50:06 np0005548788.localdomain podman[87227]: 2025-12-06 08:50:06.520590271 +0000 UTC m=+0.349733994 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-cron, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:50:06 np0005548788.localdomain podman[87235]: 2025-12-06 08:50:06.605726454 +0000 UTC m=+0.421096791 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, container_name=iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044)
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:50:06 np0005548788.localdomain podman[87229]: 2025-12-06 08:50:06.525305207 +0000 UTC m=+0.345910706 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:50:06 np0005548788.localdomain podman[87229]: 2025-12-06 08:50:06.658758704 +0000 UTC m=+0.479364233 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team)
Dec 06 08:50:06 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:50:07 np0005548788.localdomain sshd[87338]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:50:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:50:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:50:09 np0005548788.localdomain podman[87341]: 2025-12-06 08:50:09.270977335 +0000 UTC m=+0.095587967 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible)
Dec 06 08:50:09 np0005548788.localdomain podman[87340]: 2025-12-06 08:50:09.318348479 +0000 UTC m=+0.142961271 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:50:09 np0005548788.localdomain podman[87341]: 2025-12-06 08:50:09.327590575 +0000 UTC m=+0.152201137 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:50:09 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:50:09 np0005548788.localdomain podman[87340]: 2025-12-06 08:50:09.385911098 +0000 UTC m=+0.210523900 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 06 08:50:09 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:50:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:50:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 5302 writes, 23K keys, 5302 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5302 writes, 773 syncs, 6.86 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 786 writes, 3166 keys, 786 commit groups, 1.0 writes per commit group, ingest: 4.00 MB, 0.01 MB/s
                                                          Interval WAL: 786 writes, 263 syncs, 2.99 writes per sync, written: 0.00 GB, 0.01 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:50:09 np0005548788.localdomain sshd[87338]: Received disconnect from 36.50.177.119 port 48904:11: Bye Bye [preauth]
Dec 06 08:50:09 np0005548788.localdomain sshd[87338]: Disconnected from authenticating user root 36.50.177.119 port 48904 [preauth]
Dec 06 08:50:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:50:09 np0005548788.localdomain podman[87389]: 2025-12-06 08:50:09.805068839 +0000 UTC m=+0.087927289 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, 
managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:50:10 np0005548788.localdomain podman[87389]: 2025-12-06 08:50:10.163863273 +0000 UTC m=+0.446721733 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack 
TripleO Team, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Dec 06 08:50:10 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:50:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:50:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.2 total, 600.0 interval
                                                          Cumulative writes: 5340 writes, 23K keys, 5340 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5340 writes, 664 syncs, 8.04 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 229 writes, 821 keys, 229 commit groups, 1.0 writes per commit group, ingest: 0.91 MB, 0.00 MB/s
                                                          Interval WAL: 229 writes, 77 syncs, 2.97 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:50:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:50:14 np0005548788.localdomain podman[87410]: 2025-12-06 08:50:14.259034769 +0000 UTC m=+0.084991940 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, release=1761123044, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64)
Dec 06 08:50:14 np0005548788.localdomain podman[87410]: 2025-12-06 08:50:14.454564155 +0000 UTC m=+0.280521406 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:50:14 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:50:15 np0005548788.localdomain sshd[87439]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:50:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:50:16 np0005548788.localdomain podman[87441]: 2025-12-06 08:50:16.251052954 +0000 UTC m=+0.074502385 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:50:16 np0005548788.localdomain podman[87441]: 2025-12-06 08:50:16.281677841 +0000 UTC m=+0.105127262 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public)
Dec 06 08:50:16 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:50:17 np0005548788.localdomain sshd[87439]: Received disconnect from 45.119.84.54 port 57642:11: Bye Bye [preauth]
Dec 06 08:50:17 np0005548788.localdomain sshd[87439]: Disconnected from authenticating user root 45.119.84.54 port 57642 [preauth]
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: tmp-crun.WIqR7F.mount: Deactivated successfully.
Dec 06 08:50:37 np0005548788.localdomain podman[87513]: 2025-12-06 08:50:37.261071558 +0000 UTC m=+0.081190192 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:50:37 np0005548788.localdomain podman[87514]: 2025-12-06 08:50:37.323018383 +0000 UTC m=+0.139613068 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:50:37 np0005548788.localdomain podman[87515]: 2025-12-06 08:50:37.289124675 +0000 UTC m=+0.099783717 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, build-date=2025-11-18T23:44:13Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:50:37 np0005548788.localdomain podman[87514]: 2025-12-06 08:50:37.34946945 +0000 UTC m=+0.166064115 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, 
url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:50:37 np0005548788.localdomain podman[87513]: 2025-12-06 08:50:37.390712656 +0000 UTC m=+0.210831300 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:50:37 np0005548788.localdomain podman[87515]: 2025-12-06 08:50:37.421965772 +0000 UTC m=+0.232624884 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:50:37 np0005548788.localdomain podman[87512]: 2025-12-06 08:50:37.474429075 +0000 UTC m=+0.294948061 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-cron, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Dec 06 08:50:37 np0005548788.localdomain podman[87512]: 2025-12-06 08:50:37.486964142 +0000 UTC m=+0.307483138 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:50:37 np0005548788.localdomain podman[87526]: 2025-12-06 08:50:37.394622626 +0000 UTC m=+0.202622165 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64)
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:50:37 np0005548788.localdomain podman[87526]: 2025-12-06 08:50:37.525410791 +0000 UTC m=+0.333410370 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:50:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:50:38 np0005548788.localdomain systemd[1]: tmp-crun.XgiPtb.mount: Deactivated successfully.
Dec 06 08:50:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:50:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:50:40 np0005548788.localdomain systemd[1]: tmp-crun.W08lxa.mount: Deactivated successfully.
Dec 06 08:50:40 np0005548788.localdomain podman[87622]: 2025-12-06 08:50:40.26983074 +0000 UTC m=+0.096305419 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 06 08:50:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:50:40 np0005548788.localdomain podman[87623]: 2025-12-06 08:50:40.31995048 +0000 UTC m=+0.144051006 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Dec 06 08:50:40 np0005548788.localdomain podman[87622]: 2025-12-06 08:50:40.325558484 +0000 UTC m=+0.152033163 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z)
Dec 06 08:50:40 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:50:40 np0005548788.localdomain podman[87623]: 2025-12-06 08:50:40.368269774 +0000 UTC m=+0.192370380 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git)
Dec 06 08:50:40 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:50:40 np0005548788.localdomain podman[87653]: 2025-12-06 08:50:40.379507221 +0000 UTC m=+0.087315340 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Dec 06 08:50:40 np0005548788.localdomain podman[87653]: 2025-12-06 08:50:40.751671159 +0000 UTC m=+0.459479278 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=nova_migration_target, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:50:40 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:50:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:50:45 np0005548788.localdomain podman[87692]: 2025-12-06 08:50:45.263058735 +0000 UTC m=+0.090211440 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:50:45 np0005548788.localdomain podman[87692]: 2025-12-06 08:50:45.482714548 +0000 UTC m=+0.309867213 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 06 08:50:45 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:50:46 np0005548788.localdomain sshd[87721]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:50:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:50:47 np0005548788.localdomain podman[87723]: 2025-12-06 08:50:47.253875713 +0000 UTC m=+0.080379746 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com)
Dec 06 08:50:47 np0005548788.localdomain sshd[87721]: Received disconnect from 179.43.189.36 port 59012:11: Bye Bye [preauth]
Dec 06 08:50:47 np0005548788.localdomain sshd[87721]: Disconnected from authenticating user root 179.43.189.36 port 59012 [preauth]
Dec 06 08:50:47 np0005548788.localdomain podman[87723]: 2025-12-06 08:50:47.314791807 +0000 UTC m=+0.141295790 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, release=1761123044, url=https://www.redhat.com)
Dec 06 08:50:47 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:50:50 np0005548788.localdomain sshd[87750]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:50:54 np0005548788.localdomain sshd[87751]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:50:55 np0005548788.localdomain sshd[87751]: Received disconnect from 152.32.172.117 port 34856:11: Bye Bye [preauth]
Dec 06 08:50:55 np0005548788.localdomain sshd[87751]: Disconnected from authenticating user root 152.32.172.117 port 34856 [preauth]
Dec 06 08:51:00 np0005548788.localdomain sshd[87750]: error: kex_exchange_identification: read: Connection timed out
Dec 06 08:51:00 np0005548788.localdomain sshd[87750]: banner exchange: Connection from 14.103.127.30 port 44866: Connection timed out
Dec 06 08:51:01 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:51:01 np0005548788.localdomain recover_tripleo_nova_virtqemud[87754]: 62021
Dec 06 08:51:01 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:51:01 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:51:01 np0005548788.localdomain sudo[87755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:51:01 np0005548788.localdomain sudo[87755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:01 np0005548788.localdomain sudo[87755]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:01 np0005548788.localdomain sudo[87770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:51:01 np0005548788.localdomain sudo[87770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:02 np0005548788.localdomain sudo[87770]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:02 np0005548788.localdomain sudo[87806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:51:02 np0005548788.localdomain sudo[87806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:02 np0005548788.localdomain sudo[87806]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:02 np0005548788.localdomain sudo[87821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:51:02 np0005548788.localdomain sudo[87821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:03 np0005548788.localdomain sudo[87821]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:03 np0005548788.localdomain sshd[87867]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:51:03 np0005548788.localdomain sudo[87869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:51:03 np0005548788.localdomain sudo[87869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:03 np0005548788.localdomain sudo[87869]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:07 np0005548788.localdomain sshd[87867]: Received disconnect from 101.47.142.76 port 49798:11: Bye Bye [preauth]
Dec 06 08:51:07 np0005548788.localdomain sshd[87867]: Disconnected from authenticating user root 101.47.142.76 port 49798 [preauth]
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: tmp-crun.BQs8Zv.mount: Deactivated successfully.
Dec 06 08:51:08 np0005548788.localdomain podman[87884]: 2025-12-06 08:51:08.13433723 +0000 UTC m=+0.102857641 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Dec 06 08:51:08 np0005548788.localdomain podman[87884]: 2025-12-06 08:51:08.176623878 +0000 UTC m=+0.145144289 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:51:08 np0005548788.localdomain podman[87886]: 2025-12-06 08:51:08.192748306 +0000 UTC m=+0.159924666 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute)
Dec 06 08:51:08 np0005548788.localdomain podman[87887]: 2025-12-06 08:51:08.232562617 +0000 UTC m=+0.199659124 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, tcib_managed=true, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:51:08 np0005548788.localdomain podman[87886]: 2025-12-06 08:51:08.243550767 +0000 UTC m=+0.210727107 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:51:08 np0005548788.localdomain podman[87888]: 2025-12-06 08:51:08.281678606 +0000 UTC m=+0.240350553 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:51:08 np0005548788.localdomain podman[87887]: 2025-12-06 08:51:08.293688907 +0000 UTC m=+0.260785324 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com)
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:51:08 np0005548788.localdomain podman[87888]: 2025-12-06 08:51:08.310478046 +0000 UTC m=+0.269150023 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:51:08 np0005548788.localdomain podman[87885]: 2025-12-06 08:51:08.378407827 +0000 UTC m=+0.345136103 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container)
Dec 06 08:51:08 np0005548788.localdomain podman[87885]: 2025-12-06 08:51:08.392690179 +0000 UTC m=+0.359418495 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, batch=17.1_20251118.1, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-collectd, release=1761123044)
Dec 06 08:51:08 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:51:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:51:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:51:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:51:11 np0005548788.localdomain podman[87997]: 2025-12-06 08:51:11.268769549 +0000 UTC m=+0.085821974 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4)
Dec 06 08:51:11 np0005548788.localdomain podman[87996]: 2025-12-06 08:51:11.318057883 +0000 UTC m=+0.138432692 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git)
Dec 06 08:51:11 np0005548788.localdomain podman[87997]: 2025-12-06 08:51:11.321649884 +0000 UTC m=+0.138702249 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, release=1761123044, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:51:11 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:51:11 np0005548788.localdomain systemd[1]: tmp-crun.LuAd4X.mount: Deactivated successfully.
Dec 06 08:51:11 np0005548788.localdomain podman[87995]: 2025-12-06 08:51:11.386665484 +0000 UTC m=+0.210975124 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, version=17.1.12)
Dec 06 08:51:11 np0005548788.localdomain podman[87995]: 2025-12-06 08:51:11.459039952 +0000 UTC m=+0.283349582 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, 
container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:51:11 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:51:11 np0005548788.localdomain podman[87996]: 2025-12-06 08:51:11.672273165 +0000 UTC m=+0.492647934 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, 
com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Dec 06 08:51:11 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:51:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:51:16 np0005548788.localdomain podman[88063]: 2025-12-06 08:51:16.260679161 +0000 UTC m=+0.086641340 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Dec 06 08:51:16 np0005548788.localdomain podman[88063]: 2025-12-06 08:51:16.462569354 +0000 UTC m=+0.288531473 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 06 08:51:16 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:51:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:51:18 np0005548788.localdomain podman[88093]: 2025-12-06 08:51:18.274641034 +0000 UTC m=+0.097850607 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute)
Dec 06 08:51:18 np0005548788.localdomain podman[88093]: 2025-12-06 08:51:18.306758288 +0000 UTC m=+0.129967861 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute)
Dec 06 08:51:18 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:51:30 np0005548788.localdomain sshd[88118]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:51:31 np0005548788.localdomain sshd[88118]: Received disconnect from 36.50.177.119 port 42286:11: Bye Bye [preauth]
Dec 06 08:51:31 np0005548788.localdomain sshd[88118]: Disconnected from authenticating user root 36.50.177.119 port 42286 [preauth]
Dec 06 08:51:36 np0005548788.localdomain sshd[88163]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:51:38 np0005548788.localdomain sshd[88163]: Received disconnect from 45.119.84.54 port 50024:11: Bye Bye [preauth]
Dec 06 08:51:38 np0005548788.localdomain sshd[88163]: Disconnected from authenticating user root 45.119.84.54 port 50024 [preauth]
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:51:38 np0005548788.localdomain podman[88167]: 2025-12-06 08:51:38.345832369 +0000 UTC m=+0.091216502 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, release=1761123044, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z)
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:51:38 np0005548788.localdomain podman[88167]: 2025-12-06 08:51:38.393552484 +0000 UTC m=+0.138936577 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:51:38 np0005548788.localdomain podman[88188]: 2025-12-06 08:51:38.454260752 +0000 UTC m=+0.077694094 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, release=1761123044, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: tmp-crun.A1q8MM.mount: Deactivated successfully.
Dec 06 08:51:38 np0005548788.localdomain podman[88189]: 2025-12-06 08:51:38.468617886 +0000 UTC m=+0.084976539 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:51:38 np0005548788.localdomain podman[88188]: 2025-12-06 08:51:38.497643133 +0000 UTC m=+0.121076465 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, config_id=tripleo_step3, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1)
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:51:38 np0005548788.localdomain podman[88187]: 2025-12-06 08:51:38.512788051 +0000 UTC m=+0.137633797 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 06 08:51:38 np0005548788.localdomain podman[88187]: 2025-12-06 08:51:38.567430381 +0000 UTC m=+0.192276107 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:51:38 np0005548788.localdomain podman[88189]: 2025-12-06 08:51:38.598523152 +0000 UTC m=+0.214881785 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, architecture=x86_64, 
batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4)
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:51:38 np0005548788.localdomain podman[88218]: 2025-12-06 08:51:38.569280268 +0000 UTC m=+0.139336719 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:51:38 np0005548788.localdomain podman[88218]: 2025-12-06 08:51:38.649652133 +0000 UTC m=+0.219708604 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, container_name=collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:51:38 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:51:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:51:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:51:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:51:42 np0005548788.localdomain systemd[1]: tmp-crun.ntgyIf.mount: Deactivated successfully.
Dec 06 08:51:42 np0005548788.localdomain podman[88277]: 2025-12-06 08:51:42.281110431 +0000 UTC m=+0.104449951 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 08:51:42 np0005548788.localdomain systemd[1]: tmp-crun.l2vBgD.mount: Deactivated successfully.
Dec 06 08:51:42 np0005548788.localdomain podman[88278]: 2025-12-06 08:51:42.339418273 +0000 UTC m=+0.157742198 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:51:42 np0005548788.localdomain podman[88279]: 2025-12-06 08:51:42.377998196 +0000 UTC m=+0.193563006 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:51:42 np0005548788.localdomain podman[88277]: 2025-12-06 08:51:42.386460188 +0000 UTC m=+0.209799708 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:51:42 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:51:42 np0005548788.localdomain podman[88279]: 2025-12-06 08:51:42.438639222 +0000 UTC m=+0.254203972 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller)
Dec 06 08:51:42 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:51:42 np0005548788.localdomain podman[88278]: 2025-12-06 08:51:42.652743701 +0000 UTC m=+0.471067666 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:51:42 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:51:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:51:47 np0005548788.localdomain podman[88347]: 2025-12-06 08:51:47.269519324 +0000 UTC m=+0.088104854 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, release=1761123044, vcs-type=git, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:51:47 np0005548788.localdomain podman[88347]: 2025-12-06 08:51:47.5151591 +0000 UTC m=+0.333744610 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:51:47 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:51:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:51:49 np0005548788.localdomain podman[88376]: 2025-12-06 08:51:49.252710076 +0000 UTC m=+0.079521009 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:51:49 np0005548788.localdomain podman[88376]: 2025-12-06 08:51:49.307609474 +0000 UTC m=+0.134420367 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step5, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute)
Dec 06 08:51:49 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:51:53 np0005548788.localdomain sshd[88402]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:51:54 np0005548788.localdomain sshd[88402]: Received disconnect from 179.43.189.36 port 41070:11: Bye Bye [preauth]
Dec 06 08:51:54 np0005548788.localdomain sshd[88402]: Disconnected from authenticating user root 179.43.189.36 port 41070 [preauth]
Dec 06 08:52:04 np0005548788.localdomain sudo[88404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:52:04 np0005548788.localdomain sudo[88404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:52:04 np0005548788.localdomain sudo[88404]: pam_unix(sudo:session): session closed for user root
Dec 06 08:52:04 np0005548788.localdomain sudo[88419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:52:04 np0005548788.localdomain sudo[88419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:52:04 np0005548788.localdomain sudo[88419]: pam_unix(sudo:session): session closed for user root
Dec 06 08:52:05 np0005548788.localdomain sudo[88468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:52:05 np0005548788.localdomain sudo[88468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:52:05 np0005548788.localdomain sudo[88468]: pam_unix(sudo:session): session closed for user root
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:52:09 np0005548788.localdomain podman[88483]: 2025-12-06 08:52:09.276238765 +0000 UTC m=+0.095560555 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:52:09 np0005548788.localdomain podman[88483]: 2025-12-06 08:52:09.287000458 +0000 UTC m=+0.106322268 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z)
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:52:09 np0005548788.localdomain podman[88485]: 2025-12-06 08:52:09.330343748 +0000 UTC m=+0.144033884 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:11:48Z)
Dec 06 08:52:09 np0005548788.localdomain podman[88497]: 2025-12-06 08:52:09.332301509 +0000 UTC m=+0.136118661 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 
17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Dec 06 08:52:09 np0005548788.localdomain podman[88485]: 2025-12-06 08:52:09.383706278 +0000 UTC m=+0.197396384 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1)
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:52:09 np0005548788.localdomain podman[88487]: 2025-12-06 08:52:09.433428726 +0000 UTC m=+0.243049227 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, 
version=17.1.12, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, container_name=iscsid)
Dec 06 08:52:09 np0005548788.localdomain podman[88484]: 2025-12-06 08:52:09.387599049 +0000 UTC m=+0.205886298 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:52:09 np0005548788.localdomain podman[88497]: 2025-12-06 08:52:09.466581911 +0000 UTC m=+0.270398993 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, 
release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1)
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:52:09 np0005548788.localdomain podman[88484]: 2025-12-06 08:52:09.517324 +0000 UTC m=+0.335611299 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Dec 06 08:52:09 np0005548788.localdomain podman[88487]: 2025-12-06 08:52:09.517680341 +0000 UTC m=+0.327300872 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, container_name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 
iscsid, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:52:09 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:52:10 np0005548788.localdomain systemd[1]: tmp-crun.kvweTE.mount: Deactivated successfully.
Dec 06 08:52:12 np0005548788.localdomain sshd[88594]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:52:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:52:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:52:13 np0005548788.localdomain systemd[1]: tmp-crun.b5eYbB.mount: Deactivated successfully.
Dec 06 08:52:13 np0005548788.localdomain podman[88596]: 2025-12-06 08:52:13.265477285 +0000 UTC m=+0.090364614 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:52:13 np0005548788.localdomain podman[88598]: 2025-12-06 08:52:13.325312365 +0000 UTC m=+0.142598260 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 08:52:13 np0005548788.localdomain podman[88597]: 2025-12-06 08:52:13.362517176 +0000 UTC m=+0.182202584 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:52:13 np0005548788.localdomain podman[88598]: 2025-12-06 08:52:13.379709628 +0000 UTC m=+0.196995563 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, 
name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:52:13 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:52:13 np0005548788.localdomain podman[88596]: 2025-12-06 08:52:13.395488065 +0000 UTC m=+0.220375374 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vcs-type=git, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Dec 06 08:52:13 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:52:13 np0005548788.localdomain podman[88597]: 2025-12-06 08:52:13.762695439 +0000 UTC m=+0.582380907 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:52:13 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:52:14 np0005548788.localdomain systemd[1]: tmp-crun.NJvDry.mount: Deactivated successfully.
Dec 06 08:52:14 np0005548788.localdomain sshd[88594]: Received disconnect from 152.32.172.117 port 39226:11: Bye Bye [preauth]
Dec 06 08:52:14 np0005548788.localdomain sshd[88594]: Disconnected from authenticating user root 152.32.172.117 port 39226 [preauth]
Dec 06 08:52:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:52:18 np0005548788.localdomain podman[88670]: 2025-12-06 08:52:18.246347598 +0000 UTC m=+0.073638588 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Dec 06 08:52:18 np0005548788.localdomain podman[88670]: 2025-12-06 08:52:18.437385825 +0000 UTC m=+0.264676845 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, 
managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:52:18 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:52:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:52:20 np0005548788.localdomain podman[88701]: 2025-12-06 08:52:20.272532739 +0000 UTC m=+0.099076695 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:52:20 np0005548788.localdomain podman[88701]: 2025-12-06 08:52:20.331358288 +0000 UTC m=+0.157902274 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:52:20 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: tmp-crun.S0vx2r.mount: Deactivated successfully.
Dec 06 08:52:40 np0005548788.localdomain podman[88750]: 2025-12-06 08:52:40.283827598 +0000 UTC m=+0.107006470 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=)
Dec 06 08:52:40 np0005548788.localdomain podman[88752]: 2025-12-06 08:52:40.314387522 +0000 UTC m=+0.132744625 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute)
Dec 06 08:52:40 np0005548788.localdomain podman[88750]: 2025-12-06 08:52:40.318422008 +0000 UTC m=+0.141600870 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:52:40 np0005548788.localdomain podman[88752]: 2025-12-06 08:52:40.373392708 +0000 UTC m=+0.191749771 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, release=1761123044, architecture=x86_64)
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:52:40 np0005548788.localdomain podman[88759]: 2025-12-06 08:52:40.375510422 +0000 UTC m=+0.187953952 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, 
config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, vcs-type=git, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 06 08:52:40 np0005548788.localdomain podman[88751]: 2025-12-06 08:52:40.432181575 +0000 UTC m=+0.251672873 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd)
Dec 06 08:52:40 np0005548788.localdomain podman[88751]: 2025-12-06 08:52:40.443519296 +0000 UTC m=+0.263010614 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, release=1761123044, name=rhosp17/openstack-collectd, 
tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:52:40 np0005548788.localdomain podman[88759]: 2025-12-06 08:52:40.459675445 +0000 UTC m=+0.272118955 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:52:40 np0005548788.localdomain podman[88753]: 2025-12-06 08:52:40.534165148 +0000 UTC m=+0.348125995 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, 
Inc., build-date=2025-11-18T23:44:13Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container)
Dec 06 08:52:40 np0005548788.localdomain podman[88753]: 2025-12-06 08:52:40.547753829 +0000 UTC m=+0.361714606 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public)
Dec 06 08:52:40 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:52:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:52:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:52:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:52:44 np0005548788.localdomain podman[88860]: 2025-12-06 08:52:44.264390829 +0000 UTC m=+0.092678117 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com)
Dec 06 08:52:44 np0005548788.localdomain systemd[1]: tmp-crun.1wl10V.mount: Deactivated successfully.
Dec 06 08:52:44 np0005548788.localdomain podman[88861]: 2025-12-06 08:52:44.319960978 +0000 UTC m=+0.141096874 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:52:44 np0005548788.localdomain podman[88867]: 2025-12-06 08:52:44.360685548 +0000 UTC m=+0.176647464 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Dec 06 08:52:44 np0005548788.localdomain podman[88867]: 2025-12-06 08:52:44.38856827 +0000 UTC m=+0.204530156 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, container_name=ovn_controller)
Dec 06 08:52:44 np0005548788.localdomain podman[88860]: 2025-12-06 08:52:44.391004495 +0000 UTC m=+0.219291852 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:52:44 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:52:44 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:52:44 np0005548788.localdomain podman[88861]: 2025-12-06 08:52:44.691534578 +0000 UTC m=+0.512670464 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:52:44 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:52:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:52:49 np0005548788.localdomain podman[88929]: 2025-12-06 08:52:49.262479855 +0000 UTC m=+0.087627082 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Dec 06 08:52:49 np0005548788.localdomain podman[88929]: 2025-12-06 08:52:49.490895187 +0000 UTC m=+0.316042374 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:52:49 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:52:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:52:51 np0005548788.localdomain podman[88959]: 2025-12-06 08:52:51.248047799 +0000 UTC m=+0.079521369 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:52:51 np0005548788.localdomain podman[88959]: 2025-12-06 08:52:51.30172234 +0000 UTC m=+0.133195840 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute)
Dec 06 08:52:51 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:52:53 np0005548788.localdomain sshd[88985]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:54 np0005548788.localdomain sshd[88985]: Received disconnect from 36.50.177.119 port 33094:11: Bye Bye [preauth]
Dec 06 08:52:54 np0005548788.localdomain sshd[88985]: Disconnected from authenticating user root 36.50.177.119 port 33094 [preauth]
Dec 06 08:52:54 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:52:54 np0005548788.localdomain recover_tripleo_nova_virtqemud[88988]: 62021
Dec 06 08:52:54 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:52:54 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:52:58 np0005548788.localdomain sshd[88989]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:59 np0005548788.localdomain sshd[88989]: Received disconnect from 179.43.189.36 port 54642:11: Bye Bye [preauth]
Dec 06 08:52:59 np0005548788.localdomain sshd[88989]: Disconnected from authenticating user root 179.43.189.36 port 54642 [preauth]
Dec 06 08:53:00 np0005548788.localdomain sshd[88991]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:53:01 np0005548788.localdomain sshd[88991]: Received disconnect from 45.119.84.54 port 45568:11: Bye Bye [preauth]
Dec 06 08:53:01 np0005548788.localdomain sshd[88991]: Disconnected from authenticating user root 45.119.84.54 port 45568 [preauth]
Dec 06 08:53:05 np0005548788.localdomain sudo[88993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:53:05 np0005548788.localdomain sudo[88993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:05 np0005548788.localdomain sudo[88993]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:05 np0005548788.localdomain sudo[89008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:53:05 np0005548788.localdomain sudo[89008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:06 np0005548788.localdomain systemd[1]: tmp-crun.b6iOAo.mount: Deactivated successfully.
Dec 06 08:53:06 np0005548788.localdomain podman[89096]: 2025-12-06 08:53:06.563980168 +0000 UTC m=+0.106881585 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.buildah.version=1.41.4, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True)
Dec 06 08:53:06 np0005548788.localdomain podman[89096]: 2025-12-06 08:53:06.688493468 +0000 UTC m=+0.231394915 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Dec 06 08:53:06 np0005548788.localdomain sudo[89008]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:07 np0005548788.localdomain sudo[89162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:53:07 np0005548788.localdomain sudo[89162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:07 np0005548788.localdomain sudo[89162]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:07 np0005548788.localdomain sudo[89177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:53:07 np0005548788.localdomain sudo[89177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:07 np0005548788.localdomain sudo[89177]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:08 np0005548788.localdomain sudo[89223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:53:08 np0005548788.localdomain sudo[89223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:08 np0005548788.localdomain sudo[89223]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:53:11 np0005548788.localdomain podman[89239]: 2025-12-06 08:53:11.282025893 +0000 UTC m=+0.099940291 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, container_name=collectd, release=1761123044, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:51:28Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Dec 06 08:53:11 np0005548788.localdomain podman[89239]: 2025-12-06 08:53:11.29193982 +0000 UTC m=+0.109854218 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, container_name=collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, 
url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: tmp-crun.By7HLI.mount: Deactivated successfully.
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:53:11 np0005548788.localdomain podman[89251]: 2025-12-06 08:53:11.308535143 +0000 UTC m=+0.108846437 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git)
Dec 06 08:53:11 np0005548788.localdomain podman[89251]: 2025-12-06 08:53:11.338593202 +0000 UTC m=+0.138904506 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:53:11 np0005548788.localdomain podman[89238]: 2025-12-06 08:53:11.387229356 +0000 UTC m=+0.204722941 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, container_name=logrotate_crond, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:53:11 np0005548788.localdomain podman[89240]: 2025-12-06 08:53:11.354593147 +0000 UTC m=+0.167865001 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., 
config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, version=17.1.12, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:53:11 np0005548788.localdomain podman[89238]: 2025-12-06 08:53:11.421748924 +0000 UTC m=+0.239242509 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, 
managed_by=tripleo_ansible, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64)
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:53:11 np0005548788.localdomain podman[89240]: 2025-12-06 08:53:11.438594135 +0000 UTC m=+0.251865989 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git)
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:53:11 np0005548788.localdomain podman[89241]: 2025-12-06 08:53:11.494175604 +0000 UTC m=+0.297911024 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_id=tripleo_step3, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:53:11 np0005548788.localdomain podman[89241]: 2025-12-06 08:53:11.532942932 +0000 UTC m=+0.336678402 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vendor=Red Hat, Inc.)
Dec 06 08:53:11 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:53:13 np0005548788.localdomain sshd[89350]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:53:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:53:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:53:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:53:15 np0005548788.localdomain podman[89353]: 2025-12-06 08:53:15.266475735 +0000 UTC m=+0.088429775 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:36:58Z, architecture=x86_64)
Dec 06 08:53:15 np0005548788.localdomain podman[89354]: 2025-12-06 08:53:15.316731999 +0000 UTC m=+0.135455199 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller)
Dec 06 08:53:15 np0005548788.localdomain podman[89354]: 2025-12-06 08:53:15.371723289 +0000 UTC m=+0.190446469 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible)
Dec 06 08:53:15 np0005548788.localdomain podman[89352]: 2025-12-06 08:53:15.38404007 +0000 UTC m=+0.209007524 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 08:53:15 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:53:15 np0005548788.localdomain podman[89352]: 2025-12-06 08:53:15.428893767 +0000 UTC m=+0.253861231 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, tcib_managed=true)
Dec 06 08:53:15 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:53:15 np0005548788.localdomain podman[89353]: 2025-12-06 08:53:15.676846934 +0000 UTC m=+0.498801014 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Dec 06 08:53:15 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:53:16 np0005548788.localdomain systemd[1]: tmp-crun.JsVkw4.mount: Deactivated successfully.
Dec 06 08:53:18 np0005548788.localdomain sshd[89350]: Received disconnect from 45.78.219.195 port 46290:11: Bye Bye [preauth]
Dec 06 08:53:18 np0005548788.localdomain sshd[89350]: Disconnected from authenticating user root 45.78.219.195 port 46290 [preauth]
Dec 06 08:53:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:53:20 np0005548788.localdomain podman[89423]: 2025-12-06 08:53:20.274118514 +0000 UTC m=+0.096177165 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1)
Dec 06 08:53:20 np0005548788.localdomain podman[89423]: 2025-12-06 08:53:20.4754534 +0000 UTC m=+0.297512001 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, name=rhosp17/openstack-qdrouterd, release=1761123044, config_id=tripleo_step1)
Dec 06 08:53:20 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:53:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:53:22 np0005548788.localdomain systemd[1]: tmp-crun.nQCBgk.mount: Deactivated successfully.
Dec 06 08:53:22 np0005548788.localdomain podman[89452]: 2025-12-06 08:53:22.221097356 +0000 UTC m=+0.055021872 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.buildah.version=1.41.4)
Dec 06 08:53:22 np0005548788.localdomain podman[89452]: 2025-12-06 08:53:22.246538293 +0000 UTC m=+0.080462849 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044, version=17.1.12, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:53:22 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:53:41 np0005548788.localdomain sshd[89502]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:53:42 np0005548788.localdomain podman[89506]: 2025-12-06 08:53:42.282016275 +0000 UTC m=+0.097958230 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:53:42 np0005548788.localdomain podman[89505]: 2025-12-06 08:53:42.327924734 +0000 UTC m=+0.145082167 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true)
Dec 06 08:53:42 np0005548788.localdomain podman[89505]: 2025-12-06 08:53:42.337759079 +0000 UTC m=+0.154916512 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, vcs-type=git, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=collectd)
Dec 06 08:53:42 np0005548788.localdomain podman[89504]: 2025-12-06 08:53:42.36140822 +0000 UTC m=+0.179406668 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, container_name=logrotate_crond, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, name=rhosp17/openstack-cron, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12)
Dec 06 08:53:42 np0005548788.localdomain podman[89504]: 2025-12-06 08:53:42.370532172 +0000 UTC m=+0.188530630 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, container_name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, io.buildah.version=1.41.4)
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:53:42 np0005548788.localdomain podman[89506]: 2025-12-06 08:53:42.388796687 +0000 UTC m=+0.204738572 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:53:42 np0005548788.localdomain podman[89507]: 2025-12-06 08:53:42.531685445 +0000 UTC m=+0.340757748 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:53:42 np0005548788.localdomain podman[89507]: 2025-12-06 08:53:42.538891977 +0000 UTC m=+0.347964230 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, architecture=x86_64, version=17.1.12, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z)
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:53:42 np0005548788.localdomain podman[89508]: 2025-12-06 08:53:42.637650282 +0000 UTC m=+0.443385582 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:53:42 np0005548788.localdomain podman[89508]: 2025-12-06 08:53:42.691689192 +0000 UTC m=+0.497424492 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:53:42 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:53:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:53:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:53:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:53:46 np0005548788.localdomain podman[89617]: 2025-12-06 08:53:46.255037174 +0000 UTC m=+0.079081416 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, version=17.1.12)
Dec 06 08:53:46 np0005548788.localdomain systemd[1]: tmp-crun.ak644n.mount: Deactivated successfully.
Dec 06 08:53:46 np0005548788.localdomain podman[89616]: 2025-12-06 08:53:46.325149692 +0000 UTC m=+0.148888945 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:53:46 np0005548788.localdomain podman[89618]: 2025-12-06 08:53:46.372804696 +0000 UTC m=+0.192014879 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, 
tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, container_name=ovn_controller)
Dec 06 08:53:46 np0005548788.localdomain podman[89618]: 2025-12-06 08:53:46.398554892 +0000 UTC m=+0.217765055 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible)
Dec 06 08:53:46 np0005548788.localdomain podman[89616]: 2025-12-06 08:53:46.401700399 +0000 UTC m=+0.225439612 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, version=17.1.12, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git)
Dec 06 08:53:46 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:53:46 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:53:46 np0005548788.localdomain podman[89617]: 2025-12-06 08:53:46.640531354 +0000 UTC m=+0.464575526 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:53:46 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:53:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:53:51 np0005548788.localdomain systemd[1]: tmp-crun.9uaowk.mount: Deactivated successfully.
Dec 06 08:53:51 np0005548788.localdomain podman[89685]: 2025-12-06 08:53:51.27258813 +0000 UTC m=+0.096876637 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64)
Dec 06 08:53:51 np0005548788.localdomain podman[89685]: 2025-12-06 08:53:51.456678542 +0000 UTC m=+0.280967079 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:53:51 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:53:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:53:53 np0005548788.localdomain podman[89713]: 2025-12-06 08:53:53.256416061 +0000 UTC m=+0.079639823 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:53:53 np0005548788.localdomain podman[89713]: 2025-12-06 08:53:53.283888781 +0000 UTC m=+0.107112573 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5)
Dec 06 08:53:53 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:54:04 np0005548788.localdomain sshd[89739]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:54:04 np0005548788.localdomain sshd[89739]: Received disconnect from 179.43.189.36 port 57024:11: Bye Bye [preauth]
Dec 06 08:54:04 np0005548788.localdomain sshd[89739]: Disconnected from authenticating user root 179.43.189.36 port 57024 [preauth]
Dec 06 08:54:08 np0005548788.localdomain sudo[89741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:54:08 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:54:08 np0005548788.localdomain sudo[89741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:54:08 np0005548788.localdomain sudo[89741]: pam_unix(sudo:session): session closed for user root
Dec 06 08:54:08 np0005548788.localdomain recover_tripleo_nova_virtqemud[89756]: 62021
Dec 06 08:54:08 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:54:08 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:54:08 np0005548788.localdomain sudo[89758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:54:08 np0005548788.localdomain sudo[89758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:54:09 np0005548788.localdomain sudo[89758]: pam_unix(sudo:session): session closed for user root
Dec 06 08:54:09 np0005548788.localdomain sudo[89805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:54:09 np0005548788.localdomain sudo[89805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:54:09 np0005548788.localdomain sudo[89805]: pam_unix(sudo:session): session closed for user root
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:54:13 np0005548788.localdomain podman[89821]: 2025-12-06 08:54:13.273219235 +0000 UTC m=+0.095219716 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., container_name=collectd, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: tmp-crun.j486z0.mount: Deactivated successfully.
Dec 06 08:54:13 np0005548788.localdomain podman[89821]: 2025-12-06 08:54:13.321757755 +0000 UTC m=+0.143758186 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, tcib_managed=true, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:54:13 np0005548788.localdomain podman[89823]: 2025-12-06 08:54:13.323900832 +0000 UTC m=+0.140130034 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1)
Dec 06 08:54:13 np0005548788.localdomain podman[89822]: 2025-12-06 08:54:13.383486574 +0000 UTC m=+0.203644578 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:54:13 np0005548788.localdomain podman[89820]: 2025-12-06 08:54:13.297281518 +0000 UTC m=+0.113286403 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-type=git, name=rhosp17/openstack-cron, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:54:13 np0005548788.localdomain podman[89820]: 2025-12-06 08:54:13.431605832 +0000 UTC m=+0.247610727 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, 
com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64)
Dec 06 08:54:13 np0005548788.localdomain podman[89832]: 2025-12-06 08:54:13.440840147 +0000 UTC m=+0.249762784 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:54:13 np0005548788.localdomain podman[89823]: 2025-12-06 08:54:13.455738408 +0000 UTC m=+0.271967620 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:44:13Z, container_name=iscsid)
Dec 06 08:54:13 np0005548788.localdomain podman[89822]: 2025-12-06 08:54:13.469496594 +0000 UTC m=+0.289654678 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:54:13 np0005548788.localdomain podman[89832]: 2025-12-06 08:54:13.477664545 +0000 UTC m=+0.286587202 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4)
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:54:13 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:54:15 np0005548788.localdomain sshd[89934]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:54:17 np0005548788.localdomain sshd[89934]: Received disconnect from 36.50.177.119 port 57306:11: Bye Bye [preauth]
Dec 06 08:54:17 np0005548788.localdomain sshd[89934]: Disconnected from authenticating user root 36.50.177.119 port 57306 [preauth]
Dec 06 08:54:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:54:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:54:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:54:17 np0005548788.localdomain podman[89937]: 2025-12-06 08:54:17.266651213 +0000 UTC m=+0.087590710 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12)
Dec 06 08:54:17 np0005548788.localdomain systemd[1]: tmp-crun.VFdW2Q.mount: Deactivated successfully.
Dec 06 08:54:17 np0005548788.localdomain podman[89938]: 2025-12-06 08:54:17.332682855 +0000 UTC m=+0.150296879 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:54:17 np0005548788.localdomain podman[89938]: 2025-12-06 08:54:17.356240863 +0000 UTC m=+0.173854827 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller)
Dec 06 08:54:17 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:54:17 np0005548788.localdomain podman[89936]: 2025-12-06 08:54:17.420556031 +0000 UTC m=+0.244129119 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git)
Dec 06 08:54:17 np0005548788.localdomain podman[89936]: 2025-12-06 08:54:17.465527992 +0000 UTC m=+0.289101060 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044)
Dec 06 08:54:17 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:54:17 np0005548788.localdomain podman[89937]: 2025-12-06 08:54:17.626701916 +0000 UTC m=+0.447641453 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:54:17 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:54:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:54:22 np0005548788.localdomain podman[90008]: 2025-12-06 08:54:22.266068458 +0000 UTC m=+0.092260334 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:54:22 np0005548788.localdomain podman[90008]: 2025-12-06 08:54:22.488720672 +0000 UTC m=+0.314912588 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, version=17.1.12, maintainer=OpenStack 
TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public)
Dec 06 08:54:22 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:54:23 np0005548788.localdomain sshd[90037]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:54:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:54:24 np0005548788.localdomain podman[90039]: 2025-12-06 08:54:24.246969659 +0000 UTC m=+0.070927715 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:54:24 np0005548788.localdomain podman[90039]: 2025-12-06 08:54:24.30261613 +0000 UTC m=+0.126574186 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:54:24 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:54:24 np0005548788.localdomain sshd[90037]: Received disconnect from 45.119.84.54 port 47878:11: Bye Bye [preauth]
Dec 06 08:54:24 np0005548788.localdomain sshd[90037]: Disconnected from authenticating user root 45.119.84.54 port 47878 [preauth]
Dec 06 08:54:36 np0005548788.localdomain sshd[90065]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:54:37 np0005548788.localdomain sshd[90065]: Received disconnect from 148.227.3.232 port 58358:11: Bye Bye [preauth]
Dec 06 08:54:37 np0005548788.localdomain sshd[90065]: Disconnected from authenticating user root 148.227.3.232 port 58358 [preauth]
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:54:44 np0005548788.localdomain podman[90091]: 2025-12-06 08:54:44.269729226 +0000 UTC m=+0.087127451 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, tcib_managed=true, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd)
Dec 06 08:54:44 np0005548788.localdomain podman[90099]: 2025-12-06 08:54:44.283904074 +0000 UTC m=+0.089109273 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z)
Dec 06 08:54:44 np0005548788.localdomain podman[90091]: 2025-12-06 08:54:44.305681037 +0000 UTC m=+0.123079262 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, name=rhosp17/openstack-collectd, distribution-scope=public, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, container_name=collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:54:44 np0005548788.localdomain podman[90090]: 2025-12-06 08:54:44.327665156 +0000 UTC m=+0.147936910 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64)
Dec 06 08:54:44 np0005548788.localdomain podman[90090]: 2025-12-06 08:54:44.337671905 +0000 UTC m=+0.157943689 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:54:44 np0005548788.localdomain podman[90093]: 2025-12-06 08:54:44.256432725 +0000 UTC m=+0.072627263 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:54:44 np0005548788.localdomain podman[90099]: 2025-12-06 08:54:44.389032882 +0000 UTC m=+0.194238091 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, architecture=x86_64, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:54:44 np0005548788.localdomain podman[90093]: 2025-12-06 08:54:44.392630292 +0000 UTC m=+0.208824810 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1761123044, version=17.1.12, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true, 
container_name=iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:54:44 np0005548788.localdomain podman[90092]: 2025-12-06 08:54:44.52883367 +0000 UTC m=+0.343130950 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc.)
Dec 06 08:54:44 np0005548788.localdomain podman[90092]: 2025-12-06 08:54:44.558733952 +0000 UTC m=+0.373031232 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:54:44 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:54:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:54:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:54:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:54:48 np0005548788.localdomain podman[90201]: 2025-12-06 08:54:48.261502995 +0000 UTC m=+0.090587208 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:54:48 np0005548788.localdomain podman[90201]: 2025-12-06 08:54:48.319318231 +0000 UTC m=+0.148402394 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 06 08:54:48 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:54:48 np0005548788.localdomain podman[90202]: 2025-12-06 08:54:48.362341869 +0000 UTC m=+0.188636566 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:54:48 np0005548788.localdomain podman[90203]: 2025-12-06 08:54:48.322262842 +0000 UTC m=+0.145560727 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, container_name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:54:48 np0005548788.localdomain podman[90203]: 2025-12-06 08:54:48.420788285 +0000 UTC m=+0.244086170 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, 
vendor=Red Hat, Inc., container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1)
Dec 06 08:54:48 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:54:48 np0005548788.localdomain podman[90202]: 2025-12-06 08:54:48.713259428 +0000 UTC m=+0.539554135 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:54:48 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:54:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:54:53 np0005548788.localdomain podman[90269]: 2025-12-06 08:54:53.255081606 +0000 UTC m=+0.085497003 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:54:53 np0005548788.localdomain podman[90269]: 2025-12-06 08:54:53.453648149 +0000 UTC m=+0.284063546 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:54:53 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:54:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:54:55 np0005548788.localdomain podman[90298]: 2025-12-06 08:54:55.259704469 +0000 UTC m=+0.086723559 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Dec 06 08:54:55 np0005548788.localdomain podman[90298]: 2025-12-06 08:54:55.29467416 +0000 UTC m=+0.121693220 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Dec 06 08:54:55 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:55:09 np0005548788.localdomain sshd[90325]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:55:09 np0005548788.localdomain sudo[90327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:55:09 np0005548788.localdomain sudo[90327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:55:09 np0005548788.localdomain sudo[90327]: pam_unix(sudo:session): session closed for user root
Dec 06 08:55:10 np0005548788.localdomain sudo[90342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:55:10 np0005548788.localdomain sudo[90342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:55:10 np0005548788.localdomain sshd[90325]: Received disconnect from 179.43.189.36 port 58478:11: Bye Bye [preauth]
Dec 06 08:55:10 np0005548788.localdomain sshd[90325]: Disconnected from authenticating user root 179.43.189.36 port 58478 [preauth]
Dec 06 08:55:10 np0005548788.localdomain sudo[90342]: pam_unix(sudo:session): session closed for user root
Dec 06 08:55:11 np0005548788.localdomain sudo[90388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:55:11 np0005548788.localdomain sudo[90388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:55:11 np0005548788.localdomain sudo[90388]: pam_unix(sudo:session): session closed for user root
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: tmp-crun.uWR5Oo.mount: Deactivated successfully.
Dec 06 08:55:15 np0005548788.localdomain podman[90403]: 2025-12-06 08:55:15.27932106 +0000 UTC m=+0.100974981 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T22:49:32Z, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:55:15 np0005548788.localdomain podman[90404]: 2025-12-06 08:55:15.327122696 +0000 UTC m=+0.147187437 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack 
TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, architecture=x86_64, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible)
Dec 06 08:55:15 np0005548788.localdomain podman[90405]: 2025-12-06 08:55:15.382890638 +0000 UTC m=+0.199587106 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z)
Dec 06 08:55:15 np0005548788.localdomain podman[90405]: 2025-12-06 08:55:15.414586077 +0000 UTC m=+0.231282525 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:55:15 np0005548788.localdomain podman[90406]: 2025-12-06 08:55:15.425878776 +0000 UTC m=+0.239173199 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4)
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:55:15 np0005548788.localdomain podman[90406]: 2025-12-06 08:55:15.435567075 +0000 UTC m=+0.248861528 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:55:15 np0005548788.localdomain podman[90412]: 2025-12-06 08:55:15.479478471 +0000 UTC m=+0.289174882 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4)
Dec 06 08:55:15 np0005548788.localdomain podman[90403]: 2025-12-06 08:55:15.495796495 +0000 UTC m=+0.317450386 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, name=rhosp17/openstack-cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=logrotate_crond)
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:55:15 np0005548788.localdomain podman[90412]: 2025-12-06 08:55:15.533880641 +0000 UTC m=+0.343577002 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:12:45Z, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git)
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:55:15 np0005548788.localdomain podman[90404]: 2025-12-06 08:55:15.546640196 +0000 UTC m=+0.366704957 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, version=17.1.12, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 06 08:55:15 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:55:16 np0005548788.localdomain systemd[1]: tmp-crun.4qGBfN.mount: Deactivated successfully.
Dec 06 08:55:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:55:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:55:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:55:19 np0005548788.localdomain podman[90514]: 2025-12-06 08:55:19.268344483 +0000 UTC m=+0.091624160 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true)
Dec 06 08:55:19 np0005548788.localdomain podman[90513]: 2025-12-06 08:55:19.315870141 +0000 UTC m=+0.139136238 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64)
Dec 06 08:55:19 np0005548788.localdomain systemd[1]: tmp-crun.AGTGnV.mount: Deactivated successfully.
Dec 06 08:55:19 np0005548788.localdomain podman[90513]: 2025-12-06 08:55:19.371587722 +0000 UTC m=+0.194853809 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, vcs-type=git, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team)
Dec 06 08:55:19 np0005548788.localdomain podman[90515]: 2025-12-06 08:55:19.386161413 +0000 UTC m=+0.207357887 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller)
Dec 06 08:55:19 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:55:19 np0005548788.localdomain podman[90515]: 2025-12-06 08:55:19.413599549 +0000 UTC m=+0.234795973 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, version=17.1.12, config_id=tripleo_step4)
Dec 06 08:55:19 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:55:19 np0005548788.localdomain podman[90514]: 2025-12-06 08:55:19.659707111 +0000 UTC m=+0.482986738 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, 
io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute)
Dec 06 08:55:19 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:55:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:55:24 np0005548788.localdomain podman[90579]: 2025-12-06 08:55:24.263249014 +0000 UTC m=+0.090072193 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, config_id=tripleo_step1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 06 08:55:24 np0005548788.localdomain podman[90579]: 2025-12-06 08:55:24.469879486 +0000 UTC m=+0.296702605 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, container_name=metrics_qdr, vcs-type=git, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:55:24 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:55:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:55:26 np0005548788.localdomain systemd[1]: tmp-crun.DvLNj6.mount: Deactivated successfully.
Dec 06 08:55:26 np0005548788.localdomain podman[90609]: 2025-12-06 08:55:26.268411465 +0000 UTC m=+0.093330133 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:55:26 np0005548788.localdomain podman[90609]: 2025-12-06 08:55:26.299060222 +0000 UTC m=+0.123978860 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:55:26 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:55:36 np0005548788.localdomain sshd[90635]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:55:38 np0005548788.localdomain sshd[90635]: Received disconnect from 36.50.177.119 port 36286:11: Bye Bye [preauth]
Dec 06 08:55:38 np0005548788.localdomain sshd[90635]: Disconnected from authenticating user root 36.50.177.119 port 36286 [preauth]
Dec 06 08:55:41 np0005548788.localdomain sshd[89502]: fatal: Timeout before authentication for 101.47.142.76 port 56466
Dec 06 08:55:44 np0005548788.localdomain sshd[90660]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:55:46 np0005548788.localdomain podman[90664]: 2025-12-06 08:55:46.270178896 +0000 UTC m=+0.090708273 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:55:46 np0005548788.localdomain podman[90663]: 2025-12-06 08:55:46.322543343 +0000 UTC m=+0.143072020 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, release=1761123044, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:55:46 np0005548788.localdomain podman[90664]: 2025-12-06 08:55:46.328768065 +0000 UTC m=+0.149297442 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Dec 06 08:55:46 np0005548788.localdomain podman[90663]: 2025-12-06 08:55:46.33766355 +0000 UTC m=+0.158192217 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:55:46 np0005548788.localdomain podman[90665]: 2025-12-06 08:55:46.387117468 +0000 UTC m=+0.200054981 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Dec 06 08:55:46 np0005548788.localdomain podman[90665]: 2025-12-06 08:55:46.426597037 +0000 UTC m=+0.239534500 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4)
Dec 06 08:55:46 np0005548788.localdomain podman[90662]: 2025-12-06 08:55:46.435080699 +0000 UTC m=+0.257721891 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:55:46 np0005548788.localdomain sshd[90660]: Received disconnect from 45.119.84.54 port 38134:11: Bye Bye [preauth]
Dec 06 08:55:46 np0005548788.localdomain sshd[90660]: Disconnected from authenticating user root 45.119.84.54 port 38134 [preauth]
Dec 06 08:55:46 np0005548788.localdomain podman[90662]: 2025-12-06 08:55:46.47687302 +0000 UTC m=+0.299514212 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 08:55:46 np0005548788.localdomain podman[90679]: 2025-12-06 08:55:46.4882216 +0000 UTC m=+0.296382305 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:55:46 np0005548788.localdomain podman[90679]: 2025-12-06 08:55:46.544021134 +0000 UTC m=+0.352181929 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, 
Inc., name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Dec 06 08:55:46 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:55:50 np0005548788.localdomain recover_tripleo_nova_virtqemud[90793]: 62021
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: tmp-crun.Aq2MX3.mount: Deactivated successfully.
Dec 06 08:55:50 np0005548788.localdomain podman[90775]: 2025-12-06 08:55:50.322823344 +0000 UTC m=+0.142870983 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:55:50 np0005548788.localdomain podman[90774]: 2025-12-06 08:55:50.36932089 +0000 UTC m=+0.192784284 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ovn_metadata_agent, distribution-scope=public, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Dec 06 08:55:50 np0005548788.localdomain podman[90776]: 2025-12-06 08:55:50.290900688 +0000 UTC m=+0.108798780 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1761123044, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=)
Dec 06 08:55:50 np0005548788.localdomain podman[90776]: 2025-12-06 08:55:50.424307458 +0000 UTC m=+0.242205530 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64)
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:55:50 np0005548788.localdomain podman[90774]: 2025-12-06 08:55:50.474994524 +0000 UTC m=+0.298457968 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=)
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:55:50 np0005548788.localdomain podman[90775]: 2025-12-06 08:55:50.691872123 +0000 UTC m=+0.511919612 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 06 08:55:50 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:55:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:55:55 np0005548788.localdomain podman[90846]: 2025-12-06 08:55:55.265155102 +0000 UTC m=+0.089064562 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red 
Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Dec 06 08:55:55 np0005548788.localdomain podman[90846]: 2025-12-06 08:55:55.495886138 +0000 UTC m=+0.319795548 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:55:55 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:55:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:55:57 np0005548788.localdomain systemd[1]: tmp-crun.ugulGi.mount: Deactivated successfully.
Dec 06 08:55:57 np0005548788.localdomain podman[90875]: 2025-12-06 08:55:57.27347403 +0000 UTC m=+0.099857445 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044)
Dec 06 08:55:57 np0005548788.localdomain podman[90875]: 2025-12-06 08:55:57.304298872 +0000 UTC m=+0.130682307 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:55:57 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:56:11 np0005548788.localdomain sudo[90901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:56:11 np0005548788.localdomain sudo[90901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:56:11 np0005548788.localdomain sudo[90901]: pam_unix(sudo:session): session closed for user root
Dec 06 08:56:11 np0005548788.localdomain sudo[90916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:56:11 np0005548788.localdomain sudo[90916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:56:12 np0005548788.localdomain sudo[90916]: pam_unix(sudo:session): session closed for user root
Dec 06 08:56:12 np0005548788.localdomain sudo[90962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:56:12 np0005548788.localdomain sudo[90962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:56:12 np0005548788.localdomain sudo[90962]: pam_unix(sudo:session): session closed for user root
Dec 06 08:56:13 np0005548788.localdomain sshd[90977]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:56:14 np0005548788.localdomain sshd[90977]: Received disconnect from 179.43.189.36 port 59556:11: Bye Bye [preauth]
Dec 06 08:56:14 np0005548788.localdomain sshd[90977]: Disconnected from authenticating user root 179.43.189.36 port 59556 [preauth]
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:56:17 np0005548788.localdomain podman[90980]: 2025-12-06 08:56:17.279150301 +0000 UTC m=+0.101035061 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Dec 06 08:56:17 np0005548788.localdomain podman[90982]: 2025-12-06 08:56:17.326765371 +0000 UTC m=+0.143343977 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, container_name=iscsid, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Dec 06 08:56:17 np0005548788.localdomain podman[90982]: 2025-12-06 08:56:17.361925468 +0000 UTC m=+0.178504104 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:56:17 np0005548788.localdomain podman[90979]: 2025-12-06 08:56:17.377972474 +0000 UTC m=+0.201632129 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, vcs-type=git, name=rhosp17/openstack-cron, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=)
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:56:17 np0005548788.localdomain podman[90979]: 2025-12-06 08:56:17.387944061 +0000 UTC m=+0.211603766 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, container_name=logrotate_crond, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:56:17 np0005548788.localdomain podman[90980]: 2025-12-06 08:56:17.398363693 +0000 UTC m=+0.220248503 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:56:17 np0005548788.localdomain podman[90981]: 2025-12-06 08:56:17.490823929 +0000 UTC m=+0.309759778 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Dec 06 08:56:17 np0005548788.localdomain podman[90986]: 2025-12-06 08:56:17.536152629 +0000 UTC m=+0.348159114 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:56:17 np0005548788.localdomain podman[90981]: 2025-12-06 08:56:17.552070741 +0000 UTC m=+0.371006620 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:56:17 np0005548788.localdomain podman[90986]: 2025-12-06 08:56:17.59380044 +0000 UTC m=+0.405806895 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, architecture=x86_64)
Dec 06 08:56:17 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:56:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:56:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:56:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:56:21 np0005548788.localdomain systemd[1]: tmp-crun.z6KI1n.mount: Deactivated successfully.
Dec 06 08:56:21 np0005548788.localdomain podman[91087]: 2025-12-06 08:56:21.283355775 +0000 UTC m=+0.106489939 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Dec 06 08:56:21 np0005548788.localdomain podman[91087]: 2025-12-06 08:56:21.321692088 +0000 UTC m=+0.144826302 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:56:21 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:56:21 np0005548788.localdomain podman[91089]: 2025-12-06 08:56:21.376942775 +0000 UTC m=+0.191980879 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com)
Dec 06 08:56:21 np0005548788.localdomain podman[91089]: 2025-12-06 08:56:21.427430054 +0000 UTC m=+0.242468198 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:56:21 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:56:21 np0005548788.localdomain podman[91088]: 2025-12-06 08:56:21.429452147 +0000 UTC m=+0.248635760 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 06 08:56:21 np0005548788.localdomain podman[91088]: 2025-12-06 08:56:21.801002042 +0000 UTC m=+0.620185605 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, container_name=nova_migration_target, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4)
Dec 06 08:56:21 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:56:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:56:26 np0005548788.localdomain sshd[91158]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:56:26 np0005548788.localdomain systemd[1]: tmp-crun.7CGCay.mount: Deactivated successfully.
Dec 06 08:56:26 np0005548788.localdomain podman[91157]: 2025-12-06 08:56:26.272141666 +0000 UTC m=+0.096836001 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Dec 06 08:56:26 np0005548788.localdomain podman[91157]: 2025-12-06 08:56:26.459787532 +0000 UTC m=+0.284481847 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vcs-type=git)
Dec 06 08:56:26 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:56:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:56:28 np0005548788.localdomain podman[91186]: 2025-12-06 08:56:28.25212779 +0000 UTC m=+0.078707552 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5)
Dec 06 08:56:28 np0005548788.localdomain podman[91186]: 2025-12-06 08:56:28.278879826 +0000 UTC m=+0.105459538 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, container_name=nova_compute, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:56:28 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:56:31 np0005548788.localdomain sshd[91158]: Received disconnect from 101.47.142.76 port 36400:11: Bye Bye [preauth]
Dec 06 08:56:31 np0005548788.localdomain sshd[91158]: Disconnected from authenticating user root 101.47.142.76 port 36400 [preauth]
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: tmp-crun.b68gR4.mount: Deactivated successfully.
Dec 06 08:56:48 np0005548788.localdomain podman[91214]: 2025-12-06 08:56:48.273400553 +0000 UTC m=+0.089742733 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:56:48 np0005548788.localdomain podman[91213]: 2025-12-06 08:56:48.341653232 +0000 UTC m=+0.160180480 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, container_name=logrotate_crond, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:56:48 np0005548788.localdomain podman[91213]: 2025-12-06 08:56:48.373525225 +0000 UTC m=+0.192052433 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:56:48 np0005548788.localdomain podman[91216]: 2025-12-06 08:56:48.387513628 +0000 UTC m=+0.194727376 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:44:13Z, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Dec 06 08:56:48 np0005548788.localdomain podman[91216]: 2025-12-06 08:56:48.403549853 +0000 UTC m=+0.210763561 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc.)
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:56:48 np0005548788.localdomain podman[91219]: 2025-12-06 08:56:48.307446724 +0000 UTC m=+0.114703664 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:56:48 np0005548788.localdomain podman[91215]: 2025-12-06 08:56:48.465486896 +0000 UTC m=+0.279256006 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Dec 06 08:56:48 np0005548788.localdomain podman[91219]: 2025-12-06 08:56:48.496141513 +0000 UTC m=+0.303398473 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:56:48 np0005548788.localdomain podman[91214]: 2025-12-06 08:56:48.515312004 +0000 UTC m=+0.331654174 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, 
distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:56:48 np0005548788.localdomain podman[91215]: 2025-12-06 08:56:48.526826011 +0000 UTC m=+0.340595091 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:11:48Z)
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:56:48 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:56:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:56:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:56:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:56:52 np0005548788.localdomain systemd[1]: tmp-crun.cnUvpM.mount: Deactivated successfully.
Dec 06 08:56:52 np0005548788.localdomain podman[91323]: 2025-12-06 08:56:52.312223565 +0000 UTC m=+0.135165466 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 08:56:52 np0005548788.localdomain podman[91325]: 2025-12-06 08:56:52.27580881 +0000 UTC m=+0.091347333 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:56:52 np0005548788.localdomain podman[91324]: 2025-12-06 08:56:52.282411924 +0000 UTC m=+0.097311857 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Dec 06 08:56:52 np0005548788.localdomain podman[91325]: 2025-12-06 08:56:52.358836084 +0000 UTC m=+0.174374617 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1761123044, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:56:52 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:56:52 np0005548788.localdomain sshd[91389]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:56:52 np0005548788.localdomain podman[91323]: 2025-12-06 08:56:52.382362171 +0000 UTC m=+0.205304092 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:56:52 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:56:52 np0005548788.localdomain podman[91324]: 2025-12-06 08:56:52.646758127 +0000 UTC m=+0.461658020 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Dec 06 08:56:52 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:56:54 np0005548788.localdomain sshd[91389]: Received disconnect from 36.50.177.119 port 55136:11: Bye Bye [preauth]
Dec 06 08:56:54 np0005548788.localdomain sshd[91389]: Disconnected from authenticating user root 36.50.177.119 port 55136 [preauth]
Dec 06 08:56:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:56:57 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:56:57 np0005548788.localdomain recover_tripleo_nova_virtqemud[91397]: 62021
Dec 06 08:56:57 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:56:57 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:56:57 np0005548788.localdomain podman[91395]: 2025-12-06 08:56:57.274481087 +0000 UTC m=+0.100164005 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:56:57 np0005548788.localdomain podman[91395]: 2025-12-06 08:56:57.484990999 +0000 UTC m=+0.310673877 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1)
Dec 06 08:56:57 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:56:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:56:59 np0005548788.localdomain systemd[1]: tmp-crun.UR8Sym.mount: Deactivated successfully.
Dec 06 08:56:59 np0005548788.localdomain podman[91426]: 2025-12-06 08:56:59.294282471 +0000 UTC m=+0.115105466 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_compute, url=https://www.redhat.com, config_id=tripleo_step5)
Dec 06 08:56:59 np0005548788.localdomain podman[91426]: 2025-12-06 08:56:59.329761287 +0000 UTC m=+0.150584302 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step5)
Dec 06 08:56:59 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:57:00 np0005548788.localdomain sshd[91451]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:57:02 np0005548788.localdomain sshd[91451]: Received disconnect from 45.119.84.54 port 60766:11: Bye Bye [preauth]
Dec 06 08:57:02 np0005548788.localdomain sshd[91451]: Disconnected from authenticating user root 45.119.84.54 port 60766 [preauth]
Dec 06 08:57:08 np0005548788.localdomain sshd[91453]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:57:09 np0005548788.localdomain sshd[91453]: Received disconnect from 148.227.3.232 port 37792:11: Bye Bye [preauth]
Dec 06 08:57:09 np0005548788.localdomain sshd[91453]: Disconnected from authenticating user root 148.227.3.232 port 37792 [preauth]
Dec 06 08:57:12 np0005548788.localdomain sudo[91455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:57:12 np0005548788.localdomain sudo[91455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:57:12 np0005548788.localdomain sudo[91455]: pam_unix(sudo:session): session closed for user root
Dec 06 08:57:13 np0005548788.localdomain sudo[91470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:57:13 np0005548788.localdomain sudo[91470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:57:13 np0005548788.localdomain sudo[91470]: pam_unix(sudo:session): session closed for user root
Dec 06 08:57:14 np0005548788.localdomain sudo[91516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:57:14 np0005548788.localdomain sudo[91516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:57:14 np0005548788.localdomain sudo[91516]: pam_unix(sudo:session): session closed for user root
Dec 06 08:57:15 np0005548788.localdomain sshd[91531]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:57:16 np0005548788.localdomain sshd[91531]: Received disconnect from 179.43.189.36 port 46476:11: Bye Bye [preauth]
Dec 06 08:57:16 np0005548788.localdomain sshd[91531]: Disconnected from authenticating user root 179.43.189.36 port 46476 [preauth]
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:57:19 np0005548788.localdomain podman[91534]: 2025-12-06 08:57:19.277587233 +0000 UTC m=+0.093444338 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 06 08:57:19 np0005548788.localdomain podman[91537]: 2025-12-06 08:57:19.344784628 +0000 UTC m=+0.149612722 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:57:19 np0005548788.localdomain podman[91534]: 2025-12-06 08:57:19.366285391 +0000 UTC m=+0.182142486 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:57:19 np0005548788.localdomain podman[91535]: 2025-12-06 08:57:19.44524354 +0000 UTC m=+0.259797644 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, container_name=ceilometer_agent_compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z)
Dec 06 08:57:19 np0005548788.localdomain podman[91535]: 2025-12-06 08:57:19.473163122 +0000 UTC m=+0.287717276 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64)
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:57:19 np0005548788.localdomain podman[91536]: 2025-12-06 08:57:19.487398752 +0000 UTC m=+0.295904260 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:57:19 np0005548788.localdomain podman[91536]: 2025-12-06 08:57:19.501640723 +0000 UTC m=+0.310146221 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, container_name=iscsid, name=rhosp17/openstack-iscsid, release=1761123044, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 08:57:19 np0005548788.localdomain podman[91533]: 2025-12-06 08:57:19.407285068 +0000 UTC m=+0.224330960 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, vendor=Red Hat, Inc.)
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:57:19 np0005548788.localdomain podman[91533]: 2025-12-06 08:57:19.538761359 +0000 UTC m=+0.355807231 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:57:19 np0005548788.localdomain podman[91537]: 2025-12-06 08:57:19.561854592 +0000 UTC m=+0.366682726 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:57:19 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:57:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:57:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:57:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:57:23 np0005548788.localdomain podman[91646]: 2025-12-06 08:57:23.259516068 +0000 UTC m=+0.088213005 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, release=1761123044, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:57:23 np0005548788.localdomain podman[91647]: 2025-12-06 08:57:23.310303985 +0000 UTC m=+0.134719120 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 06 08:57:23 np0005548788.localdomain systemd[1]: tmp-crun.9iDEvZ.mount: Deactivated successfully.
Dec 06 08:57:23 np0005548788.localdomain podman[91648]: 2025-12-06 08:57:23.391145162 +0000 UTC m=+0.211063808 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 06 08:57:23 np0005548788.localdomain podman[91648]: 2025-12-06 08:57:23.416649669 +0000 UTC m=+0.236568335 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, distribution-scope=public, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:57:23 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:57:23 np0005548788.localdomain podman[91646]: 2025-12-06 08:57:23.44549803 +0000 UTC m=+0.274195017 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 08:57:23 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:57:23 np0005548788.localdomain podman[91647]: 2025-12-06 08:57:23.709818354 +0000 UTC m=+0.534233459 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:57:23 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:57:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:57:28 np0005548788.localdomain podman[91713]: 2025-12-06 08:57:28.253415196 +0000 UTC m=+0.085257235 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, vcs-type=git, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Dec 06 08:57:28 np0005548788.localdomain podman[91713]: 2025-12-06 08:57:28.444626841 +0000 UTC m=+0.276468910 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:57:28 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:57:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:57:30 np0005548788.localdomain podman[91742]: 2025-12-06 08:57:30.258889345 +0000 UTC m=+0.081616592 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 08:57:30 np0005548788.localdomain podman[91742]: 2025-12-06 08:57:30.317226417 +0000 UTC m=+0.139953664 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, config_id=tripleo_step5, tcib_managed=true, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Dec 06 08:57:30 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:57:50 np0005548788.localdomain podman[91769]: 2025-12-06 08:57:50.276179414 +0000 UTC m=+0.094898372 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Dec 06 08:57:50 np0005548788.localdomain podman[91769]: 2025-12-06 08:57:50.311800554 +0000 UTC m=+0.130519512 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-collectd-container, distribution-scope=public, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd)
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: tmp-crun.MZxws2.mount: Deactivated successfully.
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:57:50 np0005548788.localdomain podman[91770]: 2025-12-06 08:57:50.331005548 +0000 UTC m=+0.146888929 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:57:50 np0005548788.localdomain podman[91770]: 2025-12-06 08:57:50.361593701 +0000 UTC m=+0.177477082 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1)
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:57:50 np0005548788.localdomain podman[91771]: 2025-12-06 08:57:50.376213424 +0000 UTC m=+0.187479542 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid)
Dec 06 08:57:50 np0005548788.localdomain podman[91771]: 2025-12-06 08:57:50.409943855 +0000 UTC m=+0.221209983 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
iscsid, io.buildah.version=1.41.4, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:57:50 np0005548788.localdomain podman[91768]: 2025-12-06 08:57:50.423515815 +0000 UTC m=+0.245480534 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git)
Dec 06 08:57:50 np0005548788.localdomain podman[91777]: 2025-12-06 08:57:50.485071756 +0000 UTC m=+0.292632330 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z)
Dec 06 08:57:50 np0005548788.localdomain podman[91768]: 2025-12-06 08:57:50.5121118 +0000 UTC m=+0.334076519 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z)
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:57:50 np0005548788.localdomain podman[91777]: 2025-12-06 08:57:50.539800536 +0000 UTC m=+0.347361110 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:57:50 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:57:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:57:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:57:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:57:54 np0005548788.localdomain podman[91877]: 2025-12-06 08:57:54.253192547 +0000 UTC m=+0.082525640 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, release=1761123044, io.openshift.expose-services=)
Dec 06 08:57:54 np0005548788.localdomain systemd[1]: tmp-crun.E9c2Ow.mount: Deactivated successfully.
Dec 06 08:57:54 np0005548788.localdomain podman[91878]: 2025-12-06 08:57:54.316849503 +0000 UTC m=+0.141840842 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com)
Dec 06 08:57:54 np0005548788.localdomain podman[91879]: 2025-12-06 08:57:54.360643565 +0000 UTC m=+0.182851378 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, architecture=x86_64)
Dec 06 08:57:54 np0005548788.localdomain podman[91879]: 2025-12-06 08:57:54.387454804 +0000 UTC m=+0.209662657 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:57:54 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:57:54 np0005548788.localdomain podman[91877]: 2025-12-06 08:57:54.43980798 +0000 UTC m=+0.269141043 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:57:54 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:57:54 np0005548788.localdomain podman[91878]: 2025-12-06 08:57:54.685602942 +0000 UTC m=+0.510594291 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Dec 06 08:57:54 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:57:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:57:59 np0005548788.localdomain podman[91947]: 2025-12-06 08:57:59.264725682 +0000 UTC m=+0.087970778 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z)
Dec 06 08:57:59 np0005548788.localdomain podman[91947]: 2025-12-06 08:57:59.486925455 +0000 UTC m=+0.310170521 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team)
Dec 06 08:57:59 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:58:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:58:01 np0005548788.localdomain systemd[1]: tmp-crun.H4WkX3.mount: Deactivated successfully.
Dec 06 08:58:01 np0005548788.localdomain podman[91976]: 2025-12-06 08:58:01.264162825 +0000 UTC m=+0.091301170 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute)
Dec 06 08:58:01 np0005548788.localdomain podman[91976]: 2025-12-06 08:58:01.293086969 +0000 UTC m=+0.120225374 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, container_name=nova_compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Dec 06 08:58:01 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:58:07 np0005548788.localdomain sshd[92003]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:58:09 np0005548788.localdomain sshd[92003]: Received disconnect from 36.50.177.119 port 51316:11: Bye Bye [preauth]
Dec 06 08:58:09 np0005548788.localdomain sshd[92003]: Disconnected from authenticating user root 36.50.177.119 port 51316 [preauth]
Dec 06 08:58:11 np0005548788.localdomain sshd[92005]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:58:13 np0005548788.localdomain sshd[92005]: Received disconnect from 45.78.219.195 port 39120:11: Bye Bye [preauth]
Dec 06 08:58:13 np0005548788.localdomain sshd[92005]: Disconnected from authenticating user root 45.78.219.195 port 39120 [preauth]
Dec 06 08:58:14 np0005548788.localdomain sudo[92007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:58:14 np0005548788.localdomain sudo[92007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:58:14 np0005548788.localdomain sudo[92007]: pam_unix(sudo:session): session closed for user root
Dec 06 08:58:14 np0005548788.localdomain sudo[92022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:58:14 np0005548788.localdomain sudo[92022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:58:15 np0005548788.localdomain sudo[92022]: pam_unix(sudo:session): session closed for user root
Dec 06 08:58:16 np0005548788.localdomain sudo[92069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:58:16 np0005548788.localdomain sudo[92069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:58:16 np0005548788.localdomain sudo[92069]: pam_unix(sudo:session): session closed for user root
Dec 06 08:58:17 np0005548788.localdomain sshd[92084]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:58:18 np0005548788.localdomain sshd[92084]: Received disconnect from 179.43.189.36 port 53622:11: Bye Bye [preauth]
Dec 06 08:58:18 np0005548788.localdomain sshd[92084]: Disconnected from authenticating user root 179.43.189.36 port 53622 [preauth]
Dec 06 08:58:20 np0005548788.localdomain sshd[92086]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: tmp-crun.Z2C0gV.mount: Deactivated successfully.
Dec 06 08:58:21 np0005548788.localdomain podman[92088]: 2025-12-06 08:58:21.326110374 +0000 UTC m=+0.141370008 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=logrotate_crond, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 06 08:58:21 np0005548788.localdomain podman[92088]: 2025-12-06 08:58:21.360181216 +0000 UTC m=+0.175440830 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, com.redhat.component=openstack-cron-container, distribution-scope=public)
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: tmp-crun.yza5dE.mount: Deactivated successfully.
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:58:21 np0005548788.localdomain podman[92091]: 2025-12-06 08:58:21.37587722 +0000 UTC m=+0.182138586 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 08:58:21 np0005548788.localdomain podman[92091]: 2025-12-06 08:58:21.412299535 +0000 UTC m=+0.218560901 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_id=tripleo_step3, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:58:21 np0005548788.localdomain podman[92089]: 2025-12-06 08:58:21.407979922 +0000 UTC m=+0.224455413 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:58:21 np0005548788.localdomain podman[92090]: 2025-12-06 08:58:21.367800342 +0000 UTC m=+0.180499357 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:58:21 np0005548788.localdomain podman[92092]: 2025-12-06 08:58:21.488225291 +0000 UTC m=+0.292381032 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:58:21 np0005548788.localdomain podman[92089]: 2025-12-06 08:58:21.49177528 +0000 UTC m=+0.308250831 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=collectd, com.redhat.component=openstack-collectd-container)
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:58:21 np0005548788.localdomain podman[92092]: 2025-12-06 08:58:21.519528528 +0000 UTC m=+0.323684269 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi)
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:58:21 np0005548788.localdomain podman[92090]: 2025-12-06 08:58:21.548943156 +0000 UTC m=+0.361642201 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4)
Dec 06 08:58:21 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:58:21 np0005548788.localdomain sshd[92086]: Received disconnect from 45.119.84.54 port 44466:11: Bye Bye [preauth]
Dec 06 08:58:21 np0005548788.localdomain sshd[92086]: Disconnected from authenticating user root 45.119.84.54 port 44466 [preauth]
Dec 06 08:58:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:58:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:58:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:58:25 np0005548788.localdomain systemd[1]: tmp-crun.DodTmU.mount: Deactivated successfully.
Dec 06 08:58:25 np0005548788.localdomain podman[92198]: 2025-12-06 08:58:25.265130934 +0000 UTC m=+0.092586751 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:58:25 np0005548788.localdomain podman[92199]: 2025-12-06 08:58:25.313542379 +0000 UTC m=+0.136438125 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target)
Dec 06 08:58:25 np0005548788.localdomain podman[92198]: 2025-12-06 08:58:25.320603767 +0000 UTC m=+0.148059614 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:58:25 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:58:25 np0005548788.localdomain podman[92200]: 2025-12-06 08:58:25.374464781 +0000 UTC m=+0.192621701 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, release=1761123044, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z)
Dec 06 08:58:25 np0005548788.localdomain podman[92200]: 2025-12-06 08:58:25.397306546 +0000 UTC m=+0.215463366 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, vendor=Red 
Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4)
Dec 06 08:58:25 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:58:25 np0005548788.localdomain podman[92199]: 2025-12-06 08:58:25.701737539 +0000 UTC m=+0.524633285 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., 
container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:58:25 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:58:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:58:30 np0005548788.localdomain podman[92269]: 2025-12-06 08:58:30.273744848 +0000 UTC m=+0.095643334 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Dec 06 08:58:30 np0005548788.localdomain podman[92269]: 2025-12-06 08:58:30.478696298 +0000 UTC m=+0.300594864 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:58:30 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:58:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:58:32 np0005548788.localdomain podman[92298]: 2025-12-06 08:58:32.243252178 +0000 UTC m=+0.071654854 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step5, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, container_name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:58:32 np0005548788.localdomain podman[92298]: 2025-12-06 08:58:32.299581647 +0000 UTC m=+0.127984323 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:58:32 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:58:51 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:58:51 np0005548788.localdomain recover_tripleo_nova_virtqemud[92325]: 62021
Dec 06 08:58:51 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:58:51 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:58:52 np0005548788.localdomain podman[92327]: 2025-12-06 08:58:52.278492569 +0000 UTC m=+0.097780871 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 06 08:58:52 np0005548788.localdomain podman[92327]: 2025-12-06 08:58:52.31416896 +0000 UTC m=+0.133457322 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, 
build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true)
Dec 06 08:58:52 np0005548788.localdomain podman[92326]: 2025-12-06 08:58:52.326399688 +0000 UTC m=+0.148778936 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, distribution-scope=public, com.redhat.component=openstack-cron-container, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:58:52 np0005548788.localdomain podman[92326]: 2025-12-06 08:58:52.364579747 +0000 UTC m=+0.186958985 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, vcs-type=git, container_name=logrotate_crond, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: tmp-crun.4Yf9gG.mount: Deactivated successfully.
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:58:52 np0005548788.localdomain podman[92328]: 2025-12-06 08:58:52.387171284 +0000 UTC m=+0.203927928 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc.)
Dec 06 08:58:52 np0005548788.localdomain podman[92328]: 2025-12-06 08:58:52.420516984 +0000 UTC m=+0.237273618 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Dec 06 08:58:52 np0005548788.localdomain podman[92335]: 2025-12-06 08:58:52.43072086 +0000 UTC m=+0.240442908 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, version=17.1.12, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:58:52 np0005548788.localdomain podman[92329]: 2025-12-06 08:58:52.489468454 +0000 UTC m=+0.303749143 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git)
Dec 06 08:58:52 np0005548788.localdomain podman[92335]: 2025-12-06 08:58:52.514774596 +0000 UTC m=+0.324496684 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:58:52 np0005548788.localdomain podman[92329]: 2025-12-06 08:58:52.525474567 +0000 UTC m=+0.339755236 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid)
Dec 06 08:58:52 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:58:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:58:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:58:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:58:56 np0005548788.localdomain systemd[1]: tmp-crun.B0QT9l.mount: Deactivated successfully.
Dec 06 08:58:56 np0005548788.localdomain podman[92434]: 2025-12-06 08:58:56.260482805 +0000 UTC m=+0.087383940 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_id=tripleo_step4, tcib_managed=true)
Dec 06 08:58:56 np0005548788.localdomain podman[92436]: 2025-12-06 08:58:56.313007357 +0000 UTC m=+0.132536705 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, 
release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4)
Dec 06 08:58:56 np0005548788.localdomain podman[92434]: 2025-12-06 08:58:56.327808024 +0000 UTC m=+0.154709099 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1)
Dec 06 08:58:56 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:58:56 np0005548788.localdomain podman[92435]: 2025-12-06 08:58:56.238934529 +0000 UTC m=+0.065206095 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Dec 06 08:58:56 np0005548788.localdomain podman[92436]: 2025-12-06 08:58:56.35712649 +0000 UTC m=+0.176655888 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:58:56 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Deactivated successfully.
Dec 06 08:58:56 np0005548788.localdomain podman[92435]: 2025-12-06 08:58:56.682429537 +0000 UTC m=+0.508701063 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z)
Dec 06 08:58:56 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:59:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:59:01 np0005548788.localdomain podman[92503]: 2025-12-06 08:59:01.258390029 +0000 UTC m=+0.083664665 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:59:01 np0005548788.localdomain podman[92503]: 2025-12-06 08:59:01.453094003 +0000 UTC m=+0.278368649 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red 
Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1)
Dec 06 08:59:01 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:59:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:59:03 np0005548788.localdomain podman[92532]: 2025-12-06 08:59:03.251645163 +0000 UTC m=+0.077229717 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:59:03 np0005548788.localdomain podman[92532]: 2025-12-06 08:59:03.282687251 +0000 UTC m=+0.108271825 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:59:03 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:59:16 np0005548788.localdomain sudo[92560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:59:16 np0005548788.localdomain sudo[92560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:16 np0005548788.localdomain sudo[92560]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:16 np0005548788.localdomain sudo[92575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:59:16 np0005548788.localdomain sudo[92575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:17 np0005548788.localdomain sudo[92575]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:17 np0005548788.localdomain sudo[92623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:59:17 np0005548788.localdomain sudo[92623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:17 np0005548788.localdomain sudo[92623]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:17 np0005548788.localdomain sudo[92638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 08:59:17 np0005548788.localdomain sudo[92638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:17 np0005548788.localdomain sudo[92638]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:19 np0005548788.localdomain sshd[92671]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:59:20 np0005548788.localdomain sshd[92673]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:59:20 np0005548788.localdomain sshd[92673]: Received disconnect from 179.43.189.36 port 39348:11: Bye Bye [preauth]
Dec 06 08:59:20 np0005548788.localdomain sshd[92673]: Disconnected from authenticating user root 179.43.189.36 port 39348 [preauth]
Dec 06 08:59:20 np0005548788.localdomain sudo[92675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:59:20 np0005548788.localdomain sudo[92675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:20 np0005548788.localdomain sudo[92675]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:22 np0005548788.localdomain sshd[92690]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: tmp-crun.CdQVKF.mount: Deactivated successfully.
Dec 06 08:59:23 np0005548788.localdomain podman[92692]: 2025-12-06 08:59:23.27924986 +0000 UTC m=+0.096078039 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64)
Dec 06 08:59:23 np0005548788.localdomain podman[92692]: 2025-12-06 08:59:23.286669588 +0000 UTC m=+0.103497797 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, vcs-type=git, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:59:23 np0005548788.localdomain podman[92694]: 2025-12-06 08:59:23.337414117 +0000 UTC m=+0.153974767 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 06 08:59:23 np0005548788.localdomain podman[92696]: 2025-12-06 08:59:23.389769013 +0000 UTC m=+0.199079920 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4)
Dec 06 08:59:23 np0005548788.localdomain podman[92695]: 2025-12-06 08:59:23.435765834 +0000 UTC m=+0.251246581 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4)
Dec 06 08:59:23 np0005548788.localdomain podman[92695]: 2025-12-06 08:59:23.443463411 +0000 UTC m=+0.258944148 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, vcs-type=git)
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:59:23 np0005548788.localdomain podman[92694]: 2025-12-06 08:59:23.465573274 +0000 UTC m=+0.282133964 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:59:23 np0005548788.localdomain podman[92693]: 2025-12-06 08:59:23.48937997 +0000 UTC m=+0.306284432 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, container_name=collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true)
Dec 06 08:59:23 np0005548788.localdomain podman[92696]: 2025-12-06 08:59:23.497033467 +0000 UTC m=+0.306344364 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, distribution-scope=public)
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:59:23 np0005548788.localdomain podman[92693]: 2025-12-06 08:59:23.549772475 +0000 UTC m=+0.366676977 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=collectd, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:59:23 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:59:23 np0005548788.localdomain sshd[92671]: Received disconnect from 101.47.142.76 port 42752:11: Bye Bye [preauth]
Dec 06 08:59:23 np0005548788.localdomain sshd[92671]: Disconnected from authenticating user root 101.47.142.76 port 42752 [preauth]
Dec 06 08:59:24 np0005548788.localdomain sshd[92690]: Received disconnect from 36.50.177.119 port 57376:11: Bye Bye [preauth]
Dec 06 08:59:24 np0005548788.localdomain sshd[92690]: Disconnected from authenticating user root 36.50.177.119 port 57376 [preauth]
Dec 06 08:59:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:59:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:59:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:59:27 np0005548788.localdomain podman[92805]: 2025-12-06 08:59:27.266908901 +0000 UTC m=+0.088059010 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:59:27 np0005548788.localdomain podman[92804]: 2025-12-06 08:59:27.319453654 +0000 UTC m=+0.143873074 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 08:59:27 np0005548788.localdomain systemd[1]: tmp-crun.TWf33T.mount: Deactivated successfully.
Dec 06 08:59:27 np0005548788.localdomain podman[92806]: 2025-12-06 08:59:27.385736712 +0000 UTC m=+0.203851947 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, 
com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, maintainer=OpenStack TripleO Team)
Dec 06 08:59:27 np0005548788.localdomain podman[92804]: 2025-12-06 08:59:27.399651632 +0000 UTC m=+0.224071042 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, 
Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Dec 06 08:59:27 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:59:27 np0005548788.localdomain podman[92806]: 2025-12-06 08:59:27.438863493 +0000 UTC m=+0.256978708 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com)
Dec 06 08:59:27 np0005548788.localdomain podman[92806]: unhealthy
Dec 06 08:59:27 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:59:27 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 08:59:27 np0005548788.localdomain podman[92805]: 2025-12-06 08:59:27.615469827 +0000 UTC m=+0.436619856 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 06 08:59:27 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 08:59:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 08:59:32 np0005548788.localdomain podman[92876]: 2025-12-06 08:59:32.255713834 +0000 UTC m=+0.087912856 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:59:32 np0005548788.localdomain podman[92876]: 2025-12-06 08:59:32.519812891 +0000 UTC m=+0.352011923 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 06 08:59:32 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 08:59:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 08:59:34 np0005548788.localdomain sshd[92917]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:59:34 np0005548788.localdomain systemd[1]: tmp-crun.cqnaOx.mount: Deactivated successfully.
Dec 06 08:59:34 np0005548788.localdomain podman[92905]: 2025-12-06 08:59:34.283538705 +0000 UTC m=+0.107653395 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, io.openshift.expose-services=)
Dec 06 08:59:34 np0005548788.localdomain podman[92905]: 2025-12-06 08:59:34.337637156 +0000 UTC m=+0.161751846 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step5, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 08:59:34 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 08:59:34 np0005548788.localdomain sshd[92932]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:59:35 np0005548788.localdomain sshd[92932]: Received disconnect from 148.227.3.232 port 44278:11: Bye Bye [preauth]
Dec 06 08:59:35 np0005548788.localdomain sshd[92932]: Disconnected from authenticating user root 148.227.3.232 port 44278 [preauth]
Dec 06 08:59:35 np0005548788.localdomain sshd[92917]: Received disconnect from 45.119.84.54 port 54978:11: Bye Bye [preauth]
Dec 06 08:59:35 np0005548788.localdomain sshd[92917]: Disconnected from authenticating user root 45.119.84.54 port 54978 [preauth]
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 08:59:54 np0005548788.localdomain podman[92935]: 2025-12-06 08:59:54.268443002 +0000 UTC m=+0.088151424 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, container_name=collectd, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 06 08:59:54 np0005548788.localdomain podman[92935]: 2025-12-06 08:59:54.276994106 +0000 UTC m=+0.096702548 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: tmp-crun.WjcXpd.mount: Deactivated successfully.
Dec 06 08:59:54 np0005548788.localdomain podman[92937]: 2025-12-06 08:59:54.340536909 +0000 UTC m=+0.152452050 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack 
Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:59:54 np0005548788.localdomain podman[92936]: 2025-12-06 08:59:54.385171698 +0000 UTC m=+0.202101614 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1)
Dec 06 08:59:54 np0005548788.localdomain podman[92936]: 2025-12-06 08:59:54.421593453 +0000 UTC m=+0.238523359 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:59:54 np0005548788.localdomain podman[92937]: 2025-12-06 08:59:54.433244072 +0000 UTC m=+0.245159223 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, config_id=tripleo_step3)
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 08:59:54 np0005548788.localdomain podman[92943]: 2025-12-06 08:59:54.533332653 +0000 UTC m=+0.343545441 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.buildah.version=1.41.4)
Dec 06 08:59:54 np0005548788.localdomain podman[92934]: 2025-12-06 08:59:54.504656918 +0000 UTC m=+0.325304038 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, name=rhosp17/openstack-cron, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:59:54 np0005548788.localdomain podman[92934]: 2025-12-06 08:59:54.588515558 +0000 UTC m=+0.409162608 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 08:59:54 np0005548788.localdomain podman[92943]: 2025-12-06 08:59:54.639619696 +0000 UTC m=+0.449832484 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team)
Dec 06 08:59:54 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 08:59:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 08:59:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 08:59:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 08:59:58 np0005548788.localdomain podman[93050]: 2025-12-06 08:59:58.257348723 +0000 UTC m=+0.083404628 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 06 08:59:58 np0005548788.localdomain podman[93051]: 2025-12-06 08:59:58.281738205 +0000 UTC m=+0.102779404 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc.)
Dec 06 08:59:58 np0005548788.localdomain podman[93052]: 2025-12-06 08:59:58.328918183 +0000 UTC m=+0.149755706 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.expose-services=, container_name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Dec 06 08:59:58 np0005548788.localdomain podman[93050]: 2025-12-06 08:59:58.338800088 +0000 UTC m=+0.164855993 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z)
Dec 06 08:59:58 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 08:59:58 np0005548788.localdomain podman[93052]: 2025-12-06 08:59:58.380708253 +0000 UTC m=+0.201545736 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:59:58 np0005548788.localdomain podman[93052]: unhealthy
Dec 06 08:59:58 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:59:58 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 08:59:58 np0005548788.localdomain podman[93051]: 2025-12-06 08:59:58.669387079 +0000 UTC m=+0.490428328 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:59:58 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:00:01 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:00:01 np0005548788.localdomain recover_tripleo_nova_virtqemud[93119]: 62021
Dec 06 09:00:01 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:00:01 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:00:01 np0005548788.localdomain CROND[93121]: (root) CMD (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Dec 06 09:00:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:00:03 np0005548788.localdomain podman[93124]: 2025-12-06 09:00:03.271352324 +0000 UTC m=+0.095011425 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, release=1761123044, tcib_managed=true, io.openshift.expose-services=, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.)
Dec 06 09:00:03 np0005548788.localdomain podman[93124]: 2025-12-06 09:00:03.505276629 +0000 UTC m=+0.328935720 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1)
Dec 06 09:00:03 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:00:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:00:05 np0005548788.localdomain podman[93153]: 2025-12-06 09:00:05.264486944 +0000 UTC m=+0.091662562 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:00:05 np0005548788.localdomain podman[93153]: 2025-12-06 09:00:05.297772722 +0000 UTC m=+0.124948320 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:00:05 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 5302 writes, 23K keys, 5302 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5302 writes, 773 syncs, 6.86 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.2 total, 600.0 interval
                                                          Cumulative writes: 5340 writes, 23K keys, 5340 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5340 writes, 664 syncs, 8.04 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:00:21 np0005548788.localdomain sudo[93179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:00:21 np0005548788.localdomain sudo[93179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:00:21 np0005548788.localdomain sudo[93179]: pam_unix(sudo:session): session closed for user root
Dec 06 09:00:21 np0005548788.localdomain sudo[93194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:00:21 np0005548788.localdomain sudo[93194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:00:22 np0005548788.localdomain sudo[93194]: pam_unix(sudo:session): session closed for user root
Dec 06 09:00:23 np0005548788.localdomain sshd[93240]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:00:24 np0005548788.localdomain sshd[93240]: Received disconnect from 179.43.189.36 port 47752:11: Bye Bye [preauth]
Dec 06 09:00:24 np0005548788.localdomain sshd[93240]: Disconnected from authenticating user root 179.43.189.36 port 47752 [preauth]
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:00:24 np0005548788.localdomain podman[93242]: 2025-12-06 09:00:24.695525524 +0000 UTC m=+0.107051127 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-type=git)
Dec 06 09:00:24 np0005548788.localdomain podman[93242]: 2025-12-06 09:00:24.705135081 +0000 UTC m=+0.116660604 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container)
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:00:24 np0005548788.localdomain podman[93243]: 2025-12-06 09:00:24.805960465 +0000 UTC m=+0.217981293 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, version=17.1.12, container_name=ceilometer_agent_compute)
Dec 06 09:00:24 np0005548788.localdomain podman[93283]: 2025-12-06 09:00:24.824181058 +0000 UTC m=+0.098844674 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Dec 06 09:00:24 np0005548788.localdomain podman[93283]: 2025-12-06 09:00:24.886477721 +0000 UTC m=+0.161141327 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git)
Dec 06 09:00:24 np0005548788.localdomain podman[93243]: 2025-12-06 09:00:24.894704825 +0000 UTC m=+0.306725603 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:00:24 np0005548788.localdomain podman[93244]: 2025-12-06 09:00:24.90747697 +0000 UTC m=+0.318359474 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:00:24 np0005548788.localdomain podman[93282]: 2025-12-06 09:00:24.950131028 +0000 UTC m=+0.235507725 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:00:24 np0005548788.localdomain podman[93244]: 2025-12-06 09:00:24.972991293 +0000 UTC m=+0.383873847 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, tcib_managed=true, container_name=iscsid)
Dec 06 09:00:24 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:00:24 np0005548788.localdomain podman[93282]: 2025-12-06 09:00:24.987473701 +0000 UTC m=+0.272850418 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=logrotate_crond, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, release=1761123044, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12)
Dec 06 09:00:25 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:00:26 np0005548788.localdomain sudo[93347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:00:26 np0005548788.localdomain sudo[93347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:00:26 np0005548788.localdomain sudo[93347]: pam_unix(sudo:session): session closed for user root
Dec 06 09:00:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:00:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:00:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:00:29 np0005548788.localdomain podman[93364]: 2025-12-06 09:00:29.24183076 +0000 UTC m=+0.068452675 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, 
Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=)
Dec 06 09:00:29 np0005548788.localdomain podman[93364]: 2025-12-06 09:00:29.259681781 +0000 UTC m=+0.086303686 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, 
distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible)
Dec 06 09:00:29 np0005548788.localdomain podman[93364]: unhealthy
Dec 06 09:00:29 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:00:29 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:00:29 np0005548788.localdomain podman[93362]: 2025-12-06 09:00:29.305559389 +0000 UTC m=+0.134793034 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team)
Dec 06 09:00:29 np0005548788.localdomain podman[93363]: 2025-12-06 09:00:29.365601713 +0000 UTC m=+0.190610708 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, version=17.1.12, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute)
Dec 06 09:00:29 np0005548788.localdomain podman[93362]: 2025-12-06 09:00:29.388868642 +0000 UTC m=+0.218102247 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:00:29 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Deactivated successfully.
Dec 06 09:00:29 np0005548788.localdomain podman[93363]: 2025-12-06 09:00:29.712510748 +0000 UTC m=+0.537519723 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z)
Dec 06 09:00:29 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:00:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:00:34 np0005548788.localdomain systemd[1]: tmp-crun.2YhNTM.mount: Deactivated successfully.
Dec 06 09:00:34 np0005548788.localdomain podman[93436]: 2025-12-06 09:00:34.275516529 +0000 UTC m=+0.098016248 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp17/openstack-qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:00:34 np0005548788.localdomain podman[93436]: 2025-12-06 09:00:34.488735965 +0000 UTC m=+0.311235744 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vcs-type=git, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd)
Dec 06 09:00:34 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:00:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:00:36 np0005548788.localdomain podman[93465]: 2025-12-06 09:00:36.212866376 +0000 UTC m=+0.048816299 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team)
Dec 06 09:00:36 np0005548788.localdomain podman[93465]: 2025-12-06 09:00:36.24055967 +0000 UTC m=+0.076509593 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64)
Dec 06 09:00:36 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:00:39 np0005548788.localdomain sshd[93493]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:00:41 np0005548788.localdomain sshd[93493]: Received disconnect from 36.50.177.119 port 34578:11: Bye Bye [preauth]
Dec 06 09:00:41 np0005548788.localdomain sshd[93493]: Disconnected from authenticating user root 36.50.177.119 port 34578 [preauth]
Dec 06 09:00:43 np0005548788.localdomain sshd[93495]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:00:47 np0005548788.localdomain sshd[93495]: Received disconnect from 45.78.219.195 port 44540:11: Bye Bye [preauth]
Dec 06 09:00:47 np0005548788.localdomain sshd[93495]: Disconnected from authenticating user root 45.78.219.195 port 44540 [preauth]
Dec 06 09:00:53 np0005548788.localdomain sshd[93497]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:00:55 np0005548788.localdomain podman[93499]: 2025-12-06 09:00:55.279937056 +0000 UTC m=+0.098074400 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 09:00:55 np0005548788.localdomain podman[93499]: 2025-12-06 09:00:55.317578498 +0000 UTC m=+0.135715842 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=)
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:00:55 np0005548788.localdomain podman[93500]: 2025-12-06 09:00:55.322272994 +0000 UTC m=+0.140496641 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., 
version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd)
Dec 06 09:00:55 np0005548788.localdomain podman[93501]: 2025-12-06 09:00:55.387024423 +0000 UTC m=+0.199874133 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:00:55 np0005548788.localdomain sshd[93497]: Received disconnect from 45.119.84.54 port 53606:11: Bye Bye [preauth]
Dec 06 09:00:55 np0005548788.localdomain sshd[93497]: Disconnected from authenticating user root 45.119.84.54 port 53606 [preauth]
Dec 06 09:00:55 np0005548788.localdomain podman[93502]: 2025-12-06 09:00:55.43609459 +0000 UTC m=+0.246649749 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:00:55 np0005548788.localdomain podman[93502]: 2025-12-06 09:00:55.4455102 +0000 UTC m=+0.256065399 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container)
Dec 06 09:00:55 np0005548788.localdomain podman[93500]: 2025-12-06 09:00:55.45458077 +0000 UTC m=+0.272804457 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, distribution-scope=public, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, container_name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64)
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:00:55 np0005548788.localdomain podman[93501]: 2025-12-06 09:00:55.498105145 +0000 UTC m=+0.310954865 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, 
Inc., batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:00:55 np0005548788.localdomain podman[93505]: 2025-12-06 09:00:55.59024354 +0000 UTC m=+0.398103187 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, 
container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team)
Dec 06 09:00:55 np0005548788.localdomain podman[93505]: 2025-12-06 09:00:55.645321311 +0000 UTC m=+0.453180978 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git)
Dec 06 09:00:55 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:01:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:01:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:01:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:01:00 np0005548788.localdomain podman[93612]: 2025-12-06 09:01:00.256875072 +0000 UTC m=+0.082371554 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044)
Dec 06 09:01:00 np0005548788.localdomain podman[93612]: 2025-12-06 09:01:00.296818846 +0000 UTC m=+0.122315318 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:01:00 np0005548788.localdomain podman[93612]: unhealthy
Dec 06 09:01:00 np0005548788.localdomain podman[93613]: 2025-12-06 09:01:00.312326165 +0000 UTC m=+0.135434093 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:01:00 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:01:00 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:01:00 np0005548788.localdomain podman[93614]: 2025-12-06 09:01:00.3701007 +0000 UTC m=+0.187945145 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 09:01:00 np0005548788.localdomain podman[93614]: 2025-12-06 09:01:00.389018124 +0000 UTC m=+0.206862629 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, distribution-scope=public, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=)
Dec 06 09:01:00 np0005548788.localdomain podman[93614]: unhealthy
Dec 06 09:01:00 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:01:00 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:01:00 np0005548788.localdomain podman[93613]: 2025-12-06 09:01:00.681771526 +0000 UTC m=+0.504879444 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Dec 06 09:01:00 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:01:01 np0005548788.localdomain CROND[93672]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 09:01:01 np0005548788.localdomain run-parts[93675]: (/etc/cron.hourly) starting 0anacron
Dec 06 09:01:01 np0005548788.localdomain run-parts[93681]: (/etc/cron.hourly) finished 0anacron
Dec 06 09:01:01 np0005548788.localdomain CROND[93671]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 09:01:01 np0005548788.localdomain CROND[93683]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 09:01:01 np0005548788.localdomain run-parts[93686]: (/etc/cron.hourly) starting 0anacron
Dec 06 09:01:01 np0005548788.localdomain anacron[93694]: Anacron started on 2025-12-06
Dec 06 09:01:01 np0005548788.localdomain anacron[93694]: Will run job `cron.daily' in 12 min.
Dec 06 09:01:01 np0005548788.localdomain anacron[93694]: Will run job `cron.weekly' in 32 min.
Dec 06 09:01:01 np0005548788.localdomain anacron[93694]: Will run job `cron.monthly' in 52 min.
Dec 06 09:01:01 np0005548788.localdomain anacron[93694]: Jobs will be executed sequentially
Dec 06 09:01:01 np0005548788.localdomain run-parts[93696]: (/etc/cron.hourly) finished 0anacron
Dec 06 09:01:01 np0005548788.localdomain CROND[93682]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 09:01:03 np0005548788.localdomain CROND[93120]: (root) CMDEND (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Dec 06 09:01:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:01:05 np0005548788.localdomain systemd[1]: tmp-crun.Lm3LSS.mount: Deactivated successfully.
Dec 06 09:01:05 np0005548788.localdomain podman[93699]: 2025-12-06 09:01:05.265468626 +0000 UTC m=+0.091771675 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12)
Dec 06 09:01:05 np0005548788.localdomain podman[93699]: 2025-12-06 09:01:05.487384621 +0000 UTC m=+0.313687590 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, 
maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:01:05 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:01:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:01:07 np0005548788.localdomain podman[93728]: 2025-12-06 09:01:07.248298168 +0000 UTC m=+0.078921197 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z)
Dec 06 09:01:07 np0005548788.localdomain podman[93728]: 2025-12-06 09:01:07.302018557 +0000 UTC m=+0.132641566 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public)
Dec 06 09:01:07 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:01:26 np0005548788.localdomain podman[93756]: 2025-12-06 09:01:26.297864067 +0000 UTC m=+0.114380484 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:01:26 np0005548788.localdomain podman[93761]: 2025-12-06 09:01:26.335145938 +0000 UTC m=+0.147131214 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:01:26 np0005548788.localdomain podman[93756]: 2025-12-06 09:01:26.354833756 +0000 UTC m=+0.171350173 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:01:26 np0005548788.localdomain podman[93767]: 2025-12-06 09:01:26.261627748 +0000 UTC m=+0.076706680 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true)
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:01:26 np0005548788.localdomain podman[93767]: 2025-12-06 09:01:26.394255355 +0000 UTC m=+0.209334307 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:01:26 np0005548788.localdomain podman[93755]: 2025-12-06 09:01:26.360592885 +0000 UTC m=+0.182652923 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=collectd, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64)
Dec 06 09:01:26 np0005548788.localdomain podman[93761]: 2025-12-06 09:01:26.416547942 +0000 UTC m=+0.228533298 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4)
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:01:26 np0005548788.localdomain podman[93755]: 2025-12-06 09:01:26.440489982 +0000 UTC m=+0.262549990 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, container_name=collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:01:26 np0005548788.localdomain podman[93754]: 2025-12-06 09:01:26.316499763 +0000 UTC m=+0.141948785 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc.)
Dec 06 09:01:26 np0005548788.localdomain podman[93754]: 2025-12-06 09:01:26.496530443 +0000 UTC m=+0.321979485 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-cron-container, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 09:01:26 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:01:26 np0005548788.localdomain sudo[93863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:01:26 np0005548788.localdomain sudo[93863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:26 np0005548788.localdomain sudo[93863]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:26 np0005548788.localdomain sudo[93878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:01:26 np0005548788.localdomain sudo[93878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:27 np0005548788.localdomain systemd[1]: tmp-crun.MqvV6p.mount: Deactivated successfully.
Dec 06 09:01:27 np0005548788.localdomain sudo[93878]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:27 np0005548788.localdomain sudo[93914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:01:27 np0005548788.localdomain sudo[93914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:27 np0005548788.localdomain sudo[93914]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:27 np0005548788.localdomain sudo[93929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:01:27 np0005548788.localdomain sudo[93929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:28 np0005548788.localdomain sudo[93929]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:28 np0005548788.localdomain sudo[93977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:01:28 np0005548788.localdomain sudo[93977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:28 np0005548788.localdomain sudo[93977]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:28 np0005548788.localdomain sudo[93992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 09:01:28 np0005548788.localdomain sudo[93992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:29 np0005548788.localdomain sshd[94041]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:01:29 np0005548788.localdomain podman[94050]: 
Dec 06 09:01:29 np0005548788.localdomain podman[94050]: 2025-12-06 09:01:29.205947445 +0000 UTC m=+0.087488022 container create 34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_hopper, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 09:01:29 np0005548788.localdomain systemd[1]: Started libpod-conmon-34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16.scope.
Dec 06 09:01:29 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:01:29 np0005548788.localdomain podman[94050]: 2025-12-06 09:01:29.173140482 +0000 UTC m=+0.054681079 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 09:01:29 np0005548788.localdomain podman[94050]: 2025-12-06 09:01:29.281243811 +0000 UTC m=+0.162784388 container init 34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_hopper, GIT_CLEAN=True, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, release=1763362218, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, ceph=True, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 09:01:29 np0005548788.localdomain podman[94050]: 2025-12-06 09:01:29.290924801 +0000 UTC m=+0.172465378 container start 34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_hopper, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 09:01:29 np0005548788.localdomain podman[94050]: 2025-12-06 09:01:29.291387465 +0000 UTC m=+0.172928032 container attach 34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_hopper, release=1763362218, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 09:01:29 np0005548788.localdomain confident_hopper[94065]: 167 167
Dec 06 09:01:29 np0005548788.localdomain systemd[1]: libpod-34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16.scope: Deactivated successfully.
Dec 06 09:01:29 np0005548788.localdomain podman[94050]: 2025-12-06 09:01:29.29543863 +0000 UTC m=+0.176979267 container died 34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_hopper, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, RELEASE=main, name=rhceph, com.redhat.component=rhceph-container)
Dec 06 09:01:29 np0005548788.localdomain podman[94070]: 2025-12-06 09:01:29.397695578 +0000 UTC m=+0.087720461 container remove 34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_hopper, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=1763362218, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-type=git, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 09:01:29 np0005548788.localdomain systemd[1]: libpod-conmon-34dd06adfcdb6cb16f9d06c320bc90cc57dffdd6621d18290ac5762c38b05a16.scope: Deactivated successfully.
Dec 06 09:01:29 np0005548788.localdomain podman[94092]: 
Dec 06 09:01:29 np0005548788.localdomain podman[94092]: 2025-12-06 09:01:29.608245841 +0000 UTC m=+0.077870656 container create 677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cori, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.buildah.version=1.41.4, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, name=rhceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1763362218, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:01:29 np0005548788.localdomain systemd[1]: Started libpod-conmon-677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058.scope.
Dec 06 09:01:29 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:01:29 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fbe5cf38e503c4770715d9d64792466c13506e59fbd98e86b9bd9c61bd041e6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:01:29 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fbe5cf38e503c4770715d9d64792466c13506e59fbd98e86b9bd9c61bd041e6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:01:29 np0005548788.localdomain podman[94092]: 2025-12-06 09:01:29.575795908 +0000 UTC m=+0.045420753 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 09:01:29 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fbe5cf38e503c4770715d9d64792466c13506e59fbd98e86b9bd9c61bd041e6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:01:29 np0005548788.localdomain podman[94092]: 2025-12-06 09:01:29.679152021 +0000 UTC m=+0.148776836 container init 677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cori, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, distribution-scope=public)
Dec 06 09:01:29 np0005548788.localdomain podman[94092]: 2025-12-06 09:01:29.691553244 +0000 UTC m=+0.161178059 container start 677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cori, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1763362218)
Dec 06 09:01:29 np0005548788.localdomain podman[94092]: 2025-12-06 09:01:29.691804772 +0000 UTC m=+0.161429587 container attach 677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cori, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, version=7, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=)
Dec 06 09:01:29 np0005548788.localdomain sshd[94041]: Received disconnect from 179.43.189.36 port 44880:11: Bye Bye [preauth]
Dec 06 09:01:29 np0005548788.localdomain sshd[94041]: Disconnected from authenticating user root 179.43.189.36 port 44880 [preauth]
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c79657e4fdd783a3717a40fde388c77f8a6431ca55fe5c3c7a42d1736b9c6092-merged.mount: Deactivated successfully.
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]: [
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:     {
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         "available": false,
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         "ceph_device": false,
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         "lsm_data": {},
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         "lvs": [],
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         "path": "/dev/sr0",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         "rejected_reasons": [
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "Has a FileSystem",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "Insufficient space (<5GB)"
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         ],
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         "sys_api": {
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "actuators": null,
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "device_nodes": "sr0",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "human_readable_size": "482.00 KB",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "id_bus": "ata",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "model": "QEMU DVD-ROM",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "nr_requests": "2",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "partitions": {},
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "path": "/dev/sr0",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "removable": "1",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "rev": "2.5+",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "ro": "0",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "rotational": "1",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "sas_address": "",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "sas_device_handle": "",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "scheduler_mode": "mq-deadline",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "sectors": 0,
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "sectorsize": "2048",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "size": 493568.0,
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "support_discard": "0",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "type": "disk",
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:             "vendor": "QEMU"
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:         }
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]:     }
Dec 06 09:01:30 np0005548788.localdomain nice_cori[94108]: ]
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: libpod-677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058.scope: Deactivated successfully.
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: libpod-677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058.scope: Consumed 1.033s CPU time.
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:01:30 np0005548788.localdomain podman[95886]: 2025-12-06 09:01:30.765433482 +0000 UTC m=+0.039514902 container died 677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cori, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, release=1763362218, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: tmp-crun.Eg7fH7.mount: Deactivated successfully.
Dec 06 09:01:30 np0005548788.localdomain podman[95892]: 2025-12-06 09:01:30.810111442 +0000 UTC m=+0.069167258 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible)
Dec 06 09:01:30 np0005548788.localdomain podman[95897]: 2025-12-06 09:01:30.819837262 +0000 UTC m=+0.074032248 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible)
Dec 06 09:01:30 np0005548788.localdomain podman[95892]: 2025-12-06 09:01:30.828249011 +0000 UTC m=+0.087304807 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 09:01:30 np0005548788.localdomain podman[95892]: unhealthy
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:01:30 np0005548788.localdomain podman[95897]: 2025-12-06 09:01:30.858511716 +0000 UTC m=+0.112706722 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:01:30 np0005548788.localdomain podman[95897]: unhealthy
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:01:30 np0005548788.localdomain podman[95893]: 2025-12-06 09:01:30.86832335 +0000 UTC m=+0.123229507 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:01:30 np0005548788.localdomain podman[95886]: 2025-12-06 09:01:30.898439899 +0000 UTC m=+0.172521229 container remove 677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cori, distribution-scope=public, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 09:01:30 np0005548788.localdomain systemd[1]: libpod-conmon-677ce6c7811da4e3bb19beb55c66427bd9e65278aa31ed40ad934ae14c1ae058.scope: Deactivated successfully.
Dec 06 09:01:30 np0005548788.localdomain sudo[93992]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6fbe5cf38e503c4770715d9d64792466c13506e59fbd98e86b9bd9c61bd041e6-merged.mount: Deactivated successfully.
Dec 06 09:01:31 np0005548788.localdomain podman[95893]: 2025-12-06 09:01:31.213655815 +0000 UTC m=+0.468562052 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=)
Dec 06 09:01:31 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:01:31 np0005548788.localdomain sudo[95959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:01:31 np0005548788.localdomain sudo[95959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:31 np0005548788.localdomain sudo[95959]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:01:36 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:01:36 np0005548788.localdomain recover_tripleo_nova_virtqemud[95981]: 62021
Dec 06 09:01:36 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:01:36 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:01:36 np0005548788.localdomain podman[95974]: 2025-12-06 09:01:36.260058938 +0000 UTC m=+0.086225715 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, architecture=x86_64, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Dec 06 09:01:36 np0005548788.localdomain podman[95974]: 2025-12-06 09:01:36.488728461 +0000 UTC m=+0.314895278 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:01:36 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:01:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:01:38 np0005548788.localdomain podman[96005]: 2025-12-06 09:01:38.250146194 +0000 UTC m=+0.075471931 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 06 09:01:38 np0005548788.localdomain podman[96005]: 2025-12-06 09:01:38.276534448 +0000 UTC m=+0.101860145 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, version=17.1.12, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 09:01:38 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:01:57 np0005548788.localdomain podman[96033]: 2025-12-06 09:01:57.300997932 +0000 UTC m=+0.124858126 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team)
Dec 06 09:01:57 np0005548788.localdomain podman[96032]: 2025-12-06 09:01:57.254156705 +0000 UTC m=+0.081988972 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:01:57 np0005548788.localdomain podman[96031]: 2025-12-06 09:01:57.360761438 +0000 UTC m=+0.188307037 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, name=rhosp17/openstack-cron, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4)
Dec 06 09:01:57 np0005548788.localdomain podman[96031]: 2025-12-06 09:01:57.373603555 +0000 UTC m=+0.201149084 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:01:57 np0005548788.localdomain podman[96032]: 2025-12-06 09:01:57.387265537 +0000 UTC m=+0.215097824 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:01:57 np0005548788.localdomain podman[96033]: 2025-12-06 09:01:57.413248199 +0000 UTC m=+0.237108443 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:01:57 np0005548788.localdomain podman[96039]: 2025-12-06 09:01:57.337232581 +0000 UTC m=+0.151481249 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 06 09:01:57 np0005548788.localdomain podman[96050]: 2025-12-06 09:01:57.46507844 +0000 UTC m=+0.274228070 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:01:57 np0005548788.localdomain podman[96050]: 2025-12-06 09:01:57.496417928 +0000 UTC m=+0.305567568 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:12:45Z)
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:01:57 np0005548788.localdomain podman[96039]: 2025-12-06 09:01:57.52272106 +0000 UTC m=+0.336802213 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 06 09:01:57 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:01:58 np0005548788.localdomain systemd[1]: tmp-crun.swsNj9.mount: Deactivated successfully.
Dec 06 09:01:59 np0005548788.localdomain sshd[96142]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:00 np0005548788.localdomain sshd[96142]: Received disconnect from 36.50.177.119 port 45478:11: Bye Bye [preauth]
Dec 06 09:02:00 np0005548788.localdomain sshd[96142]: Disconnected from authenticating user root 36.50.177.119 port 45478 [preauth]
Dec 06 09:02:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:02:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:02:01 np0005548788.localdomain systemd[1]: tmp-crun.QqkdsZ.mount: Deactivated successfully.
Dec 06 09:02:01 np0005548788.localdomain podman[96144]: 2025-12-06 09:02:01.049178877 +0000 UTC m=+0.096233623 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Dec 06 09:02:01 np0005548788.localdomain podman[96144]: 2025-12-06 09:02:01.092588458 +0000 UTC m=+0.139643194 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044)
Dec 06 09:02:01 np0005548788.localdomain podman[96144]: unhealthy
Dec 06 09:02:01 np0005548788.localdomain systemd[1]: tmp-crun.GdbSGk.mount: Deactivated successfully.
Dec 06 09:02:01 np0005548788.localdomain podman[96145]: 2025-12-06 09:02:01.11112339 +0000 UTC m=+0.154463171 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, name=rhosp17/openstack-ovn-controller, release=1761123044, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 09:02:01 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:02:01 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:02:01 np0005548788.localdomain podman[96145]: 2025-12-06 09:02:01.131534381 +0000 UTC m=+0.174874232 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 09:02:01 np0005548788.localdomain podman[96145]: unhealthy
Dec 06 09:02:01 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:02:01 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:02:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:02:02 np0005548788.localdomain podman[96182]: 2025-12-06 09:02:02.271062366 +0000 UTC m=+0.099646298 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:02:02 np0005548788.localdomain podman[96182]: 2025-12-06 09:02:02.638948289 +0000 UTC m=+0.467532191 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Dec 06 09:02:02 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:02:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:02:07 np0005548788.localdomain podman[96205]: 2025-12-06 09:02:07.286320756 +0000 UTC m=+0.112685812 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 09:02:07 np0005548788.localdomain podman[96205]: 2025-12-06 09:02:07.488627744 +0000 UTC m=+0.314992801 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, url=https://www.redhat.com)
Dec 06 09:02:07 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:02:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:02:09 np0005548788.localdomain podman[96234]: 2025-12-06 09:02:09.254189485 +0000 UTC m=+0.085821582 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step5, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:02:09 np0005548788.localdomain podman[96234]: 2025-12-06 09:02:09.282906951 +0000 UTC m=+0.114539088 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, release=1761123044, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_compute)
Dec 06 09:02:09 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:02:09 np0005548788.localdomain sshd[96260]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:10 np0005548788.localdomain sshd[96260]: Received disconnect from 148.227.3.232 port 49226:11: Bye Bye [preauth]
Dec 06 09:02:10 np0005548788.localdomain sshd[96260]: Disconnected from authenticating user root 148.227.3.232 port 49226 [preauth]
Dec 06 09:02:16 np0005548788.localdomain sshd[96262]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:16 np0005548788.localdomain sshd[96263]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:18 np0005548788.localdomain sshd[96263]: Received disconnect from 45.119.84.54 port 38336:11: Bye Bye [preauth]
Dec 06 09:02:18 np0005548788.localdomain sshd[96263]: Disconnected from authenticating user root 45.119.84.54 port 38336 [preauth]
Dec 06 09:02:20 np0005548788.localdomain sshd[96262]: Received disconnect from 101.47.142.76 port 34272:11: Bye Bye [preauth]
Dec 06 09:02:20 np0005548788.localdomain sshd[96262]: Disconnected from authenticating user root 101.47.142.76 port 34272 [preauth]
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:02:28 np0005548788.localdomain podman[96268]: 2025-12-06 09:02:28.289972787 +0000 UTC m=+0.094707436 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: tmp-crun.jUBHxO.mount: Deactivated successfully.
Dec 06 09:02:28 np0005548788.localdomain podman[96268]: 2025-12-06 09:02:28.342601943 +0000 UTC m=+0.147336592 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:02:28 np0005548788.localdomain podman[96267]: 2025-12-06 09:02:28.389146991 +0000 UTC m=+0.197177432 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 09:02:28 np0005548788.localdomain podman[96266]: 2025-12-06 09:02:28.343433498 +0000 UTC m=+0.156308609 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.12, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:02:28 np0005548788.localdomain podman[96266]: 2025-12-06 09:02:28.423311126 +0000 UTC m=+0.236186247 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 
17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4)
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:02:28 np0005548788.localdomain podman[96276]: 2025-12-06 09:02:28.505554835 +0000 UTC m=+0.302575496 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_ipmi)
Dec 06 09:02:28 np0005548788.localdomain podman[96276]: 2025-12-06 09:02:28.535661445 +0000 UTC m=+0.332682106 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true)
Dec 06 09:02:28 np0005548788.localdomain podman[96274]: 2025-12-06 09:02:28.54586522 +0000 UTC m=+0.347140492 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:02:28 np0005548788.localdomain podman[96267]: 2025-12-06 09:02:28.575666561 +0000 UTC m=+0.383697042 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, batch=17.1_20251118.1)
Dec 06 09:02:28 np0005548788.localdomain podman[96274]: 2025-12-06 09:02:28.585598378 +0000 UTC m=+0.386873660 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1)
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:02:28 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:02:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:02:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:02:31 np0005548788.localdomain systemd[1]: tmp-crun.gD9Wyr.mount: Deactivated successfully.
Dec 06 09:02:31 np0005548788.localdomain podman[96380]: 2025-12-06 09:02:31.239596549 +0000 UTC m=+0.072303545 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 09:02:31 np0005548788.localdomain podman[96380]: 2025-12-06 09:02:31.285950481 +0000 UTC m=+0.118657437 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:02:31 np0005548788.localdomain podman[96380]: unhealthy
Dec 06 09:02:31 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:02:31 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:02:31 np0005548788.localdomain podman[96381]: 2025-12-06 09:02:31.294722622 +0000 UTC m=+0.123535987 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, container_name=ovn_controller, vcs-type=git)
Dec 06 09:02:31 np0005548788.localdomain podman[96381]: 2025-12-06 09:02:31.374723532 +0000 UTC m=+0.203536917 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller)
Dec 06 09:02:31 np0005548788.localdomain podman[96381]: unhealthy
Dec 06 09:02:31 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:02:31 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:02:31 np0005548788.localdomain sudo[96419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:02:31 np0005548788.localdomain sudo[96419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:02:31 np0005548788.localdomain sudo[96419]: pam_unix(sudo:session): session closed for user root
Dec 06 09:02:31 np0005548788.localdomain sudo[96434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:02:31 np0005548788.localdomain sudo[96434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:02:32 np0005548788.localdomain sudo[96434]: pam_unix(sudo:session): session closed for user root
Dec 06 09:02:32 np0005548788.localdomain sudo[96481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:02:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:02:33 np0005548788.localdomain sudo[96481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:02:33 np0005548788.localdomain sudo[96481]: pam_unix(sudo:session): session closed for user root
Dec 06 09:02:33 np0005548788.localdomain podman[96495]: 2025-12-06 09:02:33.2865172 +0000 UTC m=+0.058091146 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:02:33 np0005548788.localdomain podman[96495]: 2025-12-06 09:02:33.690545218 +0000 UTC m=+0.462119164 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, container_name=nova_migration_target, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:02:33 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:02:34 np0005548788.localdomain sshd[96522]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:35 np0005548788.localdomain sshd[96522]: Received disconnect from 179.43.189.36 port 42420:11: Bye Bye [preauth]
Dec 06 09:02:35 np0005548788.localdomain sshd[96522]: Disconnected from authenticating user root 179.43.189.36 port 42420 [preauth]
Dec 06 09:02:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:02:38 np0005548788.localdomain systemd[1]: tmp-crun.GZQPMZ.mount: Deactivated successfully.
Dec 06 09:02:38 np0005548788.localdomain podman[96524]: 2025-12-06 09:02:38.264164918 +0000 UTC m=+0.091405865 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:02:38 np0005548788.localdomain podman[96524]: 2025-12-06 09:02:38.495247244 +0000 UTC m=+0.322488251 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step1, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., 
com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:02:38 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:02:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:02:40 np0005548788.localdomain podman[96553]: 2025-12-06 09:02:40.25676505 +0000 UTC m=+0.083147429 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:36:58Z)
Dec 06 09:02:40 np0005548788.localdomain podman[96553]: 2025-12-06 09:02:40.287593763 +0000 UTC m=+0.113976132 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, distribution-scope=public)
Dec 06 09:02:40 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:02:59 np0005548788.localdomain podman[96589]: 2025-12-06 09:02:59.273493135 +0000 UTC m=+0.082496668 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, tcib_managed=true, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: tmp-crun.30Mzoa.mount: Deactivated successfully.
Dec 06 09:02:59 np0005548788.localdomain podman[96580]: 2025-12-06 09:02:59.324231842 +0000 UTC m=+0.149828688 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container)
Dec 06 09:02:59 np0005548788.localdomain podman[96589]: 2025-12-06 09:02:59.327588125 +0000 UTC m=+0.136591639 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:02:59 np0005548788.localdomain podman[96581]: 2025-12-06 09:02:59.373557596 +0000 UTC m=+0.193393684 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:02:59 np0005548788.localdomain podman[96580]: 2025-12-06 09:02:59.383051649 +0000 UTC m=+0.208648525 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=logrotate_crond, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true)
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:02:59 np0005548788.localdomain podman[96588]: 2025-12-06 09:02:59.423414396 +0000 UTC m=+0.237412574 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, container_name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Dec 06 09:02:59 np0005548788.localdomain podman[96582]: 2025-12-06 09:02:59.326012557 +0000 UTC m=+0.143778911 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64)
Dec 06 09:02:59 np0005548788.localdomain podman[96588]: 2025-12-06 09:02:59.435363515 +0000 UTC m=+0.249361623 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid)
Dec 06 09:02:59 np0005548788.localdomain podman[96581]: 2025-12-06 09:02:59.435115527 +0000 UTC m=+0.254951605 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, distribution-scope=public)
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:02:59 np0005548788.localdomain podman[96582]: 2025-12-06 09:02:59.457701625 +0000 UTC m=+0.275468009 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:02:59 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:03:00 np0005548788.localdomain systemd[1]: tmp-crun.OiNrX8.mount: Deactivated successfully.
Dec 06 09:03:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:03:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:03:02 np0005548788.localdomain podman[96692]: 2025-12-06 09:03:02.254236407 +0000 UTC m=+0.084794600 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:03:02 np0005548788.localdomain podman[96693]: 2025-12-06 09:03:02.307585214 +0000 UTC m=+0.135500585 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, release=1761123044, vcs-type=git)
Dec 06 09:03:02 np0005548788.localdomain podman[96692]: 2025-12-06 09:03:02.325707544 +0000 UTC m=+0.156265777 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1)
Dec 06 09:03:02 np0005548788.localdomain podman[96692]: unhealthy
Dec 06 09:03:02 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:03:02 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:03:02 np0005548788.localdomain podman[96693]: 2025-12-06 09:03:02.376994368 +0000 UTC m=+0.204909739 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ovn-controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:03:02 np0005548788.localdomain podman[96693]: unhealthy
Dec 06 09:03:02 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:03:02 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:03:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:03:04 np0005548788.localdomain podman[96733]: 2025-12-06 09:03:04.253404843 +0000 UTC m=+0.084380157 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4)
Dec 06 09:03:04 np0005548788.localdomain podman[96733]: 2025-12-06 09:03:04.622189902 +0000 UTC m=+0.453165216 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_migration_target, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1)
Dec 06 09:03:04 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:03:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:03:09 np0005548788.localdomain podman[96756]: 2025-12-06 09:03:09.261656068 +0000 UTC m=+0.089147435 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, vcs-type=git, architecture=x86_64, distribution-scope=public, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:03:09 np0005548788.localdomain podman[96756]: 2025-12-06 09:03:09.459995023 +0000 UTC m=+0.287486390 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:03:09 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:03:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:03:11 np0005548788.localdomain systemd[1]: tmp-crun.VTCtSa.mount: Deactivated successfully.
Dec 06 09:03:11 np0005548788.localdomain podman[96785]: 2025-12-06 09:03:11.26784055 +0000 UTC m=+0.095401566 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, vcs-type=git, release=1761123044, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 06 09:03:11 np0005548788.localdomain podman[96785]: 2025-12-06 09:03:11.321792696 +0000 UTC m=+0.149353662 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:03:11 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:03:16 np0005548788.localdomain sshd[96809]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:03:18 np0005548788.localdomain sshd[96809]: Received disconnect from 36.50.177.119 port 47026:11: Bye Bye [preauth]
Dec 06 09:03:18 np0005548788.localdomain sshd[96809]: Disconnected from authenticating user root 36.50.177.119 port 47026 [preauth]
Dec 06 09:03:18 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:03:18 np0005548788.localdomain recover_tripleo_nova_virtqemud[96812]: 62021
Dec 06 09:03:18 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:03:18 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:03:30 np0005548788.localdomain podman[96814]: 2025-12-06 09:03:30.271727054 +0000 UTC m=+0.096718130 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 09:03:30 np0005548788.localdomain podman[96814]: 2025-12-06 09:03:30.280187765 +0000 UTC m=+0.105178901 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, build-date=2025-11-18T22:51:28Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044)
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:03:30 np0005548788.localdomain podman[96816]: 2025-12-06 09:03:30.32821895 +0000 UTC m=+0.146528020 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, version=17.1.12, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:03:30 np0005548788.localdomain podman[96815]: 2025-12-06 09:03:30.369979501 +0000 UTC m=+0.188213038 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, vcs-type=git)
Dec 06 09:03:30 np0005548788.localdomain podman[96816]: 2025-12-06 09:03:30.373642074 +0000 UTC m=+0.191951124 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid)
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:03:30 np0005548788.localdomain podman[96813]: 2025-12-06 09:03:30.425253378 +0000 UTC m=+0.249778610 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, name=rhosp17/openstack-cron)
Dec 06 09:03:30 np0005548788.localdomain podman[96813]: 2025-12-06 09:03:30.431425179 +0000 UTC m=+0.255950411 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com)
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:03:30 np0005548788.localdomain podman[96815]: 2025-12-06 09:03:30.477246496 +0000 UTC m=+0.295480073 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git)
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:03:30 np0005548788.localdomain podman[96822]: 2025-12-06 09:03:30.479812465 +0000 UTC m=+0.293850982 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 09:03:30 np0005548788.localdomain podman[96822]: 2025-12-06 09:03:30.562603113 +0000 UTC m=+0.376641630 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-19T00:12:45Z, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, architecture=x86_64)
Dec 06 09:03:30 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:03:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:03:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:03:33 np0005548788.localdomain systemd[1]: tmp-crun.CrqokF.mount: Deactivated successfully.
Dec 06 09:03:33 np0005548788.localdomain podman[96925]: 2025-12-06 09:03:33.267252588 +0000 UTC m=+0.092918023 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, 
url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:03:33 np0005548788.localdomain podman[96925]: 2025-12-06 09:03:33.311625659 +0000 UTC m=+0.137291104 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 09:03:33 np0005548788.localdomain podman[96926]: 2025-12-06 09:03:33.313995081 +0000 UTC m=+0.136265461 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, 
config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 09:03:33 np0005548788.localdomain podman[96926]: 2025-12-06 09:03:33.330516192 +0000 UTC m=+0.152786572 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4)
Dec 06 09:03:33 np0005548788.localdomain podman[96926]: unhealthy
Dec 06 09:03:33 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:03:33 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:03:33 np0005548788.localdomain podman[96925]: unhealthy
Dec 06 09:03:33 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:03:33 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:03:33 np0005548788.localdomain sudo[96967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:03:33 np0005548788.localdomain sudo[96967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:33 np0005548788.localdomain sudo[96967]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:33 np0005548788.localdomain sudo[96982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:03:33 np0005548788.localdomain sudo[96982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:33 np0005548788.localdomain sshd[96997]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:03:34 np0005548788.localdomain systemd[1]: tmp-crun.j6mwH3.mount: Deactivated successfully.
Dec 06 09:03:34 np0005548788.localdomain podman[97070]: 2025-12-06 09:03:34.579046567 +0000 UTC m=+0.108207525 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:03:34 np0005548788.localdomain podman[97070]: 2025-12-06 09:03:34.705816365 +0000 UTC m=+0.234977353 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, release=1763362218, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, version=7, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 09:03:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:03:34 np0005548788.localdomain podman[97103]: 2025-12-06 09:03:34.882653369 +0000 UTC m=+0.114033414 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 06 09:03:35 np0005548788.localdomain sudo[96982]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:35 np0005548788.localdomain sudo[97157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:03:35 np0005548788.localdomain sudo[97157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:35 np0005548788.localdomain sudo[97157]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:35 np0005548788.localdomain sudo[97173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:03:35 np0005548788.localdomain sudo[97173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:35 np0005548788.localdomain podman[97103]: 2025-12-06 09:03:35.300757321 +0000 UTC m=+0.532137346 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:03:35 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:03:35 np0005548788.localdomain sshd[96997]: Received disconnect from 45.119.84.54 port 37946:11: Bye Bye [preauth]
Dec 06 09:03:35 np0005548788.localdomain sshd[96997]: Disconnected from authenticating user root 45.119.84.54 port 37946 [preauth]
Dec 06 09:03:35 np0005548788.localdomain sudo[97173]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:36 np0005548788.localdomain sudo[97221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:03:36 np0005548788.localdomain sudo[97221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:36 np0005548788.localdomain sudo[97221]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:03:40 np0005548788.localdomain podman[97236]: 2025-12-06 09:03:40.270444747 +0000 UTC m=+0.095416859 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:03:40 np0005548788.localdomain podman[97236]: 2025-12-06 09:03:40.474176403 +0000 UTC m=+0.299148525 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=)
Dec 06 09:03:40 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:03:40 np0005548788.localdomain sshd[97265]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:03:41 np0005548788.localdomain sshd[97265]: Received disconnect from 179.43.189.36 port 59972:11: Bye Bye [preauth]
Dec 06 09:03:41 np0005548788.localdomain sshd[97265]: Disconnected from authenticating user root 179.43.189.36 port 59972 [preauth]
Dec 06 09:03:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:03:42 np0005548788.localdomain podman[97267]: 2025-12-06 09:03:42.255118312 +0000 UTC m=+0.080636164 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:03:42 np0005548788.localdomain podman[97267]: 2025-12-06 09:03:42.292709394 +0000 UTC m=+0.118227266 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=nova_compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5)
Dec 06 09:03:42 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:04:01 np0005548788.localdomain podman[97293]: 2025-12-06 09:04:01.262404285 +0000 UTC m=+0.084071999 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-cron, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:04:01 np0005548788.localdomain podman[97293]: 2025-12-06 09:04:01.269640829 +0000 UTC m=+0.091308623 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12)
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:04:01 np0005548788.localdomain podman[97296]: 2025-12-06 09:04:01.319799429 +0000 UTC m=+0.131208836 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:04:01 np0005548788.localdomain podman[97296]: 2025-12-06 09:04:01.331422238 +0000 UTC m=+0.142831655 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044)
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:04:01 np0005548788.localdomain podman[97302]: 2025-12-06 09:04:01.384666224 +0000 UTC m=+0.192531561 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Dec 06 09:04:01 np0005548788.localdomain podman[97294]: 2025-12-06 09:04:01.435511105 +0000 UTC m=+0.249008186 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, summary=Red 
Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 06 09:04:01 np0005548788.localdomain podman[97302]: 2025-12-06 09:04:01.444323017 +0000 UTC m=+0.252188324 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:04:01 np0005548788.localdomain podman[97294]: 2025-12-06 09:04:01.496454899 +0000 UTC m=+0.309951980 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:04:01 np0005548788.localdomain podman[97295]: 2025-12-06 09:04:01.58644628 +0000 UTC m=+0.400317113 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, vcs-type=git)
Dec 06 09:04:01 np0005548788.localdomain podman[97295]: 2025-12-06 09:04:01.620698848 +0000 UTC m=+0.434569721 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64)
Dec 06 09:04:01 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:04:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:04:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:04:04 np0005548788.localdomain systemd[1]: tmp-crun.R44kWF.mount: Deactivated successfully.
Dec 06 09:04:04 np0005548788.localdomain podman[97398]: 2025-12-06 09:04:04.266913757 +0000 UTC m=+0.096283007 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:04:04 np0005548788.localdomain podman[97399]: 2025-12-06 09:04:04.226974472 +0000 UTC m=+0.058377105 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:04:04 np0005548788.localdomain podman[97398]: 2025-12-06 09:04:04.2867512 +0000 UTC m=+0.116120420 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, version=17.1.12, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, distribution-scope=public)
Dec 06 09:04:04 np0005548788.localdomain podman[97398]: unhealthy
Dec 06 09:04:04 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:04:04 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:04:04 np0005548788.localdomain podman[97399]: 2025-12-06 09:04:04.312237538 +0000 UTC m=+0.143640081 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, release=1761123044, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:04:04 np0005548788.localdomain podman[97399]: unhealthy
Dec 06 09:04:04 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:04:04 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:04:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:04:06 np0005548788.localdomain podman[97437]: 2025-12-06 09:04:06.263375006 +0000 UTC m=+0.088702642 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Dec 06 09:04:06 np0005548788.localdomain podman[97437]: 2025-12-06 09:04:06.637632132 +0000 UTC m=+0.462959818 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:04:06 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:04:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:04:11 np0005548788.localdomain podman[97460]: 2025-12-06 09:04:11.255691629 +0000 UTC m=+0.081550661 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr)
Dec 06 09:04:11 np0005548788.localdomain podman[97460]: 2025-12-06 09:04:11.474693797 +0000 UTC m=+0.300552869 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, distribution-scope=public, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044)
Dec 06 09:04:11 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:04:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:04:13 np0005548788.localdomain podman[97489]: 2025-12-06 09:04:13.250295911 +0000 UTC m=+0.077051332 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_id=tripleo_step5)
Dec 06 09:04:13 np0005548788.localdomain podman[97489]: 2025-12-06 09:04:13.303760333 +0000 UTC m=+0.130515764 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4)
Dec 06 09:04:13 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:04:32 np0005548788.localdomain podman[97516]: 2025-12-06 09:04:32.270924617 +0000 UTC m=+0.096942447 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, distribution-scope=public, config_id=tripleo_step3, release=1761123044, name=rhosp17/openstack-collectd)
Dec 06 09:04:32 np0005548788.localdomain podman[97516]: 2025-12-06 09:04:32.308696174 +0000 UTC m=+0.134714054 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-collectd-container, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:51:28Z, container_name=collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com)
Dec 06 09:04:32 np0005548788.localdomain podman[97518]: 2025-12-06 09:04:32.325454032 +0000 UTC m=+0.144694072 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true)
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:04:32 np0005548788.localdomain podman[97518]: 2025-12-06 09:04:32.361738073 +0000 UTC m=+0.180978123 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, 
release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, tcib_managed=true)
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:04:32 np0005548788.localdomain podman[97526]: 2025-12-06 09:04:32.370458874 +0000 UTC m=+0.186129234 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:04:32 np0005548788.localdomain podman[97526]: 2025-12-06 09:04:32.425518405 +0000 UTC m=+0.241188735 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible)
Dec 06 09:04:32 np0005548788.localdomain podman[97515]: 2025-12-06 09:04:32.436398541 +0000 UTC m=+0.264886847 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:04:32 np0005548788.localdomain podman[97517]: 2025-12-06 09:04:32.437271978 +0000 UTC m=+0.258274683 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Dec 06 09:04:32 np0005548788.localdomain podman[97517]: 2025-12-06 09:04:32.460528877 +0000 UTC m=+0.281531622 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:04:32 np0005548788.localdomain podman[97515]: 2025-12-06 09:04:32.521663986 +0000 UTC m=+0.350152322 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, build-date=2025-11-18T22:49:32Z, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, 
name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public)
Dec 06 09:04:32 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:04:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:04:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:04:35 np0005548788.localdomain systemd[1]: tmp-crun.EzgUR3.mount: Deactivated successfully.
Dec 06 09:04:35 np0005548788.localdomain podman[97624]: 2025-12-06 09:04:35.259133385 +0000 UTC m=+0.084130390 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, build-date=2025-11-19T00:14:25Z, distribution-scope=public, batch=17.1_20251118.1)
Dec 06 09:04:35 np0005548788.localdomain podman[97624]: 2025-12-06 09:04:35.304644312 +0000 UTC m=+0.129641307 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., 
version=17.1.12, batch=17.1_20251118.1, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4)
Dec 06 09:04:35 np0005548788.localdomain podman[97624]: unhealthy
Dec 06 09:04:35 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:04:35 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:04:35 np0005548788.localdomain podman[97625]: 2025-12-06 09:04:35.311911467 +0000 UTC m=+0.133159987 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO 
Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:04:35 np0005548788.localdomain podman[97625]: 2025-12-06 09:04:35.392971031 +0000 UTC m=+0.214219511 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, container_name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:04:35 np0005548788.localdomain podman[97625]: unhealthy
Dec 06 09:04:35 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:04:35 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:04:36 np0005548788.localdomain sshd[97664]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:04:36 np0005548788.localdomain sudo[97665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:04:36 np0005548788.localdomain sudo[97665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:04:36 np0005548788.localdomain sudo[97665]: pam_unix(sudo:session): session closed for user root
Dec 06 09:04:36 np0005548788.localdomain sudo[97681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:04:36 np0005548788.localdomain sudo[97681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:04:36 np0005548788.localdomain sshd[97664]: Received disconnect from 148.227.3.232 port 41908:11: Bye Bye [preauth]
Dec 06 09:04:36 np0005548788.localdomain sshd[97664]: Disconnected from authenticating user root 148.227.3.232 port 41908 [preauth]
Dec 06 09:04:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:04:37 np0005548788.localdomain systemd[1]: tmp-crun.t43X66.mount: Deactivated successfully.
Dec 06 09:04:37 np0005548788.localdomain podman[97712]: 2025-12-06 09:04:37.056228563 +0000 UTC m=+0.110780654 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 09:04:37 np0005548788.localdomain sudo[97681]: pam_unix(sudo:session): session closed for user root
Dec 06 09:04:37 np0005548788.localdomain podman[97712]: 2025-12-06 09:04:37.431471189 +0000 UTC m=+0.486023270 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, container_name=nova_migration_target, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:04:37 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:04:37 np0005548788.localdomain sudo[97752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:04:37 np0005548788.localdomain sudo[97752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:04:37 np0005548788.localdomain sudo[97752]: pam_unix(sudo:session): session closed for user root
Dec 06 09:04:39 np0005548788.localdomain sshd[97768]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:04:40 np0005548788.localdomain sshd[97768]: Received disconnect from 36.50.177.119 port 59482:11: Bye Bye [preauth]
Dec 06 09:04:40 np0005548788.localdomain sshd[97768]: Disconnected from authenticating user root 36.50.177.119 port 59482 [preauth]
Dec 06 09:04:40 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:04:41 np0005548788.localdomain recover_tripleo_nova_virtqemud[97771]: 62021
Dec 06 09:04:41 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:04:41 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:04:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:04:42 np0005548788.localdomain podman[97772]: 2025-12-06 09:04:42.267236974 +0000 UTC m=+0.093600173 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, architecture=x86_64)
Dec 06 09:04:42 np0005548788.localdomain podman[97772]: 2025-12-06 09:04:42.486146329 +0000 UTC m=+0.312509548 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, 
config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public)
Dec 06 09:04:42 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:04:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:04:44 np0005548788.localdomain systemd[1]: tmp-crun.hjN49w.mount: Deactivated successfully.
Dec 06 09:04:44 np0005548788.localdomain podman[97800]: 2025-12-06 09:04:44.246913185 +0000 UTC m=+0.081040526 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=nova_compute, config_id=tripleo_step5, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:04:44 np0005548788.localdomain podman[97800]: 2025-12-06 09:04:44.302312527 +0000 UTC m=+0.136439838 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Dec 06 09:04:44 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:04:46 np0005548788.localdomain sshd[97825]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:04:47 np0005548788.localdomain sshd[97825]: Received disconnect from 179.43.189.36 port 49434:11: Bye Bye [preauth]
Dec 06 09:04:47 np0005548788.localdomain sshd[97825]: Disconnected from authenticating user root 179.43.189.36 port 49434 [preauth]
Dec 06 09:04:51 np0005548788.localdomain sshd[97827]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:04:53 np0005548788.localdomain sshd[97827]: Received disconnect from 45.119.84.54 port 48630:11: Bye Bye [preauth]
Dec 06 09:04:53 np0005548788.localdomain sshd[97827]: Disconnected from authenticating user root 45.119.84.54 port 48630 [preauth]
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:05:03 np0005548788.localdomain podman[97838]: 2025-12-06 09:05:03.333456786 +0000 UTC m=+0.142153204 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 06 09:05:03 np0005548788.localdomain podman[97831]: 2025-12-06 09:05:03.291966994 +0000 UTC m=+0.106610946 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ceilometer_agent_compute, 
build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:05:03 np0005548788.localdomain podman[97831]: 2025-12-06 09:05:03.376707413 +0000 UTC m=+0.191351365 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.)
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:05:03 np0005548788.localdomain podman[97837]: 2025-12-06 09:05:03.394123881 +0000 UTC m=+0.205254803 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=)
Dec 06 09:05:03 np0005548788.localdomain podman[97830]: 2025-12-06 09:05:03.447738348 +0000 UTC m=+0.265978831 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-collectd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 09:05:03 np0005548788.localdomain podman[97830]: 2025-12-06 09:05:03.46170673 +0000 UTC m=+0.279947213 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:05:03 np0005548788.localdomain podman[97837]: 2025-12-06 09:05:03.514290155 +0000 UTC m=+0.325421097 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team)
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:05:03 np0005548788.localdomain podman[97829]: 2025-12-06 09:05:03.466660713 +0000 UTC m=+0.293262484 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public)
Dec 06 09:05:03 np0005548788.localdomain podman[97829]: 2025-12-06 09:05:03.599707024 +0000 UTC m=+0.426308815 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:05:03 np0005548788.localdomain podman[97838]: 2025-12-06 09:05:03.619516067 +0000 UTC m=+0.428212485 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:05:03 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:05:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:05:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:05:06 np0005548788.localdomain systemd[1]: tmp-crun.W8PQD8.mount: Deactivated successfully.
Dec 06 09:05:06 np0005548788.localdomain podman[97943]: 2025-12-06 09:05:06.253293471 +0000 UTC m=+0.075388230 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, 
architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git)
Dec 06 09:05:06 np0005548788.localdomain podman[97943]: 2025-12-06 09:05:06.265935203 +0000 UTC m=+0.088029982 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent)
Dec 06 09:05:06 np0005548788.localdomain podman[97943]: unhealthy
Dec 06 09:05:06 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:05:06 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:05:06 np0005548788.localdomain podman[97944]: 2025-12-06 09:05:06.322635845 +0000 UTC m=+0.139763991 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public)
Dec 06 09:05:06 np0005548788.localdomain podman[97944]: 2025-12-06 09:05:06.344617004 +0000 UTC m=+0.161745190 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:05:06 np0005548788.localdomain podman[97944]: unhealthy
Dec 06 09:05:06 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:05:06 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:05:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:05:08 np0005548788.localdomain podman[97983]: 2025-12-06 09:05:08.274447644 +0000 UTC m=+0.100670141 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:05:08 np0005548788.localdomain podman[97983]: 2025-12-06 09:05:08.647669948 +0000 UTC m=+0.473892435 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64)
Dec 06 09:05:08 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:05:09 np0005548788.localdomain sshd[98007]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:12 np0005548788.localdomain sshd[98007]: Connection closed by 101.47.142.76 port 57020 [preauth]
Dec 06 09:05:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:05:13 np0005548788.localdomain podman[98009]: 2025-12-06 09:05:13.003069118 +0000 UTC m=+0.090248140 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 09:05:13 np0005548788.localdomain podman[98009]: 2025-12-06 09:05:13.194292707 +0000 UTC m=+0.281471769 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:05:13 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:05:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:05:15 np0005548788.localdomain podman[98039]: 2025-12-06 09:05:15.247359606 +0000 UTC m=+0.073323637 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4)
Dec 06 09:05:15 np0005548788.localdomain podman[98039]: 2025-12-06 09:05:15.274801384 +0000 UTC m=+0.100765395 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:05:15 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: tmp-crun.eTDlWt.mount: Deactivated successfully.
Dec 06 09:05:34 np0005548788.localdomain podman[98066]: 2025-12-06 09:05:34.279368114 +0000 UTC m=+0.096094851 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, 
config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:05:34 np0005548788.localdomain podman[98066]: 2025-12-06 09:05:34.292461527 +0000 UTC m=+0.109188284 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:05:34 np0005548788.localdomain podman[98067]: 2025-12-06 09:05:34.339242223 +0000 UTC m=+0.152155652 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team)
Dec 06 09:05:34 np0005548788.localdomain podman[98068]: 2025-12-06 09:05:34.371276444 +0000 UTC m=+0.181131659 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Dec 06 09:05:34 np0005548788.localdomain podman[98067]: 2025-12-06 09:05:34.378870818 +0000 UTC m=+0.191784267 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, architecture=x86_64, name=rhosp17/openstack-collectd)
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:05:34 np0005548788.localdomain podman[98068]: 2025-12-06 09:05:34.397747732 +0000 UTC m=+0.207602947 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:05:34 np0005548788.localdomain podman[98076]: 2025-12-06 09:05:34.440978748 +0000 UTC m=+0.244709324 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com)
Dec 06 09:05:34 np0005548788.localdomain podman[98076]: 2025-12-06 09:05:34.467529208 +0000 UTC m=+0.271259774 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:12:45Z, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:05:34 np0005548788.localdomain podman[98069]: 2025-12-06 09:05:34.478553409 +0000 UTC m=+0.286302969 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=iscsid, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 
iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=)
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:05:34 np0005548788.localdomain podman[98069]: 2025-12-06 09:05:34.492394377 +0000 UTC m=+0.300143947 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, build-date=2025-11-18T23:44:13Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12)
Dec 06 09:05:34 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:05:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:05:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:05:37 np0005548788.localdomain systemd[1]: tmp-crun.NvvKTM.mount: Deactivated successfully.
Dec 06 09:05:37 np0005548788.localdomain podman[98174]: 2025-12-06 09:05:37.252848346 +0000 UTC m=+0.078111496 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_id=tripleo_step4)
Dec 06 09:05:37 np0005548788.localdomain podman[98174]: 2025-12-06 09:05:37.294608336 +0000 UTC m=+0.119871546 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 06 09:05:37 np0005548788.localdomain podman[98174]: unhealthy
Dec 06 09:05:37 np0005548788.localdomain podman[98175]: 2025-12-06 09:05:37.305553135 +0000 UTC m=+0.127413999 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true)
Dec 06 09:05:37 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:05:37 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:05:37 np0005548788.localdomain podman[98175]: 2025-12-06 09:05:37.321611751 +0000 UTC m=+0.143472605 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:05:37 np0005548788.localdomain podman[98175]: unhealthy
Dec 06 09:05:37 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:05:37 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:05:38 np0005548788.localdomain sudo[98214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:05:38 np0005548788.localdomain sudo[98214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:05:38 np0005548788.localdomain sudo[98214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:05:38 np0005548788.localdomain sudo[98229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:05:38 np0005548788.localdomain sudo[98229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:05:38 np0005548788.localdomain sshd[98264]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:38 np0005548788.localdomain sudo[98229]: pam_unix(sudo:session): session closed for user root
Dec 06 09:05:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:05:39 np0005548788.localdomain podman[98277]: 2025-12-06 09:05:39.234954821 +0000 UTC m=+0.066286989 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Dec 06 09:05:39 np0005548788.localdomain sudo[98299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:05:39 np0005548788.localdomain sudo[98299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:05:39 np0005548788.localdomain sudo[98299]: pam_unix(sudo:session): session closed for user root
Dec 06 09:05:39 np0005548788.localdomain podman[98277]: 2025-12-06 09:05:39.604734379 +0000 UTC m=+0.436066527 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:05:39 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:05:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:05:44 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:05:44 np0005548788.localdomain recover_tripleo_nova_virtqemud[98323]: 62021
Dec 06 09:05:44 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:05:44 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:05:44 np0005548788.localdomain podman[98316]: 2025-12-06 09:05:44.228045098 +0000 UTC m=+0.061280755 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, release=1761123044, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4)
Dec 06 09:05:44 np0005548788.localdomain podman[98316]: 2025-12-06 09:05:44.408152954 +0000 UTC m=+0.241388601 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 09:05:44 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:05:44 np0005548788.localdomain sshd[98264]: Connection closed by 45.78.219.195 port 42896 [preauth]
Dec 06 09:05:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:05:46 np0005548788.localdomain podman[98348]: 2025-12-06 09:05:46.25790869 +0000 UTC m=+0.081718816 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute)
Dec 06 09:05:46 np0005548788.localdomain podman[98348]: 2025-12-06 09:05:46.305654816 +0000 UTC m=+0.129464872 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Dec 06 09:05:46 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:05:51 np0005548788.localdomain sshd[98375]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:52 np0005548788.localdomain sshd[98375]: Received disconnect from 179.43.189.36 port 36790:11: Bye Bye [preauth]
Dec 06 09:05:52 np0005548788.localdomain sshd[98375]: Disconnected from authenticating user root 179.43.189.36 port 36790 [preauth]
Dec 06 09:05:59 np0005548788.localdomain sshd[98377]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:01 np0005548788.localdomain sshd[98377]: Received disconnect from 36.50.177.119 port 58012:11: Bye Bye [preauth]
Dec 06 09:06:01 np0005548788.localdomain sshd[98377]: Disconnected from authenticating user root 36.50.177.119 port 58012 [preauth]
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:06:05 np0005548788.localdomain podman[98379]: 2025-12-06 09:06:05.275453502 +0000 UTC m=+0.097950618 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Dec 06 09:06:05 np0005548788.localdomain podman[98379]: 2025-12-06 09:06:05.312540129 +0000 UTC m=+0.135037205 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container)
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:06:05 np0005548788.localdomain podman[98380]: 2025-12-06 09:06:05.314047125 +0000 UTC m=+0.134222929 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:06:05 np0005548788.localdomain podman[98381]: 2025-12-06 09:06:05.384464791 +0000 UTC m=+0.198629760 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, 
name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 09:06:05 np0005548788.localdomain podman[98380]: 2025-12-06 09:06:05.397849465 +0000 UTC m=+0.218025259 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, container_name=collectd, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:06:05 np0005548788.localdomain podman[98381]: 2025-12-06 09:06:05.418876155 +0000 UTC m=+0.233041084 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4)
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:06:05 np0005548788.localdomain podman[98385]: 2025-12-06 09:06:05.487299719 +0000 UTC m=+0.294777460 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1)
Dec 06 09:06:05 np0005548788.localdomain podman[98385]: 2025-12-06 09:06:05.52356616 +0000 UTC m=+0.331043841 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, url=https://www.redhat.com)
Dec 06 09:06:05 np0005548788.localdomain podman[98393]: 2025-12-06 09:06:05.536003934 +0000 UTC m=+0.343813035 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:06:05 np0005548788.localdomain podman[98393]: 2025-12-06 09:06:05.569754597 +0000 UTC m=+0.377563658 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Dec 06 09:06:05 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:06:06 np0005548788.localdomain systemd[1]: tmp-crun.gLm1WM.mount: Deactivated successfully.
Dec 06 09:06:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:06:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:06:08 np0005548788.localdomain podman[98486]: 2025-12-06 09:06:08.258305515 +0000 UTC m=+0.084284666 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git)
Dec 06 09:06:08 np0005548788.localdomain podman[98486]: 2025-12-06 09:06:08.2746398 +0000 UTC m=+0.100618961 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, architecture=x86_64)
Dec 06 09:06:08 np0005548788.localdomain podman[98486]: unhealthy
Dec 06 09:06:08 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:06:08 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:06:08 np0005548788.localdomain systemd[1]: tmp-crun.SOZTKw.mount: Deactivated successfully.
Dec 06 09:06:08 np0005548788.localdomain podman[98485]: 2025-12-06 09:06:08.373647619 +0000 UTC m=+0.201080505 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Dec 06 09:06:08 np0005548788.localdomain podman[98485]: 2025-12-06 09:06:08.416848825 +0000 UTC m=+0.244281751 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, distribution-scope=public, tcib_managed=true, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z)
Dec 06 09:06:08 np0005548788.localdomain podman[98485]: unhealthy
Dec 06 09:06:08 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:06:08 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:06:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:06:10 np0005548788.localdomain podman[98522]: 2025-12-06 09:06:10.258276552 +0000 UTC m=+0.085972218 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, architecture=x86_64, url=https://www.redhat.com)
Dec 06 09:06:10 np0005548788.localdomain podman[98522]: 2025-12-06 09:06:10.62870864 +0000 UTC m=+0.456404266 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 09:06:10 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:06:12 np0005548788.localdomain sshd[98546]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:14 np0005548788.localdomain sshd[98546]: Received disconnect from 45.119.84.54 port 53932:11: Bye Bye [preauth]
Dec 06 09:06:14 np0005548788.localdomain sshd[98546]: Disconnected from authenticating user root 45.119.84.54 port 53932 [preauth]
Dec 06 09:06:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:06:15 np0005548788.localdomain podman[98548]: 2025-12-06 09:06:15.262507123 +0000 UTC m=+0.088809325 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step1, vendor=Red Hat, Inc.)
Dec 06 09:06:15 np0005548788.localdomain podman[98548]: 2025-12-06 09:06:15.460626656 +0000 UTC m=+0.286928798 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 09:06:15 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:06:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:06:17 np0005548788.localdomain podman[98577]: 2025-12-06 09:06:17.253537904 +0000 UTC m=+0.080927652 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:06:17 np0005548788.localdomain podman[98577]: 2025-12-06 09:06:17.30973206 +0000 UTC m=+0.137121768 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=nova_compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, tcib_managed=true)
Dec 06 09:06:17 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:06:20 np0005548788.localdomain sshd[98603]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:21 np0005548788.localdomain sshd[98604]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:21 np0005548788.localdomain sshd[98604]: error: kex_exchange_identification: read: Connection reset by peer
Dec 06 09:06:21 np0005548788.localdomain sshd[98604]: Connection reset by 45.140.17.97 port 4246
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: tmp-crun.rUrxoM.mount: Deactivated successfully.
Dec 06 09:06:36 np0005548788.localdomain podman[98608]: 2025-12-06 09:06:36.260349935 +0000 UTC m=+0.084247205 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, 
com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Dec 06 09:06:36 np0005548788.localdomain podman[98607]: 2025-12-06 09:06:36.311417422 +0000 UTC m=+0.135053284 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, io.openshift.expose-services=)
Dec 06 09:06:36 np0005548788.localdomain podman[98608]: 2025-12-06 09:06:36.323349841 +0000 UTC m=+0.147247101 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1)
Dec 06 09:06:36 np0005548788.localdomain podman[98606]: 2025-12-06 09:06:36.361570173 +0000 UTC m=+0.191439787 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step3)
Dec 06 09:06:36 np0005548788.localdomain podman[98607]: 2025-12-06 09:06:36.368890569 +0000 UTC m=+0.192526431 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, 
managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:06:36 np0005548788.localdomain podman[98606]: 2025-12-06 09:06:36.424531488 +0000 UTC m=+0.254401082 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3)
Dec 06 09:06:36 np0005548788.localdomain podman[98609]: 2025-12-06 09:06:36.459408306 +0000 UTC m=+0.283174882 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:06:36 np0005548788.localdomain podman[98609]: 2025-12-06 09:06:36.518668947 +0000 UTC m=+0.342435573 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:06:36 np0005548788.localdomain podman[98605]: 2025-12-06 09:06:36.556374333 +0000 UTC m=+0.386733713 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack 
Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:06:36 np0005548788.localdomain podman[98605]: 2025-12-06 09:06:36.591697714 +0000 UTC m=+0.422057114 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, 
io.buildah.version=1.41.4, container_name=logrotate_crond, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:06:36 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:06:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:06:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:06:39 np0005548788.localdomain podman[98708]: 2025-12-06 09:06:39.277434465 +0000 UTC m=+0.098662810 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:06:39 np0005548788.localdomain podman[98709]: 2025-12-06 09:06:39.322670243 +0000 UTC m=+0.140679998 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Dec 06 09:06:39 np0005548788.localdomain podman[98709]: 2025-12-06 09:06:39.36271678 +0000 UTC m=+0.180726585 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git)
Dec 06 09:06:39 np0005548788.localdomain podman[98709]: unhealthy
Dec 06 09:06:39 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:06:39 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:06:39 np0005548788.localdomain podman[98708]: 2025-12-06 09:06:39.374263017 +0000 UTC m=+0.195491312 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Dec 06 09:06:39 np0005548788.localdomain podman[98708]: unhealthy
Dec 06 09:06:39 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:06:39 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:06:39 np0005548788.localdomain sudo[98749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:06:39 np0005548788.localdomain sudo[98749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:06:39 np0005548788.localdomain sudo[98749]: pam_unix(sudo:session): session closed for user root
Dec 06 09:06:39 np0005548788.localdomain sudo[98764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:06:39 np0005548788.localdomain sudo[98764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:06:40 np0005548788.localdomain sudo[98764]: pam_unix(sudo:session): session closed for user root
Dec 06 09:06:40 np0005548788.localdomain sudo[98810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:06:40 np0005548788.localdomain sudo[98810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:06:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:06:40 np0005548788.localdomain sudo[98810]: pam_unix(sudo:session): session closed for user root
Dec 06 09:06:41 np0005548788.localdomain systemd[1]: tmp-crun.glRBYh.mount: Deactivated successfully.
Dec 06 09:06:41 np0005548788.localdomain podman[98825]: 2025-12-06 09:06:41.057186777 +0000 UTC m=+0.091133917 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=nova_migration_target, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, 
description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 06 09:06:41 np0005548788.localdomain podman[98825]: 2025-12-06 09:06:41.427021686 +0000 UTC m=+0.460968826 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4)
Dec 06 09:06:41 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:06:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:06:46 np0005548788.localdomain systemd[1]: tmp-crun.3MhcIs.mount: Deactivated successfully.
Dec 06 09:06:46 np0005548788.localdomain podman[98848]: 2025-12-06 09:06:46.271656626 +0000 UTC m=+0.093509212 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Dec 06 09:06:46 np0005548788.localdomain podman[98848]: 2025-12-06 09:06:46.494632176 +0000 UTC m=+0.316484512 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:06:46 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:06:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:06:48 np0005548788.localdomain systemd[1]: tmp-crun.YvHhsG.mount: Deactivated successfully.
Dec 06 09:06:48 np0005548788.localdomain podman[98877]: 2025-12-06 09:06:48.2534626 +0000 UTC m=+0.083527532 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4)
Dec 06 09:06:48 np0005548788.localdomain podman[98877]: 2025-12-06 09:06:48.286647256 +0000 UTC m=+0.116712198 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 06 09:06:48 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:06:58 np0005548788.localdomain sshd[98905]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:59 np0005548788.localdomain sshd[98905]: Received disconnect from 179.43.189.36 port 36216:11: Bye Bye [preauth]
Dec 06 09:06:59 np0005548788.localdomain sshd[98905]: Disconnected from authenticating user root 179.43.189.36 port 36216 [preauth]
Dec 06 09:06:59 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:06:59 np0005548788.localdomain recover_tripleo_nova_virtqemud[98908]: 62021
Dec 06 09:06:59 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:06:59 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:07:00 np0005548788.localdomain sshd[98909]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:01 np0005548788.localdomain sshd[98909]: Received disconnect from 148.227.3.232 port 46128:11: Bye Bye [preauth]
Dec 06 09:07:01 np0005548788.localdomain sshd[98909]: Disconnected from authenticating user root 148.227.3.232 port 46128 [preauth]
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:07:07 np0005548788.localdomain podman[98913]: 2025-12-06 09:07:07.2761969 +0000 UTC m=+0.094828500 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 06 09:07:07 np0005548788.localdomain podman[98913]: 2025-12-06 09:07:07.28914882 +0000 UTC m=+0.107780360 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:07:07 np0005548788.localdomain podman[98912]: 2025-12-06 09:07:07.343537351 +0000 UTC m=+0.158822528 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, 
build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 09:07:07 np0005548788.localdomain podman[98912]: 2025-12-06 09:07:07.377049746 +0000 UTC m=+0.192334883 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:07:07 np0005548788.localdomain podman[98915]: 2025-12-06 09:07:07.392541106 +0000 UTC m=+0.204385137 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-iscsid-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:07:07 np0005548788.localdomain podman[98915]: 2025-12-06 09:07:07.433868143 +0000 UTC m=+0.245712224 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:07:07 np0005548788.localdomain podman[98919]: 2025-12-06 09:07:07.44998277 +0000 UTC m=+0.257958552 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, distribution-scope=public)
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:07:07 np0005548788.localdomain podman[98919]: 2025-12-06 09:07:07.48555795 +0000 UTC m=+0.293533772 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:07:07 np0005548788.localdomain podman[98914]: 2025-12-06 09:07:07.494656121 +0000 UTC m=+0.309474974 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:07:07 np0005548788.localdomain podman[98914]: 2025-12-06 09:07:07.529774257 +0000 UTC m=+0.344593130 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=)
Dec 06 09:07:07 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:07:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:07:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:07:10 np0005548788.localdomain podman[99019]: 2025-12-06 09:07:10.266187473 +0000 UTC m=+0.087592918 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, 
batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-type=git, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:07:10 np0005548788.localdomain podman[99020]: 2025-12-06 09:07:10.328171769 +0000 UTC m=+0.146489178 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:07:10 np0005548788.localdomain podman[99019]: 2025-12-06 09:07:10.339050195 +0000 UTC m=+0.160455640 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 09:07:10 np0005548788.localdomain podman[99019]: unhealthy
Dec 06 09:07:10 np0005548788.localdomain podman[99020]: 2025-12-06 09:07:10.419655847 +0000 UTC m=+0.237973276 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:07:10 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:07:10 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:07:10 np0005548788.localdomain podman[99020]: unhealthy
Dec 06 09:07:10 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:07:10 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:07:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:07:12 np0005548788.localdomain podman[99061]: 2025-12-06 09:07:12.252726905 +0000 UTC m=+0.081888121 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, description=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1)
Dec 06 09:07:12 np0005548788.localdomain podman[99061]: 2025-12-06 09:07:12.680280438 +0000 UTC m=+0.509441644 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
version=17.1.12, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, vcs-type=git)
Dec 06 09:07:12 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:07:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:07:17 np0005548788.localdomain systemd[1]: tmp-crun.ArFz97.mount: Deactivated successfully.
Dec 06 09:07:17 np0005548788.localdomain podman[99084]: 2025-12-06 09:07:17.27326591 +0000 UTC m=+0.099795726 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd)
Dec 06 09:07:17 np0005548788.localdomain podman[99084]: 2025-12-06 09:07:17.47259727 +0000 UTC m=+0.299127056 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step1, vcs-type=git, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:07:17 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:07:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:07:19 np0005548788.localdomain podman[99115]: 2025-12-06 09:07:19.261892546 +0000 UTC m=+0.077915149 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute)
Dec 06 09:07:19 np0005548788.localdomain podman[99115]: 2025-12-06 09:07:19.29566377 +0000 UTC m=+0.111686333 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, container_name=nova_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12)
Dec 06 09:07:19 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:07:24 np0005548788.localdomain sshd[99141]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:26 np0005548788.localdomain sshd[99141]: Received disconnect from 36.50.177.119 port 40812:11: Bye Bye [preauth]
Dec 06 09:07:26 np0005548788.localdomain sshd[99141]: Disconnected from authenticating user root 36.50.177.119 port 40812 [preauth]
Dec 06 09:07:35 np0005548788.localdomain sshd[99143]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:37 np0005548788.localdomain sshd[99143]: Received disconnect from 45.119.84.54 port 37438:11: Bye Bye [preauth]
Dec 06 09:07:37 np0005548788.localdomain sshd[99143]: Disconnected from authenticating user root 45.119.84.54 port 37438 [preauth]
Dec 06 09:07:37 np0005548788.localdomain sshd[36176]: Received disconnect from 192.168.122.100 port 37056:11: disconnected by user
Dec 06 09:07:37 np0005548788.localdomain sshd[36176]: Disconnected from user tripleo-admin 192.168.122.100 port 37056
Dec 06 09:07:37 np0005548788.localdomain sshd[36155]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: session-28.scope: Consumed 7min 4.154s CPU time.
Dec 06 09:07:37 np0005548788.localdomain systemd-logind[765]: Session 28 logged out. Waiting for processes to exit.
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:07:37 np0005548788.localdomain systemd-logind[765]: Removed session 28.
Dec 06 09:07:37 np0005548788.localdomain podman[99146]: 2025-12-06 09:07:37.710399574 +0000 UTC m=+0.100172046 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 06 09:07:37 np0005548788.localdomain podman[99146]: 2025-12-06 09:07:37.726776631 +0000 UTC m=+0.116549093 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-type=git, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:07:37 np0005548788.localdomain podman[99147]: 2025-12-06 09:07:37.741836787 +0000 UTC m=+0.128817172 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:11:48Z, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:07:37 np0005548788.localdomain podman[99147]: 2025-12-06 09:07:37.801538311 +0000 UTC m=+0.188518696 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public)
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:07:37 np0005548788.localdomain podman[99154]: 2025-12-06 09:07:37.817851536 +0000 UTC m=+0.198899518 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 06 09:07:37 np0005548788.localdomain podman[99148]: 2025-12-06 09:07:37.860618977 +0000 UTC m=+0.242349241 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, version=17.1.12, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:07:37 np0005548788.localdomain podman[99154]: 2025-12-06 09:07:37.878696675 +0000 UTC m=+0.259744617 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, tcib_managed=true)
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:07:37 np0005548788.localdomain podman[99148]: 2025-12-06 09:07:37.90050753 +0000 UTC m=+0.282237764 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:07:37 np0005548788.localdomain podman[99145]: 2025-12-06 09:07:37.970486152 +0000 UTC m=+0.362735650 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z)
Dec 06 09:07:37 np0005548788.localdomain podman[99145]: 2025-12-06 09:07:37.982499563 +0000 UTC m=+0.374749081 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com)
Dec 06 09:07:37 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:07:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:07:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:07:41 np0005548788.localdomain sudo[99258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:07:41 np0005548788.localdomain sudo[99258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:07:41 np0005548788.localdomain sudo[99258]: pam_unix(sudo:session): session closed for user root
Dec 06 09:07:41 np0005548788.localdomain systemd[1]: tmp-crun.H8467d.mount: Deactivated successfully.
Dec 06 09:07:41 np0005548788.localdomain podman[99271]: 2025-12-06 09:07:41.266114922 +0000 UTC m=+0.089489707 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vcs-type=git, container_name=ovn_metadata_agent, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:07:41 np0005548788.localdomain sudo[99298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:07:41 np0005548788.localdomain podman[99271]: 2025-12-06 09:07:41.309151021 +0000 UTC m=+0.132525796 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:07:41 np0005548788.localdomain sudo[99298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:07:41 np0005548788.localdomain podman[99271]: unhealthy
Dec 06 09:07:41 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:07:41 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:07:41 np0005548788.localdomain podman[99273]: 2025-12-06 09:07:41.322149293 +0000 UTC m=+0.141923257 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:07:41 np0005548788.localdomain podman[99273]: 2025-12-06 09:07:41.403023312 +0000 UTC m=+0.222797266 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4)
Dec 06 09:07:41 np0005548788.localdomain podman[99273]: unhealthy
Dec 06 09:07:41 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:07:41 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:07:42 np0005548788.localdomain sudo[99298]: pam_unix(sudo:session): session closed for user root
Dec 06 09:07:42 np0005548788.localdomain sudo[99363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:07:42 np0005548788.localdomain sudo[99363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:07:42 np0005548788.localdomain sudo[99363]: pam_unix(sudo:session): session closed for user root
Dec 06 09:07:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:07:43 np0005548788.localdomain podman[99378]: 2025-12-06 09:07:43.277230363 +0000 UTC m=+0.100554579 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:07:43 np0005548788.localdomain podman[99378]: 2025-12-06 09:07:43.676741369 +0000 UTC m=+0.500065545 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=nova_migration_target, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044)
Dec 06 09:07:43 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Activating special unit Exit the Session...
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Removed slice User Background Tasks Slice.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Stopped target Main User Target.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Stopped target Basic System.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Stopped target Paths.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Stopped target Sockets.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Stopped target Timers.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Closed D-Bus User Message Bus Socket.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Stopped Create User's Volatile Files and Directories.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Removed slice User Application Slice.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Reached target Shutdown.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Finished Exit the Session.
Dec 06 09:07:47 np0005548788.localdomain systemd[36159]: Reached target Exit the Session.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: user@1003.service: Consumed 4.508s CPU time, read 0B from disk, written 7.0K to disk.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 06 09:07:47 np0005548788.localdomain podman[99401]: 2025-12-06 09:07:47.85098043 +0000 UTC m=+0.171911603 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=metrics_qdr, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-type=git, distribution-scope=public, release=1761123044, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 06 09:07:47 np0005548788.localdomain systemd[1]: user-1003.slice: Consumed 7min 8.692s CPU time.
Dec 06 09:07:48 np0005548788.localdomain podman[99401]: 2025-12-06 09:07:48.046691288 +0000 UTC m=+0.367622441 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4)
Dec 06 09:07:48 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:07:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:07:50 np0005548788.localdomain podman[99431]: 2025-12-06 09:07:50.262406183 +0000 UTC m=+0.087224736 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:07:50 np0005548788.localdomain podman[99431]: 2025-12-06 09:07:50.29561593 +0000 UTC m=+0.120434443 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:07:50 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:08:07 np0005548788.localdomain sshd[99457]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:08:08 np0005548788.localdomain podman[99461]: 2025-12-06 09:08:08.263014098 +0000 UTC m=+0.078676443 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true)
Dec 06 09:08:08 np0005548788.localdomain podman[99478]: 2025-12-06 09:08:08.318479132 +0000 UTC m=+0.124353544 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:08:08 np0005548788.localdomain podman[99459]: 2025-12-06 09:08:08.371694386 +0000 UTC m=+0.193833111 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=)
Dec 06 09:08:08 np0005548788.localdomain podman[99459]: 2025-12-06 09:08:08.379996723 +0000 UTC m=+0.202135558 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, container_name=logrotate_crond, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-cron-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:08:08 np0005548788.localdomain podman[99461]: 2025-12-06 09:08:08.392900232 +0000 UTC m=+0.208562667 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute)
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:08:08 np0005548788.localdomain podman[99478]: 2025-12-06 09:08:08.424367024 +0000 UTC m=+0.230241476 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, container_name=ceilometer_agent_ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:08:08 np0005548788.localdomain podman[99467]: 2025-12-06 09:08:08.291674964 +0000 UTC m=+0.099573559 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=)
Dec 06 09:08:08 np0005548788.localdomain podman[99460]: 2025-12-06 09:08:08.480022134 +0000 UTC m=+0.296836214 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 09:08:08 np0005548788.localdomain podman[99460]: 2025-12-06 09:08:08.515372527 +0000 UTC m=+0.332186667 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z)
Dec 06 09:08:08 np0005548788.localdomain sshd[99457]: Received disconnect from 179.43.189.36 port 46228:11: Bye Bye [preauth]
Dec 06 09:08:08 np0005548788.localdomain sshd[99457]: Disconnected from authenticating user root 179.43.189.36 port 46228 [preauth]
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:08:08 np0005548788.localdomain podman[99467]: 2025-12-06 09:08:08.572261545 +0000 UTC m=+0.380160180 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-iscsid-container, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.4)
Dec 06 09:08:08 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:08:09 np0005548788.localdomain systemd[1]: tmp-crun.boXMfU.mount: Deactivated successfully.
Dec 06 09:08:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:08:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:08:12 np0005548788.localdomain systemd[1]: tmp-crun.HftKyB.mount: Deactivated successfully.
Dec 06 09:08:12 np0005548788.localdomain podman[99574]: 2025-12-06 09:08:12.273925871 +0000 UTC m=+0.101974762 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 09:08:12 np0005548788.localdomain podman[99575]: 2025-12-06 09:08:12.321903524 +0000 UTC m=+0.145231928 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:08:12 np0005548788.localdomain podman[99575]: 2025-12-06 09:08:12.338381303 +0000 UTC m=+0.161709667 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:08:12 np0005548788.localdomain podman[99575]: unhealthy
Dec 06 09:08:12 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:08:12 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:08:12 np0005548788.localdomain podman[99574]: 2025-12-06 09:08:12.394176238 +0000 UTC m=+0.222225109 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 06 09:08:12 np0005548788.localdomain podman[99574]: unhealthy
Dec 06 09:08:12 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:08:12 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:08:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:08:14 np0005548788.localdomain podman[99612]: 2025-12-06 09:08:14.254642274 +0000 UTC m=+0.084385999 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, container_name=nova_migration_target, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:08:14 np0005548788.localdomain podman[99612]: 2025-12-06 09:08:14.635324768 +0000 UTC m=+0.465068513 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:08:14 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:08:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:08:18 np0005548788.localdomain podman[99637]: 2025-12-06 09:08:18.262791342 +0000 UTC m=+0.089230089 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-qdrouterd, 
distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step1)
Dec 06 09:08:18 np0005548788.localdomain podman[99637]: 2025-12-06 09:08:18.471713009 +0000 UTC m=+0.298151756 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, release=1761123044)
Dec 06 09:08:18 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:08:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:08:21 np0005548788.localdomain podman[99666]: 2025-12-06 09:08:21.262780494 +0000 UTC m=+0.088005770 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step5, release=1761123044, tcib_managed=true, url=https://www.redhat.com)
Dec 06 09:08:21 np0005548788.localdomain podman[99666]: 2025-12-06 09:08:21.32477825 +0000 UTC m=+0.150003576 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git)
Dec 06 09:08:21 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:08:39 np0005548788.localdomain recover_tripleo_nova_virtqemud[99724]: 62021
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: tmp-crun.fsTjIC.mount: Deactivated successfully.
Dec 06 09:08:39 np0005548788.localdomain podman[99693]: 2025-12-06 09:08:39.282758337 +0000 UTC m=+0.103444848 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, tcib_managed=true, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:08:39 np0005548788.localdomain podman[99693]: 2025-12-06 09:08:39.294948113 +0000 UTC m=+0.115634674 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, tcib_managed=true)
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:08:39 np0005548788.localdomain podman[99695]: 2025-12-06 09:08:39.374829761 +0000 UTC m=+0.189282610 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=)
Dec 06 09:08:39 np0005548788.localdomain podman[99694]: 2025-12-06 09:08:39.248827057 +0000 UTC m=+0.070041975 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 09:08:39 np0005548788.localdomain podman[99692]: 2025-12-06 09:08:39.440398008 +0000 UTC m=+0.263645308 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=logrotate_crond, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, version=17.1.12, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:08:39 np0005548788.localdomain podman[99692]: 2025-12-06 09:08:39.453525403 +0000 UTC m=+0.276772723 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:08:39 np0005548788.localdomain podman[99695]: 2025-12-06 09:08:39.464259985 +0000 UTC m=+0.278712884 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:08:39 np0005548788.localdomain podman[99694]: 2025-12-06 09:08:39.533923368 +0000 UTC m=+0.355138336 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:08:39 np0005548788.localdomain podman[99701]: 2025-12-06 09:08:39.59545387 +0000 UTC m=+0.407417112 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 09:08:39 np0005548788.localdomain podman[99701]: 2025-12-06 09:08:39.652695229 +0000 UTC m=+0.464658441 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:08:39 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:08:42 np0005548788.localdomain sudo[99804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:08:42 np0005548788.localdomain sudo[99804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:08:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:08:42 np0005548788.localdomain sudo[99804]: pam_unix(sudo:session): session closed for user root
Dec 06 09:08:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:08:42 np0005548788.localdomain sudo[99831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:08:42 np0005548788.localdomain sudo[99831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:08:42 np0005548788.localdomain podman[99820]: 2025-12-06 09:08:42.872912958 +0000 UTC m=+0.081936743 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:08:42 np0005548788.localdomain podman[99820]: 2025-12-06 09:08:42.91568393 +0000 UTC m=+0.124707785 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team)
Dec 06 09:08:42 np0005548788.localdomain podman[99820]: unhealthy
Dec 06 09:08:42 np0005548788.localdomain systemd[1]: tmp-crun.nuyX8b.mount: Deactivated successfully.
Dec 06 09:08:42 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:08:42 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:08:42 np0005548788.localdomain podman[99819]: 2025-12-06 09:08:42.935683438 +0000 UTC m=+0.146422296 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 06 09:08:42 np0005548788.localdomain podman[99819]: 2025-12-06 09:08:42.97683474 +0000 UTC m=+0.187573578 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:08:42 np0005548788.localdomain podman[99819]: unhealthy
Dec 06 09:08:42 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:08:42 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:08:43 np0005548788.localdomain sudo[99831]: pam_unix(sudo:session): session closed for user root
Dec 06 09:08:44 np0005548788.localdomain sudo[99904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:08:44 np0005548788.localdomain sudo[99904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:08:44 np0005548788.localdomain sudo[99904]: pam_unix(sudo:session): session closed for user root
Dec 06 09:08:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:08:45 np0005548788.localdomain podman[99919]: 2025-12-06 09:08:45.270555876 +0000 UTC m=+0.087228557 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:08:45 np0005548788.localdomain podman[99919]: 2025-12-06 09:08:45.636760643 +0000 UTC m=+0.453433284 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:08:45 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:08:48 np0005548788.localdomain sshd[99942]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:08:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:08:49 np0005548788.localdomain podman[99944]: 2025-12-06 09:08:49.263249006 +0000 UTC m=+0.086935148 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Dec 06 09:08:49 np0005548788.localdomain podman[99944]: 2025-12-06 09:08:49.487016631 +0000 UTC m=+0.310702713 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, name=rhosp17/openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr)
Dec 06 09:08:49 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:08:50 np0005548788.localdomain sshd[99942]: Received disconnect from 36.50.177.119 port 40976:11: Bye Bye [preauth]
Dec 06 09:08:50 np0005548788.localdomain sshd[99942]: Disconnected from authenticating user root 36.50.177.119 port 40976 [preauth]
Dec 06 09:08:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:08:52 np0005548788.localdomain systemd[1]: tmp-crun.ht24XZ.mount: Deactivated successfully.
Dec 06 09:08:52 np0005548788.localdomain podman[99973]: 2025-12-06 09:08:52.26945872 +0000 UTC m=+0.089808956 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, container_name=nova_compute)
Dec 06 09:08:52 np0005548788.localdomain podman[99973]: 2025-12-06 09:08:52.298309542 +0000 UTC m=+0.118659758 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 06 09:08:52 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:08:58 np0005548788.localdomain sshd[100000]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:00 np0005548788.localdomain sshd[100000]: Received disconnect from 45.119.84.54 port 51350:11: Bye Bye [preauth]
Dec 06 09:09:00 np0005548788.localdomain sshd[100000]: Disconnected from authenticating user root 45.119.84.54 port 51350 [preauth]
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:09:10 np0005548788.localdomain podman[100002]: 2025-12-06 09:09:10.274581633 +0000 UTC m=+0.101402145 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=logrotate_crond, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-cron, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:09:10 np0005548788.localdomain podman[100002]: 2025-12-06 09:09:10.283466247 +0000 UTC m=+0.110286769 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=logrotate_crond, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true)
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:09:10 np0005548788.localdomain podman[100004]: 2025-12-06 09:09:10.378745352 +0000 UTC m=+0.199184987 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:11:48Z, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Dec 06 09:09:10 np0005548788.localdomain podman[100003]: 2025-12-06 09:09:10.390409962 +0000 UTC m=+0.211911040 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:09:10 np0005548788.localdomain podman[100003]: 2025-12-06 09:09:10.42365553 +0000 UTC m=+0.245156558 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:09:10 np0005548788.localdomain podman[100004]: 2025-12-06 09:09:10.43562926 +0000 UTC m=+0.256068915 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible)
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:09:10 np0005548788.localdomain podman[100010]: 2025-12-06 09:09:10.431156031 +0000 UTC m=+0.243446664 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:12:45Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ceilometer_agent_ipmi)
Dec 06 09:09:10 np0005548788.localdomain podman[100005]: 2025-12-06 09:09:10.34117798 +0000 UTC m=+0.156051333 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, name=rhosp17/openstack-iscsid, architecture=x86_64, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Dec 06 09:09:10 np0005548788.localdomain podman[100010]: 2025-12-06 09:09:10.514835107 +0000 UTC m=+0.327125750 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:09:10 np0005548788.localdomain podman[100005]: 2025-12-06 09:09:10.525636481 +0000 UTC m=+0.340509874 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:09:10 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:09:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:09:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:09:13 np0005548788.localdomain podman[100112]: 2025-12-06 09:09:13.254727271 +0000 UTC m=+0.084058199 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent)
Dec 06 09:09:13 np0005548788.localdomain podman[100112]: 2025-12-06 09:09:13.269448666 +0000 UTC m=+0.098779584 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 09:09:13 np0005548788.localdomain podman[100112]: unhealthy
Dec 06 09:09:13 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:09:13 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:09:13 np0005548788.localdomain podman[100113]: 2025-12-06 09:09:13.36440001 +0000 UTC m=+0.187804745 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, config_id=tripleo_step4, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:09:13 np0005548788.localdomain podman[100113]: 2025-12-06 09:09:13.403661403 +0000 UTC m=+0.227066118 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible)
Dec 06 09:09:13 np0005548788.localdomain podman[100113]: unhealthy
Dec 06 09:09:13 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:09:13 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:09:15 np0005548788.localdomain sshd[100153]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:16 np0005548788.localdomain sshd[100153]: Received disconnect from 179.43.189.36 port 51006:11: Bye Bye [preauth]
Dec 06 09:09:16 np0005548788.localdomain sshd[100153]: Disconnected from authenticating user root 179.43.189.36 port 51006 [preauth]
Dec 06 09:09:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:09:16 np0005548788.localdomain podman[100155]: 2025-12-06 09:09:16.130170414 +0000 UTC m=+0.083549893 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, 
com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, name=rhosp17/openstack-nova-compute, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z)
Dec 06 09:09:16 np0005548788.localdomain podman[100155]: 2025-12-06 09:09:16.505233885 +0000 UTC m=+0.458613304 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1)
Dec 06 09:09:16 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:09:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:09:20 np0005548788.localdomain podman[100178]: 2025-12-06 09:09:20.255184344 +0000 UTC m=+0.081610403 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack 
TripleO Team, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:09:20 np0005548788.localdomain podman[100178]: 2025-12-06 09:09:20.472611023 +0000 UTC m=+0.299037062 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, name=rhosp17/openstack-qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:09:20 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:09:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:09:23 np0005548788.localdomain podman[100207]: 2025-12-06 09:09:23.242744912 +0000 UTC m=+0.067367193 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:09:23 np0005548788.localdomain podman[100207]: 2025-12-06 09:09:23.264114532 +0000 UTC m=+0.088736783 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 09:09:23 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:09:29 np0005548788.localdomain sshd[100234]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:30 np0005548788.localdomain sshd[100234]: Received disconnect from 148.227.3.232 port 53526:11: Bye Bye [preauth]
Dec 06 09:09:30 np0005548788.localdomain sshd[100234]: Disconnected from authenticating user root 148.227.3.232 port 53526 [preauth]
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:09:41 np0005548788.localdomain podman[100236]: 2025-12-06 09:09:41.328914139 +0000 UTC m=+0.146889571 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 09:09:41 np0005548788.localdomain podman[100245]: 2025-12-06 09:09:41.297369894 +0000 UTC m=+0.100118545 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi)
Dec 06 09:09:41 np0005548788.localdomain podman[100236]: 2025-12-06 09:09:41.365695885 +0000 UTC m=+0.183671337 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-cron, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Dec 06 09:09:41 np0005548788.localdomain podman[100245]: 2025-12-06 09:09:41.377430398 +0000 UTC m=+0.180179039 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:09:41 np0005548788.localdomain podman[100238]: 2025-12-06 09:09:41.382762963 +0000 UTC m=+0.197412082 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:09:41 np0005548788.localdomain podman[100237]: 2025-12-06 09:09:41.44513237 +0000 UTC m=+0.259197271 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 06 09:09:41 np0005548788.localdomain podman[100237]: 2025-12-06 09:09:41.479559165 +0000 UTC m=+0.293624026 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 06 09:09:41 np0005548788.localdomain podman[100239]: 2025-12-06 09:09:41.491321168 +0000 UTC m=+0.300010553 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:09:41 np0005548788.localdomain podman[100239]: 2025-12-06 09:09:41.499546712 +0000 UTC m=+0.308236077 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2025-11-18T23:44:13Z, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:09:41 np0005548788.localdomain podman[100238]: 2025-12-06 09:09:41.516355522 +0000 UTC m=+0.331004651 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:09:41 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:09:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:09:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:09:44 np0005548788.localdomain podman[100347]: 2025-12-06 09:09:44.263788698 +0000 UTC m=+0.088210247 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:09:44 np0005548788.localdomain podman[100348]: 2025-12-06 09:09:44.316810766 +0000 UTC m=+0.136308143 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:09:44 np0005548788.localdomain podman[100347]: 2025-12-06 09:09:44.333708458 +0000 UTC m=+0.158130057 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent)
Dec 06 09:09:44 np0005548788.localdomain podman[100347]: unhealthy
Dec 06 09:09:44 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:09:44 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:09:44 np0005548788.localdomain sudo[100382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:09:44 np0005548788.localdomain sudo[100382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:09:44 np0005548788.localdomain sudo[100382]: pam_unix(sudo:session): session closed for user root
Dec 06 09:09:44 np0005548788.localdomain podman[100348]: 2025-12-06 09:09:44.386633044 +0000 UTC m=+0.206130391 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git, release=1761123044, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
version=17.1.12, distribution-scope=public, architecture=x86_64, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:09:44 np0005548788.localdomain podman[100348]: unhealthy
Dec 06 09:09:44 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:09:44 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:09:44 np0005548788.localdomain sudo[100401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:09:44 np0005548788.localdomain sudo[100401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:09:45 np0005548788.localdomain sudo[100401]: pam_unix(sudo:session): session closed for user root
Dec 06 09:09:45 np0005548788.localdomain sudo[100447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:09:45 np0005548788.localdomain sudo[100447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:09:45 np0005548788.localdomain sudo[100447]: pam_unix(sudo:session): session closed for user root
Dec 06 09:09:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:09:47 np0005548788.localdomain podman[100462]: 2025-12-06 09:09:47.269924431 +0000 UTC m=+0.090011854 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, architecture=x86_64)
Dec 06 09:09:47 np0005548788.localdomain podman[100462]: 2025-12-06 09:09:47.642942208 +0000 UTC m=+0.463029591 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 
nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:09:47 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:09:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:09:51 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:09:51 np0005548788.localdomain recover_tripleo_nova_virtqemud[100491]: 62021
Dec 06 09:09:51 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:09:51 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:09:51 np0005548788.localdomain podman[100486]: 2025-12-06 09:09:51.262442736 +0000 UTC m=+0.091700515 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=metrics_qdr, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, architecture=x86_64, distribution-scope=public, version=17.1.12, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:09:51 np0005548788.localdomain podman[100486]: 2025-12-06 09:09:51.459039292 +0000 UTC m=+0.288297111 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:09:51 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:09:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:09:54 np0005548788.localdomain podman[100516]: 2025-12-06 09:09:54.281353462 +0000 UTC m=+0.086730282 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:09:54 np0005548788.localdomain podman[100516]: 2025-12-06 09:09:54.314645661 +0000 UTC m=+0.120022491 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., container_name=nova_compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1)
Dec 06 09:09:54 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:10:09 np0005548788.localdomain sshd[100544]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:10:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 5302 writes, 23K keys, 5302 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5302 writes, 773 syncs, 6.86 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:10:10 np0005548788.localdomain sshd[100544]: Received disconnect from 36.50.177.119 port 37844:11: Bye Bye [preauth]
Dec 06 09:10:10 np0005548788.localdomain sshd[100544]: Disconnected from authenticating user root 36.50.177.119 port 37844 [preauth]
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:10:12 np0005548788.localdomain podman[100546]: 2025-12-06 09:10:12.274234648 +0000 UTC m=+0.096206424 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T22:49:32Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true)
Dec 06 09:10:12 np0005548788.localdomain podman[100546]: 2025-12-06 09:10:12.283276177 +0000 UTC m=+0.105248003 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:10:12 np0005548788.localdomain podman[100548]: 2025-12-06 09:10:12.332457828 +0000 UTC m=+0.146486158 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:10:12 np0005548788.localdomain podman[100548]: 2025-12-06 09:10:12.36102699 +0000 UTC m=+0.175055390 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:10:12 np0005548788.localdomain podman[100555]: 2025-12-06 09:10:12.389232682 +0000 UTC m=+0.195305996 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Dec 06 09:10:12 np0005548788.localdomain podman[100547]: 2025-12-06 09:10:12.447987208 +0000 UTC m=+0.265494697 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
url=https://www.redhat.com, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:10:12 np0005548788.localdomain podman[100555]: 2025-12-06 09:10:12.470186024 +0000 UTC m=+0.276259348 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:10:12 np0005548788.localdomain podman[100547]: 2025-12-06 09:10:12.50988143 +0000 UTC m=+0.327388859 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:10:12 np0005548788.localdomain podman[100549]: 2025-12-06 09:10:12.587468918 +0000 UTC m=+0.398208237 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 06 09:10:12 np0005548788.localdomain podman[100549]: 2025-12-06 09:10:12.619327913 +0000 UTC m=+0.430067252 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public)
Dec 06 09:10:12 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:10:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:10:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.2 total, 600.0 interval
                                                          Cumulative writes: 5340 writes, 23K keys, 5340 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5340 writes, 664 syncs, 8.04 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:10:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:10:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:10:15 np0005548788.localdomain podman[100654]: 2025-12-06 09:10:15.259794814 +0000 UTC m=+0.084839262 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, release=1761123044, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true)
Dec 06 09:10:15 np0005548788.localdomain podman[100654]: 2025-12-06 09:10:15.300691368 +0000 UTC m=+0.125735776 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
release=1761123044, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1)
Dec 06 09:10:15 np0005548788.localdomain podman[100654]: unhealthy
Dec 06 09:10:15 np0005548788.localdomain systemd[1]: tmp-crun.fZKBul.mount: Deactivated successfully.
Dec 06 09:10:15 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:10:15 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:10:15 np0005548788.localdomain podman[100653]: 2025-12-06 09:10:15.320882822 +0000 UTC m=+0.147428937 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:10:15 np0005548788.localdomain podman[100653]: 2025-12-06 09:10:15.359775374 +0000 UTC m=+0.186321479 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 09:10:15 np0005548788.localdomain podman[100653]: unhealthy
Dec 06 09:10:15 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:10:15 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:10:17 np0005548788.localdomain sshd[100693]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:10:18 np0005548788.localdomain podman[100695]: 2025-12-06 09:10:18.258153266 +0000 UTC m=+0.084533933 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target)
Dec 06 09:10:18 np0005548788.localdomain sshd[100693]: Received disconnect from 179.43.189.36 port 56088:11: Bye Bye [preauth]
Dec 06 09:10:18 np0005548788.localdomain sshd[100693]: Disconnected from authenticating user root 179.43.189.36 port 56088 [preauth]
Dec 06 09:10:18 np0005548788.localdomain podman[100695]: 2025-12-06 09:10:18.658014934 +0000 UTC m=+0.484395601 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, 
tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Dec 06 09:10:18 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:10:18 np0005548788.localdomain sshd[100718]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:20 np0005548788.localdomain sshd[100718]: Received disconnect from 45.119.84.54 port 52216:11: Bye Bye [preauth]
Dec 06 09:10:20 np0005548788.localdomain sshd[100718]: Disconnected from authenticating user root 45.119.84.54 port 52216 [preauth]
Dec 06 09:10:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:10:22 np0005548788.localdomain systemd[1]: tmp-crun.ANb42n.mount: Deactivated successfully.
Dec 06 09:10:22 np0005548788.localdomain podman[100720]: 2025-12-06 09:10:22.264074046 +0000 UTC m=+0.091474697 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:49:46Z, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd)
Dec 06 09:10:22 np0005548788.localdomain podman[100720]: 2025-12-06 09:10:22.46932997 +0000 UTC m=+0.296730651 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, description=Red Hat OpenStack Platform 
17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:10:22 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:10:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:10:25 np0005548788.localdomain podman[100749]: 2025-12-06 09:10:25.258877218 +0000 UTC m=+0.084901904 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:10:25 np0005548788.localdomain podman[100749]: 2025-12-06 09:10:25.287034958 +0000 UTC m=+0.113059664 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:10:25 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:10:36 np0005548788.localdomain sshd[100774]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:38 np0005548788.localdomain sshd[100776]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:39 np0005548788.localdomain sshd[100774]: Received disconnect from 45.78.219.195 port 51590:11: Bye Bye [preauth]
Dec 06 09:10:39 np0005548788.localdomain sshd[100774]: Disconnected from authenticating user root 45.78.219.195 port 51590 [preauth]
Dec 06 09:10:41 np0005548788.localdomain sshd[100776]: Connection closed by 101.47.142.76 port 56862 [preauth]
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:10:43 np0005548788.localdomain podman[100782]: 2025-12-06 09:10:43.308961271 +0000 UTC m=+0.123606692 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, version=17.1.12, architecture=x86_64, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:10:43 np0005548788.localdomain podman[100782]: 2025-12-06 09:10:43.366493279 +0000 UTC m=+0.181138690 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible)
Dec 06 09:10:43 np0005548788.localdomain podman[100781]: 2025-12-06 09:10:43.365741936 +0000 UTC m=+0.183999718 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12)
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:10:43 np0005548788.localdomain podman[100778]: 2025-12-06 09:10:43.339073451 +0000 UTC m=+0.160975315 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Dec 06 09:10:43 np0005548788.localdomain podman[100781]: 2025-12-06 09:10:43.400330104 +0000 UTC m=+0.218587916 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public)
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:10:43 np0005548788.localdomain podman[100779]: 2025-12-06 09:10:43.454667174 +0000 UTC m=+0.274564266 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, url=https://www.redhat.com, container_name=collectd, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:10:43 np0005548788.localdomain podman[100779]: 2025-12-06 09:10:43.468257803 +0000 UTC m=+0.288154845 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:10:43 np0005548788.localdomain podman[100780]: 2025-12-06 09:10:43.555428488 +0000 UTC m=+0.374000130 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vcs-type=git)
Dec 06 09:10:43 np0005548788.localdomain podman[100778]: 2025-12-06 09:10:43.576442557 +0000 UTC m=+0.398344431 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:10:43 np0005548788.localdomain podman[100780]: 2025-12-06 09:10:43.586473617 +0000 UTC m=+0.405045159 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:10:43 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:10:45 np0005548788.localdomain sudo[100889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:10:45 np0005548788.localdomain sudo[100889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:10:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:10:45 np0005548788.localdomain sudo[100889]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:10:46 np0005548788.localdomain sudo[100906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:10:46 np0005548788.localdomain sudo[100906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:10:46 np0005548788.localdomain podman[100905]: 2025-12-06 09:10:46.051223536 +0000 UTC m=+0.082110528 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, release=1761123044, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container)
Dec 06 09:10:46 np0005548788.localdomain podman[100905]: 2025-12-06 09:10:46.067874541 +0000 UTC m=+0.098761593 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
container_name=ovn_controller, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Dec 06 09:10:46 np0005548788.localdomain podman[100905]: unhealthy
Dec 06 09:10:46 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:10:46 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:10:46 np0005548788.localdomain podman[100904]: 2025-12-06 09:10:46.153451746 +0000 UTC m=+0.187674781 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 09:10:46 np0005548788.localdomain podman[100904]: 2025-12-06 09:10:46.169305336 +0000 UTC m=+0.203528361 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, managed_by=tripleo_ansible)
Dec 06 09:10:46 np0005548788.localdomain podman[100904]: unhealthy
Dec 06 09:10:46 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:10:46 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:10:46 np0005548788.localdomain sudo[100906]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:10:49 np0005548788.localdomain podman[100991]: 2025-12-06 09:10:49.257428112 +0000 UTC m=+0.080786278 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:10:49 np0005548788.localdomain podman[100991]: 2025-12-06 09:10:49.648603991 +0000 UTC m=+0.471962207 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 
17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:10:49 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:10:49 np0005548788.localdomain sudo[101014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:10:49 np0005548788.localdomain sudo[101014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:10:49 np0005548788.localdomain sudo[101014]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:10:53 np0005548788.localdomain systemd[1]: tmp-crun.Pxhk4G.mount: Deactivated successfully.
Dec 06 09:10:53 np0005548788.localdomain podman[101029]: 2025-12-06 09:10:53.276843249 +0000 UTC m=+0.098123314 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:10:53 np0005548788.localdomain podman[101029]: 2025-12-06 09:10:53.494930929 +0000 UTC m=+0.316210984 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:10:53 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:10:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:10:56 np0005548788.localdomain podman[101059]: 2025-12-06 09:10:56.267367879 +0000 UTC m=+0.092434078 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4)
Dec 06 09:10:56 np0005548788.localdomain podman[101059]: 2025-12-06 09:10:56.328835388 +0000 UTC m=+0.153901537 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, vcs-type=git, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=)
Dec 06 09:10:56 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:11:14 np0005548788.localdomain podman[101085]: 2025-12-06 09:11:14.259287685 +0000 UTC m=+0.085908347 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, 
com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:11:14 np0005548788.localdomain podman[101085]: 2025-12-06 09:11:14.271486011 +0000 UTC m=+0.098106713 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:11:14 np0005548788.localdomain podman[101086]: 2025-12-06 09:11:14.32516133 +0000 UTC m=+0.146012413 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Dec 06 09:11:14 np0005548788.localdomain podman[101086]: 2025-12-06 09:11:14.34068166 +0000 UTC m=+0.161532823 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:11:14 np0005548788.localdomain podman[101094]: 2025-12-06 09:11:14.429999481 +0000 UTC m=+0.243124775 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, 
url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:11:14 np0005548788.localdomain podman[101094]: 2025-12-06 09:11:14.465543498 +0000 UTC m=+0.278668852 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:11:14 np0005548788.localdomain podman[101087]: 2025-12-06 09:11:14.483896336 +0000 UTC m=+0.302293613 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:11:14 np0005548788.localdomain podman[101087]: 2025-12-06 09:11:14.517844305 +0000 UTC m=+0.336241572 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:11:14 np0005548788.localdomain podman[101093]: 2025-12-06 09:11:14.533941463 +0000 UTC m=+0.348204093 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Dec 06 09:11:14 np0005548788.localdomain podman[101093]: 2025-12-06 09:11:14.571477552 +0000 UTC m=+0.385740152 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, container_name=iscsid, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:11:14 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:11:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:11:16 np0005548788.localdomain podman[101201]: 2025-12-06 09:11:16.262529233 +0000 UTC m=+0.089004202 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, io.openshift.expose-services=)
Dec 06 09:11:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:11:16 np0005548788.localdomain podman[101201]: 2025-12-06 09:11:16.280660983 +0000 UTC m=+0.107135922 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:11:16 np0005548788.localdomain podman[101201]: unhealthy
Dec 06 09:11:16 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:11:16 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:11:16 np0005548788.localdomain podman[101222]: 2025-12-06 09:11:16.360447869 +0000 UTC m=+0.073966037 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64)
Dec 06 09:11:16 np0005548788.localdomain podman[101222]: 2025-12-06 09:11:16.373495602 +0000 UTC m=+0.087013710 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 06 09:11:16 np0005548788.localdomain podman[101222]: unhealthy
Dec 06 09:11:16 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:11:16 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:11:20 np0005548788.localdomain sshd[101243]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:11:20 np0005548788.localdomain podman[101245]: 2025-12-06 09:11:20.267860445 +0000 UTC m=+0.088723523 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:11:20 np0005548788.localdomain podman[101245]: 2025-12-06 09:11:20.596523822 +0000 UTC m=+0.417386850 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 09:11:20 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:11:20 np0005548788.localdomain sshd[101243]: Received disconnect from 179.43.189.36 port 58930:11: Bye Bye [preauth]
Dec 06 09:11:20 np0005548788.localdomain sshd[101243]: Disconnected from authenticating user root 179.43.189.36 port 58930 [preauth]
Dec 06 09:11:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:11:24 np0005548788.localdomain systemd[1]: tmp-crun.5aYd2Z.mount: Deactivated successfully.
Dec 06 09:11:24 np0005548788.localdomain podman[101268]: 2025-12-06 09:11:24.250369341 +0000 UTC m=+0.081092977 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
url=https://www.redhat.com, container_name=metrics_qdr, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 06 09:11:24 np0005548788.localdomain podman[101268]: 2025-12-06 09:11:24.450535108 +0000 UTC m=+0.281258694 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible)
Dec 06 09:11:24 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:11:24 np0005548788.localdomain sshd[101298]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:26 np0005548788.localdomain sshd[101298]: Received disconnect from 36.50.177.119 port 45358:11: Bye Bye [preauth]
Dec 06 09:11:26 np0005548788.localdomain sshd[101298]: Disconnected from authenticating user root 36.50.177.119 port 45358 [preauth]
Dec 06 09:11:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:11:26 np0005548788.localdomain systemd[1]: tmp-crun.4eUzEO.mount: Deactivated successfully.
Dec 06 09:11:26 np0005548788.localdomain podman[101300]: 2025-12-06 09:11:26.661402233 +0000 UTC m=+0.097010439 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:11:26 np0005548788.localdomain podman[101300]: 2025-12-06 09:11:26.723786711 +0000 UTC m=+0.159394937 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Dec 06 09:11:26 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:11:34 np0005548788.localdomain sshd[101327]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:36 np0005548788.localdomain sshd[101327]: Received disconnect from 45.119.84.54 port 49548:11: Bye Bye [preauth]
Dec 06 09:11:36 np0005548788.localdomain sshd[101327]: Disconnected from authenticating user root 45.119.84.54 port 49548 [preauth]
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:11:45 np0005548788.localdomain podman[101332]: 2025-12-06 09:11:45.274719092 +0000 UTC m=+0.095123610 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc.)
Dec 06 09:11:45 np0005548788.localdomain podman[101332]: 2025-12-06 09:11:45.309237059 +0000 UTC m=+0.129641547 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:11:45 np0005548788.localdomain podman[101330]: 2025-12-06 09:11:45.32641243 +0000 UTC m=+0.149258794 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:11:45 np0005548788.localdomain podman[101330]: 2025-12-06 09:11:45.333798458 +0000 UTC m=+0.156644862 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
distribution-scope=public, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-collectd)
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:11:45 np0005548788.localdomain podman[101329]: 2025-12-06 09:11:45.378318774 +0000 UTC m=+0.202190059 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Dec 06 09:11:45 np0005548788.localdomain podman[101329]: 2025-12-06 09:11:45.389623733 +0000 UTC m=+0.213495028 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:32Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true)
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:11:45 np0005548788.localdomain recover_tripleo_nova_virtqemud[101417]: 62021
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:11:45 np0005548788.localdomain podman[101331]: 2025-12-06 09:11:45.492522854 +0000 UTC m=+0.312342264 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:11:45 np0005548788.localdomain podman[101333]: 2025-12-06 09:11:45.452675652 +0000 UTC m=+0.269228511 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:11:45 np0005548788.localdomain podman[101331]: 2025-12-06 09:11:45.522550812 +0000 UTC m=+0.342370222 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute)
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:11:45 np0005548788.localdomain podman[101333]: 2025-12-06 09:11:45.53546065 +0000 UTC m=+0.352013449 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, 
com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:11:45 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:11:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:11:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:11:47 np0005548788.localdomain podman[101441]: 2025-12-06 09:11:47.250171712 +0000 UTC m=+0.074298027 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 09:11:47 np0005548788.localdomain podman[101441]: 2025-12-06 09:11:47.265316731 +0000 UTC m=+0.089443066 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, 
name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:11:47 np0005548788.localdomain podman[101441]: unhealthy
Dec 06 09:11:47 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:11:47 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:11:47 np0005548788.localdomain podman[101440]: 2025-12-06 09:11:47.307383051 +0000 UTC m=+0.135951343 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:11:47 np0005548788.localdomain podman[101440]: 2025-12-06 09:11:47.345540759 +0000 UTC m=+0.174109041 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:11:47 np0005548788.localdomain podman[101440]: unhealthy
Dec 06 09:11:47 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:11:47 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:11:50 np0005548788.localdomain sudo[101477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:11:50 np0005548788.localdomain sudo[101477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:50 np0005548788.localdomain sudo[101477]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:50 np0005548788.localdomain sudo[101492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:11:50 np0005548788.localdomain sudo[101492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:11:50 np0005548788.localdomain sudo[101492]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:50 np0005548788.localdomain podman[101525]: 2025-12-06 09:11:50.72565221 +0000 UTC m=+0.082726678 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, 
io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Dec 06 09:11:50 np0005548788.localdomain sudo[101550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:11:50 np0005548788.localdomain sudo[101550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:50 np0005548788.localdomain sudo[101550]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:50 np0005548788.localdomain sudo[101565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:11:50 np0005548788.localdomain sudo[101565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:51 np0005548788.localdomain podman[101525]: 2025-12-06 09:11:51.068467704 +0000 UTC m=+0.425542142 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:11:51 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:11:51 np0005548788.localdomain sudo[101565]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:52 np0005548788.localdomain sudo[101613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:11:52 np0005548788.localdomain sudo[101613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:52 np0005548788.localdomain sudo[101613]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:11:55 np0005548788.localdomain podman[101628]: 2025-12-06 09:11:55.278257144 +0000 UTC m=+0.098357810 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, container_name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:11:55 np0005548788.localdomain podman[101628]: 2025-12-06 09:11:55.471495706 +0000 UTC m=+0.291596332 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, 
com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 09:11:55 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:11:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:11:57 np0005548788.localdomain systemd[1]: tmp-crun.HkN6tn.mount: Deactivated successfully.
Dec 06 09:11:57 np0005548788.localdomain podman[101657]: 2025-12-06 09:11:57.262613989 +0000 UTC m=+0.086455602 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step5, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Dec 06 09:11:57 np0005548788.localdomain podman[101657]: 2025-12-06 09:11:57.322704386 +0000 UTC m=+0.146546029 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z)
Dec 06 09:11:57 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:11:59 np0005548788.localdomain sshd[101684]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:59 np0005548788.localdomain sshd[101684]: Received disconnect from 148.227.3.232 port 55132:11: Bye Bye [preauth]
Dec 06 09:11:59 np0005548788.localdomain sshd[101684]: Disconnected from authenticating user root 148.227.3.232 port 55132 [preauth]
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:12:16 np0005548788.localdomain podman[101688]: 2025-12-06 09:12:16.281307742 +0000 UTC m=+0.098412143 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:12:16 np0005548788.localdomain podman[101693]: 2025-12-06 09:12:16.332900782 +0000 UTC m=+0.141717276 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, 
release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:12:16 np0005548788.localdomain podman[101688]: 2025-12-06 09:12:16.363353277 +0000 UTC m=+0.180457728 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 06 09:12:16 np0005548788.localdomain podman[101693]: 2025-12-06 09:12:16.374782421 +0000 UTC m=+0.183598895 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:12:16 np0005548788.localdomain podman[101700]: 2025-12-06 09:12:16.366597317 +0000 UTC m=+0.170667594 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public)
Dec 06 09:12:16 np0005548788.localdomain podman[101686]: 2025-12-06 09:12:16.436362711 +0000 UTC m=+0.259060395 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:12:16 np0005548788.localdomain podman[101686]: 2025-12-06 09:12:16.450778098 +0000 UTC m=+0.273475812 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, version=17.1.12, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:12:16 np0005548788.localdomain podman[101700]: 2025-12-06 09:12:16.501283464 +0000 UTC m=+0.305353691 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:12:16 np0005548788.localdomain podman[101687]: 2025-12-06 09:12:16.59372148 +0000 UTC m=+0.410442819 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:12:16 np0005548788.localdomain podman[101687]: 2025-12-06 09:12:16.603048229 +0000 UTC m=+0.419769528 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, url=https://www.redhat.com, container_name=collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3)
Dec 06 09:12:16 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:12:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:12:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:12:18 np0005548788.localdomain podman[101798]: 2025-12-06 09:12:18.247452924 +0000 UTC m=+0.076640577 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4)
Dec 06 09:12:18 np0005548788.localdomain podman[101798]: 2025-12-06 09:12:18.261873391 +0000 UTC m=+0.091061024 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:12:18 np0005548788.localdomain podman[101798]: unhealthy
Dec 06 09:12:18 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:12:18 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:12:18 np0005548788.localdomain podman[101799]: 2025-12-06 09:12:18.304741162 +0000 UTC m=+0.128301150 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container)
Dec 06 09:12:18 np0005548788.localdomain podman[101799]: 2025-12-06 09:12:18.349548971 +0000 UTC m=+0.173108949 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, container_name=ovn_controller, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:12:18 np0005548788.localdomain podman[101799]: unhealthy
Dec 06 09:12:18 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:12:18 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:12:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:12:21 np0005548788.localdomain podman[101838]: 2025-12-06 09:12:21.259812852 +0000 UTC m=+0.085925616 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, tcib_managed=true, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Dec 06 09:12:21 np0005548788.localdomain podman[101838]: 2025-12-06 09:12:21.599266899 +0000 UTC m=+0.425379623 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:12:21 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:12:22 np0005548788.localdomain sshd[101862]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:23 np0005548788.localdomain sshd[101862]: Received disconnect from 179.43.189.36 port 46658:11: Bye Bye [preauth]
Dec 06 09:12:23 np0005548788.localdomain sshd[101862]: Disconnected from authenticating user root 179.43.189.36 port 46658 [preauth]
Dec 06 09:12:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:12:26 np0005548788.localdomain podman[101864]: 2025-12-06 09:12:26.256102383 +0000 UTC m=+0.082627463 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container)
Dec 06 09:12:26 np0005548788.localdomain podman[101864]: 2025-12-06 09:12:26.477297202 +0000 UTC m=+0.303822262 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 09:12:26 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:12:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:12:28 np0005548788.localdomain podman[101893]: 2025-12-06 09:12:28.253126313 +0000 UTC m=+0.079278019 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:12:28 np0005548788.localdomain podman[101893]: 2025-12-06 09:12:28.281832513 +0000 UTC m=+0.107984219 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:12:28 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:12:40 np0005548788.localdomain sshd[101920]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:42 np0005548788.localdomain sshd[101920]: Received disconnect from 36.50.177.119 port 50870:11: Bye Bye [preauth]
Dec 06 09:12:42 np0005548788.localdomain sshd[101920]: Disconnected from authenticating user root 36.50.177.119 port 50870 [preauth]
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: tmp-crun.xd1lWP.mount: Deactivated successfully.
Dec 06 09:12:47 np0005548788.localdomain podman[101925]: 2025-12-06 09:12:47.284023283 +0000 UTC m=+0.095107321 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:12:47 np0005548788.localdomain podman[101925]: 2025-12-06 09:12:47.321721603 +0000 UTC m=+0.132805661 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:12:47 np0005548788.localdomain podman[101923]: 2025-12-06 09:12:47.328706969 +0000 UTC m=+0.145819413 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:12:47 np0005548788.localdomain podman[101924]: 2025-12-06 09:12:47.341657581 +0000 UTC m=+0.154900715 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:11:48Z, release=1761123044, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:12:47 np0005548788.localdomain podman[101924]: 2025-12-06 09:12:47.39517416 +0000 UTC m=+0.208417284 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4)
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:12:47 np0005548788.localdomain podman[101922]: 2025-12-06 09:12:47.26780567 +0000 UTC m=+0.089492216 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:12:47 np0005548788.localdomain podman[101923]: 2025-12-06 09:12:47.420539377 +0000 UTC m=+0.237651831 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=collectd, batch=17.1_20251118.1)
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:12:47 np0005548788.localdomain podman[101931]: 2025-12-06 09:12:47.40032467 +0000 UTC m=+0.206926708 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:12:47 np0005548788.localdomain podman[101931]: 2025-12-06 09:12:47.480545127 +0000 UTC m=+0.287147115 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, distribution-scope=public, version=17.1.12)
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:12:47 np0005548788.localdomain podman[101922]: 2025-12-06 09:12:47.504632765 +0000 UTC m=+0.326319361 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container)
Dec 06 09:12:47 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:12:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:12:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:12:49 np0005548788.localdomain podman[102031]: 2025-12-06 09:12:49.254970786 +0000 UTC m=+0.085014859 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com)
Dec 06 09:12:49 np0005548788.localdomain podman[102031]: 2025-12-06 09:12:49.268166375 +0000 UTC m=+0.098210478 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:12:49 np0005548788.localdomain podman[102031]: unhealthy
Dec 06 09:12:49 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:12:49 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:12:49 np0005548788.localdomain podman[102032]: 2025-12-06 09:12:49.366885806 +0000 UTC m=+0.193096389 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ovn_controller, release=1761123044, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64)
Dec 06 09:12:49 np0005548788.localdomain podman[102032]: 2025-12-06 09:12:49.386649399 +0000 UTC m=+0.212859962 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, 
url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:12:49 np0005548788.localdomain podman[102032]: unhealthy
Dec 06 09:12:49 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:12:49 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:12:51 np0005548788.localdomain sshd[102071]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:12:52 np0005548788.localdomain podman[102073]: 2025-12-06 09:12:52.268346295 +0000 UTC m=+0.094715688 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:12:52 np0005548788.localdomain sudo[102084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:12:52 np0005548788.localdomain sudo[102084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:12:52 np0005548788.localdomain sudo[102084]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:52 np0005548788.localdomain sudo[102110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:12:52 np0005548788.localdomain sudo[102110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:12:52 np0005548788.localdomain podman[102073]: 2025-12-06 09:12:52.630854916 +0000 UTC m=+0.457224309 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_migration_target, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:12:52 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:12:52 np0005548788.localdomain sudo[102110]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:53 np0005548788.localdomain sshd[102071]: Received disconnect from 45.119.84.54 port 59232:11: Bye Bye [preauth]
Dec 06 09:12:53 np0005548788.localdomain sshd[102071]: Disconnected from authenticating user root 45.119.84.54 port 59232 [preauth]
Dec 06 09:12:53 np0005548788.localdomain sudo[102157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:12:53 np0005548788.localdomain sudo[102157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:12:53 np0005548788.localdomain sudo[102157]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:12:57 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:12:57 np0005548788.localdomain recover_tripleo_nova_virtqemud[102174]: 62021
Dec 06 09:12:57 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:12:57 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:12:57 np0005548788.localdomain podman[102172]: 2025-12-06 09:12:57.265833232 +0000 UTC m=+0.086917716 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1)
Dec 06 09:12:57 np0005548788.localdomain podman[102172]: 2025-12-06 09:12:57.483267416 +0000 UTC m=+0.304351940 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Dec 06 09:12:57 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:12:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:12:59 np0005548788.localdomain podman[102203]: 2025-12-06 09:12:59.260300213 +0000 UTC m=+0.082694575 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 09:12:59 np0005548788.localdomain podman[102203]: 2025-12-06 09:12:59.298274281 +0000 UTC m=+0.120668723 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, container_name=nova_compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute)
Dec 06 09:12:59 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:13:01 np0005548788.localdomain sshd[102229]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:01 np0005548788.localdomain anacron[93694]: Job `cron.daily' started
Dec 06 09:13:01 np0005548788.localdomain anacron[93694]: Job `cron.daily' terminated
Dec 06 09:13:07 np0005548788.localdomain sshd[102229]: Connection closed by 45.78.219.195 port 35884 [preauth]
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:13:18 np0005548788.localdomain podman[102236]: 2025-12-06 09:13:18.283905605 +0000 UTC m=+0.092244832 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, tcib_managed=true, container_name=iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container)
Dec 06 09:13:18 np0005548788.localdomain podman[102236]: 2025-12-06 09:13:18.295405592 +0000 UTC m=+0.103744879 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, container_name=iscsid, 
com.redhat.component=openstack-iscsid-container, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:13:18 np0005548788.localdomain podman[102234]: 2025-12-06 09:13:18.389735707 +0000 UTC m=+0.205650898 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, container_name=collectd, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1)
Dec 06 09:13:18 np0005548788.localdomain podman[102247]: 2025-12-06 09:13:18.35145211 +0000 UTC m=+0.154936376 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:13:18 np0005548788.localdomain podman[102247]: 2025-12-06 09:13:18.437593521 +0000 UTC m=+0.241077827 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4)
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:13:18 np0005548788.localdomain podman[102235]: 2025-12-06 09:13:18.485566289 +0000 UTC m=+0.297616950 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-ceilometer-compute)
Dec 06 09:13:18 np0005548788.localdomain podman[102233]: 2025-12-06 09:13:18.441940697 +0000 UTC m=+0.258115677 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, tcib_managed=true, container_name=logrotate_crond, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:13:18 np0005548788.localdomain podman[102234]: 2025-12-06 09:13:18.503594079 +0000 UTC m=+0.319509200 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, vcs-type=git, version=17.1.12, io.openshift.expose-services=)
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:13:18 np0005548788.localdomain podman[102233]: 2025-12-06 09:13:18.525643941 +0000 UTC m=+0.341818911 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, container_name=logrotate_crond, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Dec 06 09:13:18 np0005548788.localdomain podman[102235]: 2025-12-06 09:13:18.537792889 +0000 UTC m=+0.349843590 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, version=17.1.12)
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:13:18 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:13:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:13:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:13:20 np0005548788.localdomain podman[102342]: 2025-12-06 09:13:20.248335175 +0000 UTC m=+0.075634547 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, 
container_name=ovn_controller, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:13:20 np0005548788.localdomain podman[102342]: 2025-12-06 09:13:20.262097242 +0000 UTC m=+0.089396684 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:13:20 np0005548788.localdomain podman[102342]: unhealthy
Dec 06 09:13:20 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:13:20 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:13:20 np0005548788.localdomain podman[102341]: 2025-12-06 09:13:20.311313728 +0000 UTC m=+0.140033443 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, container_name=ovn_metadata_agent, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:13:20 np0005548788.localdomain podman[102341]: 2025-12-06 09:13:20.352915608 +0000 UTC m=+0.181635263 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, 
distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:13:20 np0005548788.localdomain podman[102341]: unhealthy
Dec 06 09:13:20 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:13:20 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:13:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:13:23 np0005548788.localdomain podman[102382]: 2025-12-06 09:13:23.245245772 +0000 UTC m=+0.076314237 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 09:13:23 np0005548788.localdomain podman[102382]: 2025-12-06 09:13:23.624613577 +0000 UTC m=+0.455682042 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com)
Dec 06 09:13:23 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:13:27 np0005548788.localdomain sshd[102406]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:28 np0005548788.localdomain sshd[102406]: Received disconnect from 179.43.189.36 port 45262:11: Bye Bye [preauth]
Dec 06 09:13:28 np0005548788.localdomain sshd[102406]: Disconnected from authenticating user root 179.43.189.36 port 45262 [preauth]
Dec 06 09:13:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:13:28 np0005548788.localdomain podman[102408]: 2025-12-06 09:13:28.202560655 +0000 UTC m=+0.087560816 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Dec 06 09:13:28 np0005548788.localdomain podman[102408]: 2025-12-06 09:13:28.482910089 +0000 UTC m=+0.367910210 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 06 09:13:28 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:13:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:13:30 np0005548788.localdomain podman[102437]: 2025-12-06 09:13:30.243472455 +0000 UTC m=+0.069611100 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, config_id=tripleo_step5, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:13:30 np0005548788.localdomain podman[102437]: 2025-12-06 09:13:30.296583832 +0000 UTC m=+0.122722477 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:13:30 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:13:49 np0005548788.localdomain podman[102463]: 2025-12-06 09:13:49.243656613 +0000 UTC m=+0.062439927 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, config_id=tripleo_step3, version=17.1.12, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: tmp-crun.Gwxxft.mount: Deactivated successfully.
Dec 06 09:13:49 np0005548788.localdomain podman[102470]: 2025-12-06 09:13:49.31805394 +0000 UTC m=+0.128911528 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1761123044, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:13:49 np0005548788.localdomain podman[102462]: 2025-12-06 09:13:49.276718938 +0000 UTC m=+0.096910705 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:13:49 np0005548788.localdomain podman[102476]: 2025-12-06 09:13:49.332226789 +0000 UTC m=+0.130623821 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, vcs-type=git, architecture=x86_64, 
com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Dec 06 09:13:49 np0005548788.localdomain podman[102470]: 2025-12-06 09:13:49.358884537 +0000 UTC m=+0.169742155 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, config_id=tripleo_step3, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:13:49 np0005548788.localdomain podman[102464]: 2025-12-06 09:13:49.378755003 +0000 UTC m=+0.189579650 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 09:13:49 np0005548788.localdomain podman[102462]: 2025-12-06 09:13:49.409537467 +0000 UTC m=+0.229729404 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044)
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:13:49 np0005548788.localdomain podman[102463]: 2025-12-06 09:13:49.429718943 +0000 UTC m=+0.248502297 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=collectd, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:13:49 np0005548788.localdomain podman[102464]: 2025-12-06 09:13:49.462924743 +0000 UTC m=+0.273749400 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:13:49 np0005548788.localdomain podman[102476]: 2025-12-06 09:13:49.4821728 +0000 UTC m=+0.280569842 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Dec 06 09:13:49 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:13:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:13:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:13:51 np0005548788.localdomain podman[102572]: 2025-12-06 09:13:51.243923444 +0000 UTC m=+0.074651906 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 06 09:13:51 np0005548788.localdomain podman[102572]: 2025-12-06 09:13:51.257459984 +0000 UTC m=+0.088188436 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller)
Dec 06 09:13:51 np0005548788.localdomain podman[102572]: unhealthy
Dec 06 09:13:51 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:13:51 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:13:51 np0005548788.localdomain systemd[1]: tmp-crun.bvoflh.mount: Deactivated successfully.
Dec 06 09:13:51 np0005548788.localdomain podman[102571]: 2025-12-06 09:13:51.345910686 +0000 UTC m=+0.175899695 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:13:51 np0005548788.localdomain podman[102571]: 2025-12-06 09:13:51.386595738 +0000 UTC m=+0.216584757 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:13:51 np0005548788.localdomain podman[102571]: unhealthy
Dec 06 09:13:51 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:13:51 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:13:53 np0005548788.localdomain sudo[102612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:13:54 np0005548788.localdomain sudo[102612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:13:54 np0005548788.localdomain sudo[102612]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:54 np0005548788.localdomain sudo[102633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:13:54 np0005548788.localdomain sudo[102633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:54 np0005548788.localdomain podman[102626]: 2025-12-06 09:13:54.118542018 +0000 UTC m=+0.089901919 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 06 09:13:54 np0005548788.localdomain podman[102626]: 2025-12-06 09:13:54.502281619 +0000 UTC m=+0.473641480 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red 
Hat, Inc., name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:13:54 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:13:55 np0005548788.localdomain podman[102734]: 2025-12-06 09:13:55.054652509 +0000 UTC m=+0.103162911 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=1763362218, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 09:13:55 np0005548788.localdomain podman[102734]: 2025-12-06 09:13:55.167802927 +0000 UTC m=+0.216313309 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1763362218, architecture=x86_64, name=rhceph, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7)
Dec 06 09:13:55 np0005548788.localdomain sudo[102633]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:55 np0005548788.localdomain sudo[102800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:13:55 np0005548788.localdomain sudo[102800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:55 np0005548788.localdomain sudo[102800]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:55 np0005548788.localdomain sudo[102815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:13:55 np0005548788.localdomain sudo[102815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:56 np0005548788.localdomain sudo[102815]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:56 np0005548788.localdomain sudo[102861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:13:56 np0005548788.localdomain sudo[102861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:56 np0005548788.localdomain sudo[102861]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:13:59 np0005548788.localdomain podman[102876]: 2025-12-06 09:13:59.273088328 +0000 UTC m=+0.095504962 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, container_name=metrics_qdr, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:13:59 np0005548788.localdomain podman[102876]: 2025-12-06 09:13:59.492585115 +0000 UTC m=+0.315001659 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc.)
Dec 06 09:13:59 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:14:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:14:01 np0005548788.localdomain podman[102905]: 2025-12-06 09:14:01.271754069 +0000 UTC m=+0.095724529 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, tcib_managed=true, container_name=nova_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.buildah.version=1.41.4)
Dec 06 09:14:01 np0005548788.localdomain podman[102905]: 2025-12-06 09:14:01.300872862 +0000 UTC m=+0.124843272 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1761123044, version=17.1.12)
Dec 06 09:14:01 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:14:01 np0005548788.localdomain sshd[102931]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:03 np0005548788.localdomain sshd[102931]: Received disconnect from 36.50.177.119 port 57618:11: Bye Bye [preauth]
Dec 06 09:14:03 np0005548788.localdomain sshd[102931]: Disconnected from authenticating user root 36.50.177.119 port 57618 [preauth]
Dec 06 09:14:10 np0005548788.localdomain sshd[102933]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:12 np0005548788.localdomain sshd[102933]: Received disconnect from 45.119.84.54 port 44916:11: Bye Bye [preauth]
Dec 06 09:14:12 np0005548788.localdomain sshd[102933]: Disconnected from authenticating user root 45.119.84.54 port 44916 [preauth]
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:14:20 np0005548788.localdomain podman[102938]: 2025-12-06 09:14:20.281107292 +0000 UTC m=+0.096509103 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, distribution-scope=public, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1)
Dec 06 09:14:20 np0005548788.localdomain podman[102937]: 2025-12-06 09:14:20.329008087 +0000 UTC m=+0.146729141 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, 
tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z)
Dec 06 09:14:20 np0005548788.localdomain podman[102938]: 2025-12-06 09:14:20.367646146 +0000 UTC m=+0.183047887 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid)
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:14:20 np0005548788.localdomain podman[102937]: 2025-12-06 09:14:20.419095821 +0000 UTC m=+0.236816885 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:14:20 np0005548788.localdomain podman[102945]: 2025-12-06 09:14:20.435810359 +0000 UTC m=+0.246964109 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 09:14:20 np0005548788.localdomain podman[102945]: 2025-12-06 09:14:20.471446065 +0000 UTC m=+0.282599815 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:14:20 np0005548788.localdomain podman[102936]: 2025-12-06 09:14:20.416681166 +0000 UTC m=+0.238689063 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:14:20 np0005548788.localdomain podman[102936]: 2025-12-06 09:14:20.555769599 +0000 UTC m=+0.377777536 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:14:20 np0005548788.localdomain podman[102935]: 2025-12-06 09:14:20.371694341 +0000 UTC m=+0.196048591 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, architecture=x86_64, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1)
Dec 06 09:14:20 np0005548788.localdomain podman[102935]: 2025-12-06 09:14:20.607878276 +0000 UTC m=+0.432232526 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., container_name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:14:20 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:14:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:14:21 np0005548788.localdomain podman[103050]: 2025-12-06 09:14:21.405473009 +0000 UTC m=+0.102796718 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 06 09:14:21 np0005548788.localdomain podman[103050]: 2025-12-06 09:14:21.422507068 +0000 UTC m=+0.119830807 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-type=git, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 06 09:14:21 np0005548788.localdomain podman[103050]: unhealthy
Dec 06 09:14:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:14:21 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:14:21 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:14:21 np0005548788.localdomain systemd[1]: tmp-crun.W9R1ZI.mount: Deactivated successfully.
Dec 06 09:14:21 np0005548788.localdomain podman[103070]: 2025-12-06 09:14:21.543301464 +0000 UTC m=+0.087294008 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Dec 06 09:14:21 np0005548788.localdomain podman[103070]: 2025-12-06 09:14:21.563529871 +0000 UTC m=+0.107522395 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=)
Dec 06 09:14:21 np0005548788.localdomain podman[103070]: unhealthy
Dec 06 09:14:21 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:14:21 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:14:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:14:25 np0005548788.localdomain podman[103090]: 2025-12-06 09:14:25.252376446 +0000 UTC m=+0.080864399 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_migration_target, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, 
vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:14:25 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:14:25 np0005548788.localdomain podman[103090]: 2025-12-06 09:14:25.646624512 +0000 UTC m=+0.475112445 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:14:25 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:14:25 np0005548788.localdomain recover_tripleo_nova_virtqemud[103114]: 62021
Dec 06 09:14:25 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:14:25 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:14:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:14:30 np0005548788.localdomain podman[103116]: 2025-12-06 09:14:30.276764657 +0000 UTC m=+0.092946593 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git)
Dec 06 09:14:30 np0005548788.localdomain podman[103116]: 2025-12-06 09:14:30.491674822 +0000 UTC m=+0.307856708 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 09:14:30 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:14:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:14:32 np0005548788.localdomain podman[103145]: 2025-12-06 09:14:32.261072383 +0000 UTC m=+0.089053322 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5)
Dec 06 09:14:32 np0005548788.localdomain podman[103145]: 2025-12-06 09:14:32.321757005 +0000 UTC m=+0.149737934 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step5, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.expose-services=, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:14:32 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:14:33 np0005548788.localdomain sshd[103170]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:33 np0005548788.localdomain sshd[103172]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:34 np0005548788.localdomain sshd[103172]: Received disconnect from 148.227.3.232 port 45144:11: Bye Bye [preauth]
Dec 06 09:14:34 np0005548788.localdomain sshd[103172]: Disconnected from authenticating user root 148.227.3.232 port 45144 [preauth]
Dec 06 09:14:34 np0005548788.localdomain sshd[103170]: Received disconnect from 179.43.189.36 port 56776:11: Bye Bye [preauth]
Dec 06 09:14:34 np0005548788.localdomain sshd[103170]: Disconnected from authenticating user root 179.43.189.36 port 56776 [preauth]
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:14:51 np0005548788.localdomain podman[103174]: 2025-12-06 09:14:51.269472527 +0000 UTC m=+0.090551830 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-cron, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: tmp-crun.C1qoXT.mount: Deactivated successfully.
Dec 06 09:14:51 np0005548788.localdomain podman[103176]: 2025-12-06 09:14:51.334620077 +0000 UTC m=+0.144686038 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:11:48Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:14:51 np0005548788.localdomain podman[103174]: 2025-12-06 09:14:51.353580805 +0000 UTC m=+0.174660138 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-cron-container)
Dec 06 09:14:51 np0005548788.localdomain podman[103175]: 2025-12-06 09:14:51.304041598 +0000 UTC m=+0.119972931 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:14:51 np0005548788.localdomain podman[103176]: 2025-12-06 09:14:51.377826596 +0000 UTC m=+0.187892557 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com)
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:14:51 np0005548788.localdomain podman[103175]: 2025-12-06 09:14:51.388702694 +0000 UTC m=+0.204633967 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, url=https://www.redhat.com, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-type=git, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:14:51 np0005548788.localdomain podman[103189]: 2025-12-06 09:14:51.453502533 +0000 UTC m=+0.251330694 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, 
url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z)
Dec 06 09:14:51 np0005548788.localdomain podman[103189]: 2025-12-06 09:14:51.484558207 +0000 UTC m=+0.282386368 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi)
Dec 06 09:14:51 np0005548788.localdomain podman[103182]: 2025-12-06 09:14:51.49917856 +0000 UTC m=+0.300736068 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid)
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:14:51 np0005548788.localdomain podman[103182]: 2025-12-06 09:14:51.538712336 +0000 UTC m=+0.340269804 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, container_name=iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:14:51 np0005548788.localdomain podman[103287]: 2025-12-06 09:14:51.595188367 +0000 UTC m=+0.084623725 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., release=1761123044, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 09:14:51 np0005548788.localdomain podman[103287]: 2025-12-06 09:14:51.612613897 +0000 UTC m=+0.102049255 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:14:51 np0005548788.localdomain podman[103287]: unhealthy
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:14:51 np0005548788.localdomain podman[103308]: 2025-12-06 09:14:51.729790851 +0000 UTC m=+0.082621582 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:14:51 np0005548788.localdomain podman[103308]: 2025-12-06 09:14:51.745457648 +0000 UTC m=+0.098288309 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Dec 06 09:14:51 np0005548788.localdomain podman[103308]: unhealthy
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:14:51 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:14:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:14:56 np0005548788.localdomain podman[103328]: 2025-12-06 09:14:56.269340048 +0000 UTC m=+0.095491153 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Dec 06 09:14:56 np0005548788.localdomain podman[103328]: 2025-12-06 09:14:56.664888104 +0000 UTC m=+0.491039229 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12)
Dec 06 09:14:56 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:14:57 np0005548788.localdomain sudo[103351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:14:57 np0005548788.localdomain sudo[103351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:14:57 np0005548788.localdomain sudo[103351]: pam_unix(sudo:session): session closed for user root
Dec 06 09:14:57 np0005548788.localdomain sudo[103366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:14:57 np0005548788.localdomain sudo[103366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:14:57 np0005548788.localdomain sudo[103366]: pam_unix(sudo:session): session closed for user root
Dec 06 09:14:58 np0005548788.localdomain sudo[103414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:14:58 np0005548788.localdomain sudo[103414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:14:58 np0005548788.localdomain sudo[103414]: pam_unix(sudo:session): session closed for user root
Dec 06 09:15:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:15:01 np0005548788.localdomain podman[103429]: 2025-12-06 09:15:01.272614087 +0000 UTC m=+0.094457331 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://www.redhat.com, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, architecture=x86_64)
Dec 06 09:15:01 np0005548788.localdomain podman[103429]: 2025-12-06 09:15:01.479681357 +0000 UTC m=+0.301524561 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_id=tripleo_step1, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 09:15:01 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:15:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:15:03 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:15:03 np0005548788.localdomain recover_tripleo_nova_virtqemud[103465]: 62021
Dec 06 09:15:03 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:15:03 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:15:03 np0005548788.localdomain podman[103458]: 2025-12-06 09:15:03.257094717 +0000 UTC m=+0.081270071 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-type=git, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:15:03 np0005548788.localdomain podman[103458]: 2025-12-06 09:15:03.293595589 +0000 UTC m=+0.117770903 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z)
Dec 06 09:15:03 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:15:20 np0005548788.localdomain sshd[103486]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:22 np0005548788.localdomain sshd[103486]: Received disconnect from 36.50.177.119 port 36032:11: Bye Bye [preauth]
Dec 06 09:15:22 np0005548788.localdomain sshd[103486]: Disconnected from authenticating user root 36.50.177.119 port 36032 [preauth]
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: tmp-crun.rfTNLf.mount: Deactivated successfully.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: tmp-crun.RDZMdI.mount: Deactivated successfully.
Dec 06 09:15:22 np0005548788.localdomain podman[103488]: 2025-12-06 09:15:22.206244184 +0000 UTC m=+0.118159045 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com)
Dec 06 09:15:22 np0005548788.localdomain podman[103527]: 2025-12-06 09:15:22.291588881 +0000 UTC m=+0.148831627 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z)
Dec 06 09:15:22 np0005548788.localdomain podman[103527]: 2025-12-06 09:15:22.306470262 +0000 UTC m=+0.163713018 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, release=1761123044, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Dec 06 09:15:22 np0005548788.localdomain podman[103527]: unhealthy
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:15:22 np0005548788.localdomain podman[103528]: 2025-12-06 09:15:22.231901029 +0000 UTC m=+0.086538495 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Dec 06 09:15:22 np0005548788.localdomain podman[103490]: 2025-12-06 09:15:22.258489704 +0000 UTC m=+0.164180703 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vcs-type=git, container_name=collectd, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 09:15:22 np0005548788.localdomain podman[103497]: 2025-12-06 09:15:22.37314852 +0000 UTC m=+0.270507300 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, 
name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-type=git, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:15:22 np0005548788.localdomain podman[103488]: 2025-12-06 09:15:22.391131637 +0000 UTC m=+0.303046518 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, container_name=logrotate_crond, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-cron, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:15:22 np0005548788.localdomain podman[103491]: 2025-12-06 09:15:22.31122934 +0000 UTC m=+0.213456981 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z)
Dec 06 09:15:22 np0005548788.localdomain podman[103497]: 2025-12-06 09:15:22.414492091 +0000 UTC m=+0.311850851 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 09:15:22 np0005548788.localdomain podman[103528]: 2025-12-06 09:15:22.421463928 +0000 UTC m=+0.276101404 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:15:22 np0005548788.localdomain podman[103489]: 2025-12-06 09:15:22.46508182 +0000 UTC m=+0.336855037 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Dec 06 09:15:22 np0005548788.localdomain podman[103489]: 2025-12-06 09:15:22.478535228 +0000 UTC m=+0.350308425 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Dec 06 09:15:22 np0005548788.localdomain podman[103489]: unhealthy
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:15:22 np0005548788.localdomain podman[103490]: 2025-12-06 09:15:22.496294849 +0000 UTC m=+0.401985898 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_id=tripleo_step3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd)
Dec 06 09:15:22 np0005548788.localdomain podman[103491]: 2025-12-06 09:15:22.495836854 +0000 UTC m=+0.398064225 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, release=1761123044, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:15:22 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:15:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:15:27 np0005548788.localdomain podman[103639]: 2025-12-06 09:15:27.27334144 +0000 UTC m=+0.085155532 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 
nova-compute, vcs-type=git, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 09:15:27 np0005548788.localdomain sshd[103663]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:27 np0005548788.localdomain podman[103639]: 2025-12-06 09:15:27.652501038 +0000 UTC m=+0.464315010 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, distribution-scope=public)
Dec 06 09:15:27 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:15:28 np0005548788.localdomain sshd[103663]: Received disconnect from 45.119.84.54 port 58070:11: Bye Bye [preauth]
Dec 06 09:15:28 np0005548788.localdomain sshd[103663]: Disconnected from authenticating user root 45.119.84.54 port 58070 [preauth]
Dec 06 09:15:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:15:32 np0005548788.localdomain podman[103666]: 2025-12-06 09:15:32.259324172 +0000 UTC m=+0.085306837 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, 
com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:15:32 np0005548788.localdomain podman[103666]: 2025-12-06 09:15:32.453779872 +0000 UTC m=+0.279762527 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd)
Dec 06 09:15:32 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:15:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:15:34 np0005548788.localdomain podman[103696]: 2025-12-06 09:15:34.255155635 +0000 UTC m=+0.082686535 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=nova_compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5)
Dec 06 09:15:34 np0005548788.localdomain podman[103696]: 2025-12-06 09:15:34.291553743 +0000 UTC m=+0.119084634 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1)
Dec 06 09:15:34 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Deactivated successfully.
Dec 06 09:15:40 np0005548788.localdomain sshd[103722]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:41 np0005548788.localdomain sshd[103722]: Received disconnect from 179.43.189.36 port 35786:11: Bye Bye [preauth]
Dec 06 09:15:41 np0005548788.localdomain sshd[103722]: Disconnected from authenticating user root 179.43.189.36 port 35786 [preauth]
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:15:53 np0005548788.localdomain podman[103733]: 2025-12-06 09:15:53.296105686 +0000 UTC m=+0.100187698 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:15:53 np0005548788.localdomain podman[103733]: 2025-12-06 09:15:53.333633339 +0000 UTC m=+0.137715341 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-type=git, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 09:15:53 np0005548788.localdomain podman[103725]: 2025-12-06 09:15:53.344589979 +0000 UTC m=+0.159683323 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, config_id=tripleo_step4)
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:15:53 np0005548788.localdomain podman[103725]: 2025-12-06 09:15:53.392111503 +0000 UTC m=+0.207204827 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:15:53 np0005548788.localdomain podman[103725]: unhealthy
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:15:53 np0005548788.localdomain podman[103731]: 2025-12-06 09:15:53.397693556 +0000 UTC m=+0.200977703 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:15:53 np0005548788.localdomain podman[103726]: 2025-12-06 09:15:53.452585429 +0000 UTC m=+0.262099520 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:15:53 np0005548788.localdomain podman[103724]: 2025-12-06 09:15:53.503036423 +0000 UTC m=+0.322583224 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 
17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 09:15:53 np0005548788.localdomain podman[103726]: 2025-12-06 09:15:53.513752256 +0000 UTC m=+0.323266317 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:15:53 np0005548788.localdomain podman[103731]: 2025-12-06 09:15:53.527229194 +0000 UTC m=+0.330513331 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:15:53 np0005548788.localdomain podman[103738]: 2025-12-06 09:15:53.558798443 +0000 UTC m=+0.358347214 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., container_name=ovn_controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:15:53 np0005548788.localdomain podman[103724]: 2025-12-06 09:15:53.565737577 +0000 UTC m=+0.385284348 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git)
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:15:53 np0005548788.localdomain podman[103738]: 2025-12-06 09:15:53.599666639 +0000 UTC m=+0.399215410 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.expose-services=, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:15:53 np0005548788.localdomain podman[103738]: unhealthy
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:15:53 np0005548788.localdomain podman[103746]: 2025-12-06 09:15:53.656528803 +0000 UTC m=+0.453727321 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ceilometer_agent_ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:15:53 np0005548788.localdomain podman[103746]: 2025-12-06 09:15:53.714852232 +0000 UTC m=+0.512050740 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ceilometer_agent_ipmi)
Dec 06 09:15:53 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:15:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:15:58 np0005548788.localdomain systemd[1]: tmp-crun.dT0Oos.mount: Deactivated successfully.
Dec 06 09:15:58 np0005548788.localdomain podman[103869]: 2025-12-06 09:15:58.257485354 +0000 UTC m=+0.083484240 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:15:58 np0005548788.localdomain sudo[103890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:15:58 np0005548788.localdomain sudo[103890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:15:58 np0005548788.localdomain sudo[103890]: pam_unix(sudo:session): session closed for user root
Dec 06 09:15:58 np0005548788.localdomain podman[103869]: 2025-12-06 09:15:58.669309185 +0000 UTC m=+0.495308091 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:15:58 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:15:58 np0005548788.localdomain sudo[103905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:15:58 np0005548788.localdomain sudo[103905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:15:59 np0005548788.localdomain sudo[103905]: pam_unix(sudo:session): session closed for user root
Dec 06 09:15:59 np0005548788.localdomain sudo[103953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:15:59 np0005548788.localdomain sudo[103953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:15:59 np0005548788.localdomain sudo[103953]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:16:03 np0005548788.localdomain podman[103968]: 2025-12-06 09:16:03.274889999 +0000 UTC m=+0.097089263 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1)
Dec 06 09:16:03 np0005548788.localdomain podman[103968]: 2025-12-06 09:16:03.478656358 +0000 UTC m=+0.300855602 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 09:16:03 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:16:05 np0005548788.localdomain sshd[103997]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:16:05 np0005548788.localdomain podman[103999]: 2025-12-06 09:16:05.26584142 +0000 UTC m=+0.089501766 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 09:16:05 np0005548788.localdomain podman[103999]: 2025-12-06 09:16:05.317690507 +0000 UTC m=+0.141350923 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:16:05 np0005548788.localdomain podman[103999]: unhealthy
Dec 06 09:16:05 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:05 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 09:16:09 np0005548788.localdomain sshd[103997]: Received disconnect from 101.47.142.76 port 42770:11: Bye Bye [preauth]
Dec 06 09:16:09 np0005548788.localdomain sshd[103997]: Disconnected from authenticating user root 101.47.142.76 port 42770 [preauth]
Dec 06 09:16:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25330 DF PROTO=TCP SPT=34272 DPT=9102 SEQ=1525795616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D084340000000001030307) 
Dec 06 09:16:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49498 DF PROTO=TCP SPT=42314 DPT=9882 SEQ=1432571632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D084D90000000001030307) 
Dec 06 09:16:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25331 DF PROTO=TCP SPT=34272 DPT=9102 SEQ=1525795616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D088300000000001030307) 
Dec 06 09:16:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49499 DF PROTO=TCP SPT=42314 DPT=9882 SEQ=1432571632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D088F00000000001030307) 
Dec 06 09:16:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25332 DF PROTO=TCP SPT=34272 DPT=9102 SEQ=1525795616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D090300000000001030307) 
Dec 06 09:16:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49500 DF PROTO=TCP SPT=42314 DPT=9882 SEQ=1432571632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D090F00000000001030307) 
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:16:24 np0005548788.localdomain podman[104022]: 2025-12-06 09:16:24.276277495 +0000 UTC m=+0.092667444 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 09:16:24 np0005548788.localdomain podman[104022]: 2025-12-06 09:16:24.29287885 +0000 UTC m=+0.109268859 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, container_name=ovn_metadata_agent, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:16:24 np0005548788.localdomain podman[104022]: unhealthy
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:16:24 np0005548788.localdomain podman[104023]: 2025-12-06 09:16:24.382371496 +0000 UTC m=+0.194698910 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12)
Dec 06 09:16:24 np0005548788.localdomain podman[104023]: 2025-12-06 09:16:24.392657424 +0000 UTC m=+0.204984788 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:16:24 np0005548788.localdomain podman[104031]: 2025-12-06 09:16:24.402863281 +0000 UTC m=+0.202855452 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com)
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:16:24 np0005548788.localdomain podman[104024]: 2025-12-06 09:16:24.346139042 +0000 UTC m=+0.154387129 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:11:48Z)
Dec 06 09:16:24 np0005548788.localdomain podman[104031]: 2025-12-06 09:16:24.469172787 +0000 UTC m=+0.269164988 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com)
Dec 06 09:16:24 np0005548788.localdomain podman[104024]: 2025-12-06 09:16:24.476480073 +0000 UTC m=+0.284728190 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, release=1761123044, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true)
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:16:24 np0005548788.localdomain podman[104021]: 2025-12-06 09:16:24.491039076 +0000 UTC m=+0.310864152 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 06 09:16:24 np0005548788.localdomain podman[104021]: 2025-12-06 09:16:24.503577664 +0000 UTC m=+0.323402750 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:16:24 np0005548788.localdomain podman[104042]: 2025-12-06 09:16:24.453169891 +0000 UTC m=+0.250838500 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4)
Dec 06 09:16:24 np0005548788.localdomain podman[104042]: 2025-12-06 09:16:24.584027359 +0000 UTC m=+0.381695918 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044)
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:16:24 np0005548788.localdomain podman[104036]: 2025-12-06 09:16:24.647253419 +0000 UTC m=+0.448085136 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
architecture=x86_64, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:16:24 np0005548788.localdomain podman[104036]: 2025-12-06 09:16:24.666909949 +0000 UTC m=+0.467741646 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:16:24 np0005548788.localdomain podman[104036]: unhealthy
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:24 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:16:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25333 DF PROTO=TCP SPT=34272 DPT=9102 SEQ=1525795616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D09FF00000000001030307) 
Dec 06 09:16:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49501 DF PROTO=TCP SPT=42314 DPT=9882 SEQ=1432571632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0A0B10000000001030307) 
Dec 06 09:16:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:16:29 np0005548788.localdomain podman[104170]: 2025-12-06 09:16:29.259190441 +0000 UTC m=+0.087519516 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:16:29 np0005548788.localdomain podman[104170]: 2025-12-06 09:16:29.63741384 +0000 UTC m=+0.465742845 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:16:29 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:16:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:16:34 np0005548788.localdomain podman[104193]: 2025-12-06 09:16:34.251097286 +0000 UTC m=+0.082129128 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd)
Dec 06 09:16:34 np0005548788.localdomain podman[104193]: 2025-12-06 09:16:34.490625734 +0000 UTC m=+0.321657526 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, url=https://www.redhat.com, description=Red Hat OpenStack Platform 
17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Dec 06 09:16:34 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:16:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25334 DF PROTO=TCP SPT=34272 DPT=9102 SEQ=1525795616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0BFF00000000001030307) 
Dec 06 09:16:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22117 DF PROTO=TCP SPT=37212 DPT=9101 SEQ=3929335519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0C03E0000000001030307) 
Dec 06 09:16:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49502 DF PROTO=TCP SPT=42314 DPT=9882 SEQ=1432571632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0C1F10000000001030307) 
Dec 06 09:16:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22118 DF PROTO=TCP SPT=37212 DPT=9101 SEQ=3929335519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0C4300000000001030307) 
Dec 06 09:16:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:16:36 np0005548788.localdomain podman[104222]: 2025-12-06 09:16:36.256061701 +0000 UTC m=+0.079656691 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:16:36 np0005548788.localdomain podman[104222]: 2025-12-06 09:16:36.276164214 +0000 UTC m=+0.099759144 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 09:16:36 np0005548788.localdomain podman[104222]: unhealthy
Dec 06 09:16:36 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:36 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 09:16:37 np0005548788.localdomain sshd[104244]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:37 np0005548788.localdomain sshd[104244]: Accepted publickey for zuul from 192.168.122.31 port 50090 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:16:37 np0005548788.localdomain systemd-logind[765]: New session 36 of user zuul.
Dec 06 09:16:37 np0005548788.localdomain systemd[1]: Started Session 36 of User zuul.
Dec 06 09:16:37 np0005548788.localdomain sshd[104244]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:16:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22119 DF PROTO=TCP SPT=37212 DPT=9101 SEQ=3929335519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0CC310000000001030307) 
Dec 06 09:16:38 np0005548788.localdomain sudo[104337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkegjflgdtvndjxsffzototmzxjawvtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012597.9575331-26-207212232975017/AnsiballZ_stat.py
Dec 06 09:16:38 np0005548788.localdomain sudo[104337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:38 np0005548788.localdomain python3.9[104339]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:16:38 np0005548788.localdomain sudo[104337]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:38 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29704 DF PROTO=TCP SPT=46786 DPT=9105 SEQ=2845156926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0CFFF0000000001030307) 
Dec 06 09:16:39 np0005548788.localdomain sshd[104388]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:39 np0005548788.localdomain sudo[104433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgmgulmdjhepmhzrcumvrozkcqluusqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012598.8566778-62-14774637427779/AnsiballZ_command.py
Dec 06 09:16:39 np0005548788.localdomain sudo[104433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:39 np0005548788.localdomain python3.9[104435]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:16:39 np0005548788.localdomain sudo[104433]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:39 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29705 DF PROTO=TCP SPT=46786 DPT=9105 SEQ=2845156926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0D3F10000000001030307) 
Dec 06 09:16:40 np0005548788.localdomain sudo[104526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkyvouibfpixpcrfgultxfvrpprkvrfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012599.815108-86-116447280175168/AnsiballZ_stat.py
Dec 06 09:16:40 np0005548788.localdomain sudo[104526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:40 np0005548788.localdomain python3.9[104528]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:16:40 np0005548788.localdomain sudo[104526]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:40 np0005548788.localdomain sshd[104388]: Received disconnect from 36.50.177.119 port 47292:11: Bye Bye [preauth]
Dec 06 09:16:40 np0005548788.localdomain sshd[104388]: Disconnected from authenticating user root 36.50.177.119 port 47292 [preauth]
Dec 06 09:16:40 np0005548788.localdomain sudo[104620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klnsiczcvtmtdtrufiiiqjfespgdckdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012600.4825037-110-272424281542418/AnsiballZ_command.py
Dec 06 09:16:40 np0005548788.localdomain sudo[104620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:41 np0005548788.localdomain python3.9[104622]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:16:41 np0005548788.localdomain sudo[104620]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:41 np0005548788.localdomain sudo[104713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtcngevpchajkuytvxzisdyddpyzwpfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012601.3358765-137-116608240569471/AnsiballZ_command.py
Dec 06 09:16:41 np0005548788.localdomain sudo[104713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:41 np0005548788.localdomain python3.9[104715]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:16:41 np0005548788.localdomain sudo[104713]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:41 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29706 DF PROTO=TCP SPT=46786 DPT=9105 SEQ=2845156926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0DBF00000000001030307) 
Dec 06 09:16:41 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22120 DF PROTO=TCP SPT=37212 DPT=9101 SEQ=3929335519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0DBF00000000001030307) 
Dec 06 09:16:42 np0005548788.localdomain python3.9[104806]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 06 09:16:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63111 DF PROTO=TCP SPT=46082 DPT=9100 SEQ=2901438704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0E25F0000000001030307) 
Dec 06 09:16:43 np0005548788.localdomain sshd[104847]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:44 np0005548788.localdomain sshd[104899]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:44 np0005548788.localdomain python3.9[104898]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:16:44 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63112 DF PROTO=TCP SPT=46082 DPT=9100 SEQ=2901438704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0E6710000000001030307) 
Dec 06 09:16:44 np0005548788.localdomain python3.9[104992]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 06 09:16:44 np0005548788.localdomain sshd[104899]: Received disconnect from 179.43.189.36 port 42902:11: Bye Bye [preauth]
Dec 06 09:16:44 np0005548788.localdomain sshd[104899]: Disconnected from authenticating user root 179.43.189.36 port 42902 [preauth]
Dec 06 09:16:45 np0005548788.localdomain sshd[104847]: Received disconnect from 45.119.84.54 port 42604:11: Bye Bye [preauth]
Dec 06 09:16:45 np0005548788.localdomain sshd[104847]: Disconnected from authenticating user root 45.119.84.54 port 42604 [preauth]
Dec 06 09:16:45 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29707 DF PROTO=TCP SPT=46786 DPT=9105 SEQ=2845156926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0EBB00000000001030307) 
Dec 06 09:16:46 np0005548788.localdomain python3.9[105082]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:16:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63113 DF PROTO=TCP SPT=46082 DPT=9100 SEQ=2901438704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0EE710000000001030307) 
Dec 06 09:16:46 np0005548788.localdomain python3.9[105130]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:16:47 np0005548788.localdomain sshd[104244]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:16:47 np0005548788.localdomain systemd[1]: session-36.scope: Deactivated successfully.
Dec 06 09:16:47 np0005548788.localdomain systemd[1]: session-36.scope: Consumed 5.130s CPU time.
Dec 06 09:16:47 np0005548788.localdomain systemd-logind[765]: Session 36 logged out. Waiting for processes to exit.
Dec 06 09:16:47 np0005548788.localdomain systemd-logind[765]: Removed session 36.
Dec 06 09:16:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47263 DF PROTO=TCP SPT=52594 DPT=9102 SEQ=293429940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0F9650000000001030307) 
Dec 06 09:16:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51087 DF PROTO=TCP SPT=40398 DPT=9882 SEQ=949404690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0FA0A0000000001030307) 
Dec 06 09:16:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22121 DF PROTO=TCP SPT=37212 DPT=9101 SEQ=3929335519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0FBF00000000001030307) 
Dec 06 09:16:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47264 DF PROTO=TCP SPT=52594 DPT=9102 SEQ=293429940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D0FD700000000001030307) 
Dec 06 09:16:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47265 DF PROTO=TCP SPT=52594 DPT=9102 SEQ=293429940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D105700000000001030307) 
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:16:55 np0005548788.localdomain recover_tripleo_nova_virtqemud[105195]: 62021
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:16:55 np0005548788.localdomain podman[105164]: 2025-12-06 09:16:55.299325005 +0000 UTC m=+0.100413795 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, container_name=ovn_controller, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12)
Dec 06 09:16:55 np0005548788.localdomain podman[105147]: 2025-12-06 09:16:55.274485424 +0000 UTC m=+0.094488501 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 06 09:16:55 np0005548788.localdomain podman[105164]: 2025-12-06 09:16:55.340376788 +0000 UTC m=+0.141465588 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12)
Dec 06 09:16:55 np0005548788.localdomain podman[105164]: unhealthy
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:16:55 np0005548788.localdomain podman[105147]: 2025-12-06 09:16:55.360500662 +0000 UTC m=+0.180503729 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z)
Dec 06 09:16:55 np0005548788.localdomain podman[105147]: unhealthy
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:16:55 np0005548788.localdomain podman[105146]: 2025-12-06 09:16:55.343800884 +0000 UTC m=+0.164726149 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=logrotate_crond, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron)
Dec 06 09:16:55 np0005548788.localdomain podman[105149]: 2025-12-06 09:16:55.410766491 +0000 UTC m=+0.224910186 container health_status 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, architecture=x86_64)
Dec 06 09:16:55 np0005548788.localdomain podman[105146]: 2025-12-06 09:16:55.42558817 +0000 UTC m=+0.246513425 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=logrotate_crond, architecture=x86_64, release=1761123044, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:16:55 np0005548788.localdomain podman[105149]: 2025-12-06 09:16:55.444448865 +0000 UTC m=+0.258592560 container exec_died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=)
Dec 06 09:16:55 np0005548788.localdomain podman[105148]: 2025-12-06 09:16:55.446923752 +0000 UTC m=+0.253301186 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Deactivated successfully.
Dec 06 09:16:55 np0005548788.localdomain podman[105172]: 2025-12-06 09:16:55.509627816 +0000 UTC m=+0.309748596 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Dec 06 09:16:55 np0005548788.localdomain podman[105148]: 2025-12-06 09:16:55.531159504 +0000 UTC m=+0.337536938 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, com.redhat.component=openstack-collectd-container)
Dec 06 09:16:55 np0005548788.localdomain podman[105172]: 2025-12-06 09:16:55.542046372 +0000 UTC m=+0.342167192 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:16:55 np0005548788.localdomain podman[105152]: 2025-12-06 09:16:55.549558584 +0000 UTC m=+0.356418554 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid)
Dec 06 09:16:55 np0005548788.localdomain podman[105152]: 2025-12-06 09:16:55.633583351 +0000 UTC m=+0.440443290 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, 
distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:16:55 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:16:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47266 DF PROTO=TCP SPT=52594 DPT=9102 SEQ=293429940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D115310000000001030307) 
Dec 06 09:16:57 np0005548788.localdomain sshd[105301]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:57 np0005548788.localdomain sshd[105301]: Accepted publickey for zuul from 192.168.122.31 port 47316 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:16:57 np0005548788.localdomain systemd-logind[765]: New session 37 of user zuul.
Dec 06 09:16:57 np0005548788.localdomain systemd[1]: Started Session 37 of User zuul.
Dec 06 09:16:57 np0005548788.localdomain sshd[105301]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:16:58 np0005548788.localdomain sudo[105394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyqrjntiktgfqwrgoaeedudnoetdlpmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012617.7839391-23-184271721412819/AnsiballZ_systemd_service.py
Dec 06 09:16:58 np0005548788.localdomain sudo[105394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:58 np0005548788.localdomain python3.9[105396]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:16:58 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:16:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63115 DF PROTO=TCP SPT=46082 DPT=9100 SEQ=2901438704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D11DF00000000001030307) 
Dec 06 09:16:58 np0005548788.localdomain systemd-rc-local-generator[105419]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:16:58 np0005548788.localdomain systemd-sysv-generator[105422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:16:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:16:59 np0005548788.localdomain sudo[105394]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:00 np0005548788.localdomain python3.9[105522]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:17:00 np0005548788.localdomain network[105539]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:17:00 np0005548788.localdomain network[105540]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:17:00 np0005548788.localdomain network[105541]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:17:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:17:00 np0005548788.localdomain sudo[105551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:17:00 np0005548788.localdomain sudo[105551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:17:00 np0005548788.localdomain sudo[105551]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:00 np0005548788.localdomain podman[105545]: 2025-12-06 09:17:00.278711451 +0000 UTC m=+0.101692095 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=)
Dec 06 09:17:00 np0005548788.localdomain sudo[105578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:17:00 np0005548788.localdomain sudo[105578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:17:00 np0005548788.localdomain podman[105545]: 2025-12-06 09:17:00.64948922 +0000 UTC m=+0.472469864 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12)
Dec 06 09:17:00 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:17:01 np0005548788.localdomain sudo[105578]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:01 np0005548788.localdomain sshd[105647]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:01 np0005548788.localdomain sudo[105633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:17:01 np0005548788.localdomain sudo[105633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:17:01 np0005548788.localdomain sudo[105633]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:02 np0005548788.localdomain sshd[105647]: Received disconnect from 148.227.3.232 port 54060:11: Bye Bye [preauth]
Dec 06 09:17:02 np0005548788.localdomain sshd[105647]: Disconnected from authenticating user root 148.227.3.232 port 54060 [preauth]
Dec 06 09:17:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:17:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:17:04 np0005548788.localdomain podman[105740]: 2025-12-06 09:17:04.634498349 +0000 UTC m=+0.096901966 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:17:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28098 DF PROTO=TCP SPT=54288 DPT=9101 SEQ=1704503360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1356E0000000001030307) 
Dec 06 09:17:04 np0005548788.localdomain podman[105740]: 2025-12-06 09:17:04.856720391 +0000 UTC m=+0.319123998 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Dec 06 09:17:04 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:17:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47267 DF PROTO=TCP SPT=52594 DPT=9102 SEQ=293429940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D135F00000000001030307) 
Dec 06 09:17:06 np0005548788.localdomain python3.9[105869]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:17:06 np0005548788.localdomain network[105886]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:17:06 np0005548788.localdomain network[105887]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:17:06 np0005548788.localdomain network[105888]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:17:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:17:06 np0005548788.localdomain podman[105894]: 2025-12-06 09:17:06.722928824 +0000 UTC m=+0.086101242 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute)
Dec 06 09:17:06 np0005548788.localdomain podman[105894]: 2025-12-06 09:17:06.741422287 +0000 UTC m=+0.104594705 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute)
Dec 06 09:17:06 np0005548788.localdomain podman[105894]: unhealthy
Dec 06 09:17:07 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:07 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 09:17:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:17:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28100 DF PROTO=TCP SPT=54288 DPT=9101 SEQ=1704503360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D141700000000001030307) 
Dec 06 09:17:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29709 DF PROTO=TCP SPT=46786 DPT=9105 SEQ=2845156926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D14BF00000000001030307) 
Dec 06 09:17:10 np0005548788.localdomain sudo[106107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hviojsccxdkqsxooyomdgphdprprdupm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012630.3978386-113-160362816740695/AnsiballZ_systemd_service.py
Dec 06 09:17:10 np0005548788.localdomain sudo[106107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:17:10 np0005548788.localdomain python3.9[106109]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:17:11 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:17:11 np0005548788.localdomain systemd-rc-local-generator[106139]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:17:11 np0005548788.localdomain systemd-sysv-generator[106142]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:17:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:17:11 np0005548788.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Dec 06 09:17:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27583 DF PROTO=TCP SPT=43486 DPT=9100 SEQ=2545570766 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D157980000000001030307) 
Dec 06 09:17:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27585 DF PROTO=TCP SPT=43486 DPT=9100 SEQ=2545570766 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D163B00000000001030307) 
Dec 06 09:17:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41272 DF PROTO=TCP SPT=55694 DPT=9882 SEQ=4092302388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D16F390000000001030307) 
Dec 06 09:17:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49788 DF PROTO=TCP SPT=39092 DPT=9102 SEQ=5883904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D17AB10000000001030307) 
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:17:25 np0005548788.localdomain podman[106184]: 2025-12-06 09:17:25.861858704 +0000 UTC m=+0.149836618 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ovn-controller)
Dec 06 09:17:25 np0005548788.localdomain podman[106191]: 2025-12-06 09:17:25.820076399 +0000 UTC m=+0.101316683 container health_status e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:17:25 np0005548788.localdomain podman[106184]: 2025-12-06 09:17:25.874671751 +0000 UTC m=+0.162649725 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 06 09:17:25 np0005548788.localdomain podman[106184]: unhealthy
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:25 np0005548788.localdomain podman[106166]: 2025-12-06 09:17:25.886386655 +0000 UTC m=+0.185310828 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12)
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:17:25 np0005548788.localdomain podman[106166]: 2025-12-06 09:17:25.89558719 +0000 UTC m=+0.194511323 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, architecture=x86_64)
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:17:25 np0005548788.localdomain podman[106172]: Error: container 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 is not running
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Main process exited, code=exited, status=125/n/a
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Failed with result 'exit-code'.
Dec 06 09:17:25 np0005548788.localdomain podman[106164]: 2025-12-06 09:17:25.797137337 +0000 UTC m=+0.105783901 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 06 09:17:25 np0005548788.localdomain podman[106191]: 2025-12-06 09:17:25.950824323 +0000 UTC m=+0.232064657 container exec_died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Deactivated successfully.
Dec 06 09:17:25 np0005548788.localdomain podman[106164]: 2025-12-06 09:17:25.979484512 +0000 UTC m=+0.288131056 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:17:25 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:17:26 np0005548788.localdomain podman[106175]: 2025-12-06 09:17:26.066086668 +0000 UTC m=+0.352356738 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:17:26 np0005548788.localdomain podman[106165]: 2025-12-06 09:17:26.099629428 +0000 UTC m=+0.401611596 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:17:26 np0005548788.localdomain podman[106165]: 2025-12-06 09:17:26.119580247 +0000 UTC m=+0.421562445 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, distribution-scope=public, config_id=tripleo_step4)
Dec 06 09:17:26 np0005548788.localdomain podman[106165]: unhealthy
Dec 06 09:17:26 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:26 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:17:26 np0005548788.localdomain podman[106175]: 2025-12-06 09:17:26.15196242 +0000 UTC m=+0.438232460 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, container_name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 09:17:26 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:17:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49789 DF PROTO=TCP SPT=39092 DPT=9102 SEQ=5883904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D18A700000000001030307) 
Dec 06 09:17:26 np0005548788.localdomain systemd[1]: tmp-crun.RxBFe6.mount: Deactivated successfully.
Dec 06 09:17:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27587 DF PROTO=TCP SPT=43486 DPT=9100 SEQ=2545570766 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D193F00000000001030307) 
Dec 06 09:17:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:17:31 np0005548788.localdomain podman[106298]: 2025-12-06 09:17:31.012969937 +0000 UTC m=+0.088502526 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack 
Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044)
Dec 06 09:17:31 np0005548788.localdomain podman[106298]: 2025-12-06 09:17:31.380608418 +0000 UTC m=+0.456140977 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:17:31 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:17:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49790 DF PROTO=TCP SPT=39092 DPT=9102 SEQ=5883904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1A9F00000000001030307) 
Dec 06 09:17:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28105 DF PROTO=TCP SPT=56330 DPT=9101 SEQ=2327181084 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1AA9E0000000001030307) 
Dec 06 09:17:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:17:35 np0005548788.localdomain podman[106321]: 2025-12-06 09:17:35.25059621 +0000 UTC m=+0.077308638 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:17:35 np0005548788.localdomain podman[106321]: 2025-12-06 09:17:35.484688069 +0000 UTC m=+0.311400497 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, konflux.additional-tags=17.1.12 
17.1_20251118.1, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:17:35 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:17:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:17:37 np0005548788.localdomain podman[106351]: 2025-12-06 09:17:37.256916168 +0000 UTC m=+0.081655913 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team)
Dec 06 09:17:37 np0005548788.localdomain podman[106351]: 2025-12-06 09:17:37.280749678 +0000 UTC m=+0.105489433 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:17:37 np0005548788.localdomain podman[106351]: unhealthy
Dec 06 09:17:37 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:37 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 09:17:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28107 DF PROTO=TCP SPT=56330 DPT=9101 SEQ=2327181084 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1B6B00000000001030307) 
Dec 06 09:17:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26364 DF PROTO=TCP SPT=56552 DPT=9105 SEQ=444361219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1C1F00000000001030307) 
Dec 06 09:17:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17458 DF PROTO=TCP SPT=43596 DPT=9100 SEQ=3933713671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1CCBF0000000001030307) 
Dec 06 09:17:45 np0005548788.localdomain sshd[106374]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:46 np0005548788.localdomain sshd[106374]: Received disconnect from 179.43.189.36 port 34166:11: Bye Bye [preauth]
Dec 06 09:17:46 np0005548788.localdomain sshd[106374]: Disconnected from authenticating user root 179.43.189.36 port 34166 [preauth]
Dec 06 09:17:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17460 DF PROTO=TCP SPT=43596 DPT=9100 SEQ=3933713671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1D8B00000000001030307) 
Dec 06 09:17:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50268 DF PROTO=TCP SPT=36412 DPT=9882 SEQ=1353232596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1E46A0000000001030307) 
Dec 06 09:17:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48260 DF PROTO=TCP SPT=42232 DPT=9102 SEQ=2808642685 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1EFB00000000001030307) 
Dec 06 09:17:53 np0005548788.localdomain podman[106150]: time="2025-12-06T09:17:53Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: libpod-57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.scope: Deactivated successfully.
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: libpod-57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.scope: Consumed 5.285s CPU time.
Dec 06 09:17:53 np0005548788.localdomain podman[106150]: 2025-12-06 09:17:53.546227458 +0000 UTC m=+42.091897429 container stop 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Dec 06 09:17:53 np0005548788.localdomain podman[106150]: 2025-12-06 09:17:53.583588766 +0000 UTC m=+42.129258757 container died 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.timer: Deactivated successfully.
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Failed to open /run/systemd/transient/57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: No such file or directory
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669-userdata-shm.mount: Deactivated successfully.
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-78c1ff78886d1d879971aeaf25162bcf2766722ebe427ed02a36d32d0aa52834-merged.mount: Deactivated successfully.
Dec 06 09:17:53 np0005548788.localdomain podman[106150]: 2025-12-06 09:17:53.646735655 +0000 UTC m=+42.192405606 container cleanup 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 06 09:17:53 np0005548788.localdomain podman[106150]: ceilometer_agent_compute
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.timer: Failed to open /run/systemd/transient/57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.timer: No such file or directory
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Failed to open /run/systemd/transient/57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: No such file or directory
Dec 06 09:17:53 np0005548788.localdomain podman[106377]: 2025-12-06 09:17:53.661725269 +0000 UTC m=+0.094075657 container cleanup 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: libpod-conmon-57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.scope: Deactivated successfully.
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.timer: Failed to open /run/systemd/transient/57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.timer: No such file or directory
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: Failed to open /run/systemd/transient/57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669.service: No such file or directory
Dec 06 09:17:53 np0005548788.localdomain podman[106389]: 2025-12-06 09:17:53.766698815 +0000 UTC m=+0.070961302 container cleanup 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute)
Dec 06 09:17:53 np0005548788.localdomain podman[106389]: ceilometer_agent_compute
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Dec 06 09:17:53 np0005548788.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.110s CPU time, no IO.
Dec 06 09:17:53 np0005548788.localdomain sudo[106107]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:54 np0005548788.localdomain sudo[106491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nizwghinozbidvueavklqdubllghettx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012673.9414907-113-249537642646965/AnsiballZ_systemd_service.py
Dec 06 09:17:54 np0005548788.localdomain sudo[106491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:17:54 np0005548788.localdomain sshd[106494]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:54 np0005548788.localdomain python3.9[106493]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:17:54 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:17:54 np0005548788.localdomain systemd-sysv-generator[106523]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:17:54 np0005548788.localdomain systemd-rc-local-generator[106519]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:17:54 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:17:55 np0005548788.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:17:56 np0005548788.localdomain podman[106551]: 2025-12-06 09:17:56.276331902 +0000 UTC m=+0.100684253 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, container_name=logrotate_crond, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git)
Dec 06 09:17:56 np0005548788.localdomain podman[106551]: 2025-12-06 09:17:56.286155796 +0000 UTC m=+0.110508087 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-cron, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:17:56 np0005548788.localdomain podman[106563]: Error: container e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf is not running
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Main process exited, code=exited, status=125/n/a
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Failed with result 'exit-code'.
Dec 06 09:17:56 np0005548788.localdomain podman[106571]: 2025-12-06 09:17:56.301344818 +0000 UTC m=+0.106437062 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:17:56 np0005548788.localdomain podman[106552]: 2025-12-06 09:17:56.387914793 +0000 UTC m=+0.204255646 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, batch=17.1_20251118.1)
Dec 06 09:17:56 np0005548788.localdomain sshd[106494]: Received disconnect from 36.50.177.119 port 52996:11: Bye Bye [preauth]
Dec 06 09:17:56 np0005548788.localdomain sshd[106494]: Disconnected from authenticating user root 36.50.177.119 port 52996 [preauth]
Dec 06 09:17:56 np0005548788.localdomain podman[106552]: 2025-12-06 09:17:56.433574928 +0000 UTC m=+0.249915781 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:17:56 np0005548788.localdomain podman[106552]: unhealthy
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:17:56 np0005548788.localdomain podman[106571]: 2025-12-06 09:17:56.484731315 +0000 UTC m=+0.289823529 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=iscsid, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, url=https://www.redhat.com)
Dec 06 09:17:56 np0005548788.localdomain podman[106553]: 2025-12-06 09:17:56.500352699 +0000 UTC m=+0.319251461 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-collectd, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:17:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48261 DF PROTO=TCP SPT=42232 DPT=9102 SEQ=2808642685 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D1FF700000000001030307) 
Dec 06 09:17:56 np0005548788.localdomain podman[106553]: 2025-12-06 09:17:56.534722545 +0000 UTC m=+0.353621297 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:17:56 np0005548788.localdomain podman[106554]: 2025-12-06 09:17:56.440012898 +0000 UTC m=+0.254607207 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container)
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:17:56 np0005548788.localdomain podman[106554]: 2025-12-06 09:17:56.573061484 +0000 UTC m=+0.387655803 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z)
Dec 06 09:17:56 np0005548788.localdomain podman[106554]: unhealthy
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:56 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:17:58 np0005548788.localdomain sshd[106658]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17462 DF PROTO=TCP SPT=43596 DPT=9100 SEQ=3933713671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D207F00000000001030307) 
Dec 06 09:18:00 np0005548788.localdomain sshd[106660]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:01 np0005548788.localdomain sshd[106660]: Received disconnect from 45.119.84.54 port 38628:11: Bye Bye [preauth]
Dec 06 09:18:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:18:01 np0005548788.localdomain sshd[106660]: Disconnected from authenticating user root 45.119.84.54 port 38628 [preauth]
Dec 06 09:18:02 np0005548788.localdomain podman[106662]: 2025-12-06 09:18:02.010348193 +0000 UTC m=+0.085392489 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=nova_migration_target, url=https://www.redhat.com)
Dec 06 09:18:02 np0005548788.localdomain sudo[106671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:18:02 np0005548788.localdomain sudo[106671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:18:02 np0005548788.localdomain sudo[106671]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:02 np0005548788.localdomain sudo[106699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:18:02 np0005548788.localdomain sudo[106699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:18:02 np0005548788.localdomain podman[106662]: 2025-12-06 09:18:02.446046434 +0000 UTC m=+0.521090660 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack 
Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:18:02 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:18:02 np0005548788.localdomain sudo[106699]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:03 np0005548788.localdomain sudo[106746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:18:03 np0005548788.localdomain sudo[106746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:18:03 np0005548788.localdomain sudo[106746]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:04 np0005548788.localdomain sshd[106658]: Received disconnect from 45.78.219.195 port 59376:11: Bye Bye [preauth]
Dec 06 09:18:04 np0005548788.localdomain sshd[106658]: Disconnected from authenticating user root 45.78.219.195 port 59376 [preauth]
Dec 06 09:18:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42094 DF PROTO=TCP SPT=34554 DPT=9101 SEQ=3220849883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D21FCE0000000001030307) 
Dec 06 09:18:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48262 DF PROTO=TCP SPT=42232 DPT=9102 SEQ=2808642685 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D21FF00000000001030307) 
Dec 06 09:18:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:18:05 np0005548788.localdomain podman[106761]: 2025-12-06 09:18:05.7672993 +0000 UTC m=+0.087681631 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr)
Dec 06 09:18:05 np0005548788.localdomain podman[106761]: 2025-12-06 09:18:05.973876025 +0000 UTC m=+0.294258316 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, vcs-type=git, version=17.1.12)
Dec 06 09:18:05 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:18:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:18:08 np0005548788.localdomain podman[106789]: 2025-12-06 09:18:08.016300283 +0000 UTC m=+0.085836922 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, url=https://www.redhat.com)
Dec 06 09:18:08 np0005548788.localdomain podman[106789]: 2025-12-06 09:18:08.039330938 +0000 UTC m=+0.108867597 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:18:08 np0005548788.localdomain podman[106789]: unhealthy
Dec 06 09:18:08 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:08 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 09:18:08 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42096 DF PROTO=TCP SPT=34554 DPT=9101 SEQ=3220849883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D22D300000000001030307) 
Dec 06 09:18:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59615 DF PROTO=TCP SPT=47000 DPT=9105 SEQ=3598553396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D235F00000000001030307) 
Dec 06 09:18:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26738 DF PROTO=TCP SPT=42554 DPT=9100 SEQ=675682730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D241EF0000000001030307) 
Dec 06 09:18:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26740 DF PROTO=TCP SPT=42554 DPT=9100 SEQ=675682730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D24DF00000000001030307) 
Dec 06 09:18:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22995 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=2289745797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D258F80000000001030307) 
Dec 06 09:18:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22997 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=2289745797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D264F00000000001030307) 
Dec 06 09:18:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22998 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=2289745797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D274B00000000001030307) 
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: tmp-crun.xYPU56.mount: Deactivated successfully.
Dec 06 09:18:26 np0005548788.localdomain podman[106815]: 2025-12-06 09:18:26.787250772 +0000 UTC m=+0.101126117 container health_status 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=)
Dec 06 09:18:26 np0005548788.localdomain podman[106835]: Error: container e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf is not running
Dec 06 09:18:26 np0005548788.localdomain podman[106814]: 2025-12-06 09:18:26.825290532 +0000 UTC m=+0.141671115 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent)
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Main process exited, code=exited, status=125/n/a
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Failed with result 'exit-code'.
Dec 06 09:18:26 np0005548788.localdomain podman[106815]: 2025-12-06 09:18:26.850625467 +0000 UTC m=+0.164500822 container exec_died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, tcib_managed=true, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1)
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Deactivated successfully.
Dec 06 09:18:26 np0005548788.localdomain podman[106814]: 2025-12-06 09:18:26.861769813 +0000 UTC m=+0.178150406 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git)
Dec 06 09:18:26 np0005548788.localdomain podman[106814]: unhealthy
Dec 06 09:18:26 np0005548788.localdomain podman[106813]: 2025-12-06 09:18:26.766511159 +0000 UTC m=+0.086281717 container health_status 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:32Z)
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:18:26 np0005548788.localdomain podman[106813]: 2025-12-06 09:18:26.900670138 +0000 UTC m=+0.220440706 container exec_died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, architecture=x86_64, version=17.1.12, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:18:26 np0005548788.localdomain podman[106824]: 2025-12-06 09:18:26.803436274 +0000 UTC m=+0.108580189 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 09:18:26 np0005548788.localdomain podman[106824]: 2025-12-06 09:18:26.933511267 +0000 UTC m=+0.238655232 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, release=1761123044, version=17.1.12, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 09:18:26 np0005548788.localdomain podman[106824]: unhealthy
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:26 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:18:26 np0005548788.localdomain podman[106819]: 2025-12-06 09:18:26.982190727 +0000 UTC m=+0.290071687 container health_status 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, name=rhosp17/openstack-iscsid, version=17.1.12, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:18:27 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Deactivated successfully.
Dec 06 09:18:27 np0005548788.localdomain podman[106819]: 2025-12-06 09:18:27.020777553 +0000 UTC m=+0.328658563 container exec_died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044)
Dec 06 09:18:27 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Deactivated successfully.
Dec 06 09:18:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26742 DF PROTO=TCP SPT=42554 DPT=9100 SEQ=675682730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D27DF10000000001030307) 
Dec 06 09:18:31 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:18:31 np0005548788.localdomain recover_tripleo_nova_virtqemud[106917]: 62021
Dec 06 09:18:31 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:18:31 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:18:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:18:33 np0005548788.localdomain podman[106918]: 2025-12-06 09:18:33.261523908 +0000 UTC m=+0.085875905 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:18:33 np0005548788.localdomain podman[106918]: 2025-12-06 09:18:33.66177078 +0000 UTC m=+0.486122837 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:18:33 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:18:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55769 DF PROTO=TCP SPT=38486 DPT=9101 SEQ=4035988435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D294FE0000000001030307) 
Dec 06 09:18:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49136 DF PROTO=TCP SPT=53580 DPT=9882 SEQ=151742411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D295F00000000001030307) 
Dec 06 09:18:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:18:36 np0005548788.localdomain systemd[1]: tmp-crun.pWbNMf.mount: Deactivated successfully.
Dec 06 09:18:36 np0005548788.localdomain podman[106941]: 2025-12-06 09:18:36.259596552 +0000 UTC m=+0.090269670 container health_status 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, version=17.1.12, config_id=tripleo_step1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:18:36 np0005548788.localdomain podman[106941]: 2025-12-06 09:18:36.463911108 +0000 UTC m=+0.294584246 container exec_died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, 
container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step1, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 09:18:36 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Deactivated successfully.
Dec 06 09:18:37 np0005548788.localdomain podman[106536]: time="2025-12-06T09:18:37Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: libpod-e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.scope: Deactivated successfully.
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: libpod-e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.scope: Consumed 5.532s CPU time.
Dec 06 09:18:37 np0005548788.localdomain podman[106536]: 2025-12-06 09:18:37.202301357 +0000 UTC m=+42.088965691 container stop e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 09:18:37 np0005548788.localdomain podman[106536]: 2025-12-06 09:18:37.235505517 +0000 UTC m=+42.122169831 container died e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public)
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.timer: Deactivated successfully.
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Failed to open /run/systemd/transient/e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: No such file or directory
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c957b2bd9594b3144e0926de62e77156dbabeefc8e2acf756f315a98b85b5f52-merged.mount: Deactivated successfully.
Dec 06 09:18:37 np0005548788.localdomain podman[106536]: 2025-12-06 09:18:37.289778849 +0000 UTC m=+42.176443113 container cleanup e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:18:37 np0005548788.localdomain podman[106536]: ceilometer_agent_ipmi
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.timer: Failed to open /run/systemd/transient/e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.timer: No such file or directory
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Failed to open /run/systemd/transient/e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: No such file or directory
Dec 06 09:18:37 np0005548788.localdomain podman[106972]: 2025-12-06 09:18:37.305654482 +0000 UTC m=+0.083745048 container cleanup e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: libpod-conmon-e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.scope: Deactivated successfully.
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.timer: Failed to open /run/systemd/transient/e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.timer: No such file or directory
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: Failed to open /run/systemd/transient/e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf.service: No such file or directory
Dec 06 09:18:37 np0005548788.localdomain podman[106986]: 2025-12-06 09:18:37.406490149 +0000 UTC m=+0.069755824 container cleanup e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4)
Dec 06 09:18:37 np0005548788.localdomain podman[106986]: ceilometer_agent_ipmi
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Dec 06 09:18:37 np0005548788.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Dec 06 09:18:37 np0005548788.localdomain sudo[106491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55771 DF PROTO=TCP SPT=38486 DPT=9101 SEQ=4035988435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D2A0F00000000001030307) 
Dec 06 09:18:37 np0005548788.localdomain sudo[107086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ismhykmnszjygprbkgqzotygzcemseza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012717.6251159-113-98894476250043/AnsiballZ_systemd_service.py
Dec 06 09:18:37 np0005548788.localdomain sudo[107086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:18:38 np0005548788.localdomain python3.9[107088]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:38 np0005548788.localdomain podman[107089]: 2025-12-06 09:18:38.266623133 +0000 UTC m=+0.092793289 container health_status 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z)
Dec 06 09:18:38 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:18:38 np0005548788.localdomain podman[107089]: 2025-12-06 09:18:38.31009428 +0000 UTC m=+0.136264426 container exec_died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step5, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Dec 06 09:18:38 np0005548788.localdomain podman[107089]: unhealthy
Dec 06 09:18:38 np0005548788.localdomain systemd-sysv-generator[107140]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:38 np0005548788.localdomain systemd-rc-local-generator[107137]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:38 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:38 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 09:18:38 np0005548788.localdomain systemd[1]: Stopping collectd container...
Dec 06 09:18:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43045 DF PROTO=TCP SPT=37984 DPT=9105 SEQ=4191079405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D2ABF00000000001030307) 
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: libpod-33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.scope: Deactivated successfully.
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: libpod-33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.scope: Consumed 2.153s CPU time.
Dec 06 09:18:42 np0005548788.localdomain podman[107151]: 2025-12-06 09:18:42.708711425 +0000 UTC m=+4.028958172 container died 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, url=https://www.redhat.com, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.timer: Deactivated successfully.
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Failed to open /run/systemd/transient/33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: No such file or directory
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-fdd5208a191e3dd329f7e505764b52a58d757ae8eaee9e9d3bc670d6f12b2b08-merged.mount: Deactivated successfully.
Dec 06 09:18:42 np0005548788.localdomain podman[107151]: 2025-12-06 09:18:42.756067024 +0000 UTC m=+4.076313751 container cleanup 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd)
Dec 06 09:18:42 np0005548788.localdomain podman[107151]: collectd
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.timer: Failed to open /run/systemd/transient/33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.timer: No such file or directory
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Failed to open /run/systemd/transient/33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: No such file or directory
Dec 06 09:18:42 np0005548788.localdomain podman[107165]: 2025-12-06 09:18:42.816933102 +0000 UTC m=+0.093553162 container cleanup 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: libpod-conmon-33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.scope: Deactivated successfully.
Dec 06 09:18:42 np0005548788.localdomain podman[107194]: error opening file `/run/crun/33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d/status`: No such file or directory
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.timer: Failed to open /run/systemd/transient/33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.timer: No such file or directory
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: Failed to open /run/systemd/transient/33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d.service: No such file or directory
Dec 06 09:18:42 np0005548788.localdomain podman[107182]: 2025-12-06 09:18:42.933795016 +0000 UTC m=+0.081857839 container cleanup 33a1e6cc13f401655e3c1a14bfacd9faed982c9176486a6f1d27262a0f1c3d5d (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-collectd-container, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 09:18:42 np0005548788.localdomain podman[107182]: collectd
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Dec 06 09:18:42 np0005548788.localdomain systemd[1]: Stopped collectd container.
Dec 06 09:18:42 np0005548788.localdomain sudo[107086]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:43 np0005548788.localdomain sudo[107286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgcfsckzdyrooblzmjcbcbqwzqgjbeon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012723.1132429-113-150471844692348/AnsiballZ_systemd_service.py
Dec 06 09:18:43 np0005548788.localdomain sudo[107286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32887 DF PROTO=TCP SPT=37558 DPT=9100 SEQ=3295680806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D2B7200000000001030307) 
Dec 06 09:18:43 np0005548788.localdomain python3.9[107288]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:43 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:18:43 np0005548788.localdomain systemd-rc-local-generator[107313]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:43 np0005548788.localdomain systemd-sysv-generator[107321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:43 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: Stopping iscsid container...
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: libpod-6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.scope: Deactivated successfully.
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: libpod-6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.scope: Consumed 1.134s CPU time.
Dec 06 09:18:44 np0005548788.localdomain podman[107329]: 2025-12-06 09:18:44.255035088 +0000 UTC m=+0.076246815 container died 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.timer: Deactivated successfully.
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Failed to open /run/systemd/transient/6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: No such file or directory
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:44 np0005548788.localdomain podman[107329]: 2025-12-06 09:18:44.308430934 +0000 UTC m=+0.129642681 container cleanup 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:18:44 np0005548788.localdomain podman[107329]: iscsid
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.timer: Failed to open /run/systemd/transient/6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.timer: No such file or directory
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Failed to open /run/systemd/transient/6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: No such file or directory
Dec 06 09:18:44 np0005548788.localdomain podman[107343]: 2025-12-06 09:18:44.327043892 +0000 UTC m=+0.059009561 container cleanup 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, container_name=iscsid, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid)
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: libpod-conmon-6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.scope: Deactivated successfully.
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.timer: Failed to open /run/systemd/transient/6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.timer: No such file or directory
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: Failed to open /run/systemd/transient/6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33.service: No such file or directory
Dec 06 09:18:44 np0005548788.localdomain podman[107357]: 2025-12-06 09:18:44.439026134 +0000 UTC m=+0.074001635 container cleanup 6aa24b57c2a72eb2d141da45b07e2cf6752a9296d4cd2e588b44f41925f15d33 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z)
Dec 06 09:18:44 np0005548788.localdomain podman[107357]: iscsid
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: Stopped iscsid container.
Dec 06 09:18:44 np0005548788.localdomain sudo[107286]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-42c49c691de3a79d7c64c1e2b1baf2d52b814d8ec4049f7fba3b2602f1480e6a-merged.mount: Deactivated successfully.
Dec 06 09:18:44 np0005548788.localdomain sudo[107459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qomepmpoxqdwjlzwtlngxyycmstazguv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012724.6075144-113-1645042398971/AnsiballZ_systemd_service.py
Dec 06 09:18:44 np0005548788.localdomain sudo[107459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:45 np0005548788.localdomain python3.9[107461]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:18:45 np0005548788.localdomain systemd-rc-local-generator[107484]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:45 np0005548788.localdomain systemd-sysv-generator[107488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: Stopping logrotate_crond container...
Dec 06 09:18:45 np0005548788.localdomain crond[68953]: (CRON) INFO (Shutting down)
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: libpod-0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.scope: Deactivated successfully.
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: libpod-0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.scope: Consumed 1.127s CPU time.
Dec 06 09:18:45 np0005548788.localdomain podman[107502]: 2025-12-06 09:18:45.725753527 +0000 UTC m=+0.080684492 container died 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.timer: Deactivated successfully.
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Failed to open /run/systemd/transient/0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: No such file or directory
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-836e61d982b1d47c704516d9b7a84248dd1953565408f432976e6b13308663a5-merged.mount: Deactivated successfully.
Dec 06 09:18:45 np0005548788.localdomain podman[107502]: 2025-12-06 09:18:45.776844342 +0000 UTC m=+0.131775297 container cleanup 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, distribution-scope=public, container_name=logrotate_crond, version=17.1.12, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 09:18:45 np0005548788.localdomain podman[107502]: logrotate_crond
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.timer: Failed to open /run/systemd/transient/0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.timer: No such file or directory
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Failed to open /run/systemd/transient/0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: No such file or directory
Dec 06 09:18:45 np0005548788.localdomain podman[107515]: 2025-12-06 09:18:45.821901789 +0000 UTC m=+0.086010818 container cleanup 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=logrotate_crond, vcs-type=git, 
com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: libpod-conmon-0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.scope: Deactivated successfully.
Dec 06 09:18:45 np0005548788.localdomain podman[107546]: error opening file `/run/crun/0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9/status`: No such file or directory
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.timer: Failed to open /run/systemd/transient/0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.timer: No such file or directory
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: Failed to open /run/systemd/transient/0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9.service: No such file or directory
Dec 06 09:18:45 np0005548788.localdomain podman[107534]: 2025-12-06 09:18:45.935278055 +0000 UTC m=+0.079760804 container cleanup 0f206c34d970f765fec8e8659cff8ba1b815e5e3a0a342dc3ef16d75900118d9 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 09:18:45 np0005548788.localdomain podman[107534]: logrotate_crond
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Dec 06 09:18:45 np0005548788.localdomain systemd[1]: Stopped logrotate_crond container.
Dec 06 09:18:45 np0005548788.localdomain sudo[107459]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:46 np0005548788.localdomain sudo[107638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aoopumpxgkmbgbdyokdnzifakofxwlvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012726.0940301-113-142179707961284/AnsiballZ_systemd_service.py
Dec 06 09:18:46 np0005548788.localdomain sudo[107638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32889 DF PROTO=TCP SPT=37558 DPT=9100 SEQ=3295680806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D2C3310000000001030307) 
Dec 06 09:18:46 np0005548788.localdomain python3.9[107640]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:46 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:18:46 np0005548788.localdomain systemd-rc-local-generator[107664]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:46 np0005548788.localdomain systemd-sysv-generator[107668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:46 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: Stopping metrics_qdr container...
Dec 06 09:18:47 np0005548788.localdomain kernel: qdrouterd[54717]: segfault at 0 ip 00007fd1f04627cb sp 00007ffe4ea365b0 error 4 in libc.so.6[7fd1f03ff000+175000]
Dec 06 09:18:47 np0005548788.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: Started Process Core Dump (PID 107695/UID 0).
Dec 06 09:18:47 np0005548788.localdomain systemd-coredump[107696]: Resource limits disable core dumping for process 54717 (qdrouterd).
Dec 06 09:18:47 np0005548788.localdomain systemd-coredump[107696]: Process 54717 (qdrouterd) of user 42465 dumped core.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: systemd-coredump@0-107695-0.service: Deactivated successfully.
Dec 06 09:18:47 np0005548788.localdomain podman[107681]: 2025-12-06 09:18:47.408490251 +0000 UTC m=+0.252795350 container died 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible)
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: libpod-3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.scope: Deactivated successfully.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: libpod-3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.scope: Consumed 29.125s CPU time.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.timer: Deactivated successfully.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Failed to open /run/systemd/transient/3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: No such file or directory
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: tmp-crun.x9w7A3.mount: Deactivated successfully.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:47 np0005548788.localdomain podman[107681]: 2025-12-06 09:18:47.527621296 +0000 UTC m=+0.371926365 container cleanup 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=)
Dec 06 09:18:47 np0005548788.localdomain podman[107681]: metrics_qdr
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.timer: Failed to open /run/systemd/transient/3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.timer: No such file or directory
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Failed to open /run/systemd/transient/3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: No such file or directory
Dec 06 09:18:47 np0005548788.localdomain podman[107700]: 2025-12-06 09:18:47.542500077 +0000 UTC m=+0.116117703 container cleanup 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com)
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: libpod-conmon-3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.scope: Deactivated successfully.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.timer: Failed to open /run/systemd/transient/3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.timer: No such file or directory
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: Failed to open /run/systemd/transient/3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f.service: No such file or directory
Dec 06 09:18:47 np0005548788.localdomain podman[107715]: 2025-12-06 09:18:47.646635366 +0000 UTC m=+0.071698325 container cleanup 3d68ca0b1520e204864b4b7e81507863284f5420758fb49a59c606a18538784f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '81961d6936cf88d92c0300cf23428c94'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 09:18:47 np0005548788.localdomain podman[107715]: metrics_qdr
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: Stopped metrics_qdr container.
Dec 06 09:18:47 np0005548788.localdomain sudo[107638]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-33d4ba4a6e0259b5150b68a23f46c9e702457315d900e4a8419ae01ffeed1203-merged.mount: Deactivated successfully.
Dec 06 09:18:47 np0005548788.localdomain sshd[107754]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:48 np0005548788.localdomain sudo[107818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huvcleavdrikncmweakalyzophlrovvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012727.8307354-113-228100258748031/AnsiballZ_systemd_service.py
Dec 06 09:18:48 np0005548788.localdomain sudo[107818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:48 np0005548788.localdomain python3.9[107820]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:48 np0005548788.localdomain sudo[107818]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:48 np0005548788.localdomain sudo[107911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fastmauybliaxdwjtuucjgiobzgnrrpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012728.6128893-113-201643087234150/AnsiballZ_systemd_service.py
Dec 06 09:18:48 np0005548788.localdomain sudo[107911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:49 np0005548788.localdomain python3.9[107913]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:49 np0005548788.localdomain sudo[107911]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:49 np0005548788.localdomain sshd[107926]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13989 DF PROTO=TCP SPT=44384 DPT=9882 SEQ=1416599097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D2CECA0000000001030307) 
Dec 06 09:18:49 np0005548788.localdomain sudo[108006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjisggowspbloorvcsghhpijpfncuyxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012729.374454-113-104864001214605/AnsiballZ_systemd_service.py
Dec 06 09:18:49 np0005548788.localdomain sudo[108006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:49 np0005548788.localdomain python3.9[108008]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:49 np0005548788.localdomain sudo[108006]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:50 np0005548788.localdomain sshd[107926]: Received disconnect from 179.43.189.36 port 48834:11: Bye Bye [preauth]
Dec 06 09:18:50 np0005548788.localdomain sshd[107926]: Disconnected from authenticating user root 179.43.189.36 port 48834 [preauth]
Dec 06 09:18:50 np0005548788.localdomain sudo[108099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrpmfesdgxovgpiavktzdvkgpsnxfnrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012730.090198-113-272947443246466/AnsiballZ_systemd_service.py
Dec 06 09:18:50 np0005548788.localdomain sudo[108099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:50 np0005548788.localdomain python3.9[108101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:50 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:18:50 np0005548788.localdomain systemd-rc-local-generator[108131]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:50 np0005548788.localdomain systemd-sysv-generator[108134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:50 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:51 np0005548788.localdomain systemd[1]: Stopping nova_compute container...
Dec 06 09:18:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38386 DF PROTO=TCP SPT=59278 DPT=9102 SEQ=3909320554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D2DA300000000001030307) 
Dec 06 09:18:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38387 DF PROTO=TCP SPT=59278 DPT=9102 SEQ=3909320554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D2E9F00000000001030307) 
Dec 06 09:18:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:18:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:18:57 np0005548788.localdomain podman[108156]: 2025-12-06 09:18:57.264403005 +0000 UTC m=+0.082075446 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, version=17.1.12, config_id=tripleo_step4, release=1761123044, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:18:57 np0005548788.localdomain podman[108156]: 2025-12-06 09:18:57.309320008 +0000 UTC m=+0.126992439 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com)
Dec 06 09:18:57 np0005548788.localdomain podman[108156]: unhealthy
Dec 06 09:18:57 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:57 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:18:57 np0005548788.localdomain podman[108155]: 2025-12-06 09:18:57.316235073 +0000 UTC m=+0.140464187 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, io.openshift.expose-services=)
Dec 06 09:18:57 np0005548788.localdomain podman[108155]: 2025-12-06 09:18:57.399785704 +0000 UTC m=+0.224014788 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent)
Dec 06 09:18:57 np0005548788.localdomain podman[108155]: unhealthy
Dec 06 09:18:57 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:57 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:18:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32891 DF PROTO=TCP SPT=37558 DPT=9100 SEQ=3295680806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D2F3F00000000001030307) 
Dec 06 09:19:03 np0005548788.localdomain sudo[108193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:19:03 np0005548788.localdomain sudo[108193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:19:03 np0005548788.localdomain sudo[108193]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:03 np0005548788.localdomain sudo[108208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:19:03 np0005548788.localdomain sudo[108208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:19:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:19:04 np0005548788.localdomain podman[108223]: 2025-12-06 09:19:04.013305217 +0000 UTC m=+0.087731442 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, 
container_name=nova_migration_target, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:19:04 np0005548788.localdomain sudo[108208]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:04 np0005548788.localdomain podman[108223]: 2025-12-06 09:19:04.458682008 +0000 UTC m=+0.533108263 container exec_died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:19:04 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:19:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38388 DF PROTO=TCP SPT=59278 DPT=9102 SEQ=3909320554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D309F00000000001030307) 
Dec 06 09:19:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46632 DF PROTO=TCP SPT=34398 DPT=9101 SEQ=4205472505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D30A2E0000000001030307) 
Dec 06 09:19:05 np0005548788.localdomain sudo[108275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:19:05 np0005548788.localdomain sudo[108275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:19:05 np0005548788.localdomain sudo[108275]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46634 DF PROTO=TCP SPT=34398 DPT=9101 SEQ=4205472505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D316300000000001030307) 
Dec 06 09:19:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:19:08 np0005548788.localdomain podman[108290]: Error: container 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 is not running
Dec 06 09:19:08 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Main process exited, code=exited, status=125/n/a
Dec 06 09:19:08 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed with result 'exit-code'.
Dec 06 09:19:11 np0005548788.localdomain sshd[108303]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:11 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46635 DF PROTO=TCP SPT=34398 DPT=9101 SEQ=4205472505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D325F00000000001030307) 
Dec 06 09:19:13 np0005548788.localdomain sshd[108303]: Received disconnect from 36.50.177.119 port 51100:11: Bye Bye [preauth]
Dec 06 09:19:13 np0005548788.localdomain sshd[108303]: Disconnected from authenticating user root 36.50.177.119 port 51100 [preauth]
Dec 06 09:19:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30887 DF PROTO=TCP SPT=46332 DPT=9100 SEQ=1340234631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D32C4F0000000001030307) 
Dec 06 09:19:16 np0005548788.localdomain sshd[108305]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30889 DF PROTO=TCP SPT=46332 DPT=9100 SEQ=1340234631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D338700000000001030307) 
Dec 06 09:19:17 np0005548788.localdomain sshd[108305]: Received disconnect from 45.119.84.54 port 37226:11: Bye Bye [preauth]
Dec 06 09:19:17 np0005548788.localdomain sshd[108305]: Disconnected from authenticating user root 45.119.84.54 port 37226 [preauth]
Dec 06 09:19:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52403 DF PROTO=TCP SPT=39228 DPT=9882 SEQ=673647893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D343FA0000000001030307) 
Dec 06 09:19:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39022 DF PROTO=TCP SPT=47960 DPT=9102 SEQ=3885708257 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D34F700000000001030307) 
Dec 06 09:19:25 np0005548788.localdomain sshd[108307]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:25 np0005548788.localdomain sshd[108307]: Received disconnect from 148.227.3.232 port 37728:11: Bye Bye [preauth]
Dec 06 09:19:25 np0005548788.localdomain sshd[108307]: Disconnected from authenticating user root 148.227.3.232 port 37728 [preauth]
Dec 06 09:19:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39023 DF PROTO=TCP SPT=47960 DPT=9102 SEQ=3885708257 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D35F300000000001030307) 
Dec 06 09:19:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:19:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:19:27 np0005548788.localdomain podman[108309]: 2025-12-06 09:19:27.515768396 +0000 UTC m=+0.090926861 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, 
com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible)
Dec 06 09:19:27 np0005548788.localdomain podman[108310]: 2025-12-06 09:19:27.561310648 +0000 UTC m=+0.131568582 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1)
Dec 06 09:19:27 np0005548788.localdomain podman[108310]: 2025-12-06 09:19:27.576437668 +0000 UTC m=+0.146695621 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Dec 06 09:19:27 np0005548788.localdomain podman[108310]: unhealthy
Dec 06 09:19:27 np0005548788.localdomain podman[108309]: 2025-12-06 09:19:27.587601834 +0000 UTC m=+0.162760299 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, container_name=ovn_controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team)
Dec 06 09:19:27 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:19:27 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:19:27 np0005548788.localdomain podman[108309]: unhealthy
Dec 06 09:19:27 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:19:27 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:19:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30891 DF PROTO=TCP SPT=46332 DPT=9100 SEQ=1340234631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D367F10000000001030307) 
Dec 06 09:19:29 np0005548788.localdomain sshd[107754]: Received disconnect from 101.47.142.76 port 58374:11: Bye Bye [preauth]
Dec 06 09:19:29 np0005548788.localdomain sshd[107754]: Disconnected from 101.47.142.76 port 58374 [preauth]
Dec 06 09:19:33 np0005548788.localdomain podman[108142]: time="2025-12-06T09:19:33Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: libpod-56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.scope: Deactivated successfully.
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: libpod-56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.scope: Consumed 29.330s CPU time.
Dec 06 09:19:33 np0005548788.localdomain podman[108142]: 2025-12-06 09:19:33.114148509 +0000 UTC m=+42.089630642 container died 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, release=1761123044, url=https://www.redhat.com, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.timer: Deactivated successfully.
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed to open /run/systemd/transient/56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: No such file or directory
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-7c45bd41b522a65233004a5964b2c23041fe2bca744b4069684cef8b44acf2fd-merged.mount: Deactivated successfully.
Dec 06 09:19:33 np0005548788.localdomain podman[108142]: 2025-12-06 09:19:33.34729997 +0000 UTC m=+42.322782093 container cleanup 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute)
Dec 06 09:19:33 np0005548788.localdomain podman[108142]: nova_compute
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.timer: Failed to open /run/systemd/transient/56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.timer: No such file or directory
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed to open /run/systemd/transient/56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: No such file or directory
Dec 06 09:19:33 np0005548788.localdomain podman[108352]: 2025-12-06 09:19:33.39212178 +0000 UTC m=+0.268984133 container cleanup 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: libpod-conmon-56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.scope: Deactivated successfully.
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.timer: Failed to open /run/systemd/transient/56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.timer: No such file or directory
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: Failed to open /run/systemd/transient/56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76.service: No such file or directory
Dec 06 09:19:33 np0005548788.localdomain podman[108365]: 2025-12-06 09:19:33.511767619 +0000 UTC m=+0.074623904 container cleanup 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:19:33 np0005548788.localdomain podman[108365]: nova_compute
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: Stopped nova_compute container.
Dec 06 09:19:33 np0005548788.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.193s CPU time, no IO.
Dec 06 09:19:33 np0005548788.localdomain sudo[108099]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:33 np0005548788.localdomain sudo[108467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdsxwgzjbjwtfdjwdprwkzdcitedrpes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012773.6799133-113-53040675903490/AnsiballZ_systemd_service.py
Dec 06 09:19:33 np0005548788.localdomain sudo[108467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:19:34 np0005548788.localdomain python3.9[108469]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:19:34 np0005548788.localdomain systemd-sysv-generator[108499]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:19:34 np0005548788.localdomain systemd-rc-local-generator[108496]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: Stopping nova_migration_target container...
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: tmp-crun.RfoOCA.mount: Deactivated successfully.
Dec 06 09:19:34 np0005548788.localdomain podman[108508]: 2025-12-06 09:19:34.808305126 +0000 UTC m=+0.102475148 container health_status 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 09:19:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38791 DF PROTO=TCP SPT=35860 DPT=9101 SEQ=1100404738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D37F5E0000000001030307) 
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: tmp-crun.UJXoBs.mount: Deactivated successfully.
Dec 06 09:19:34 np0005548788.localdomain sshd[69293]: Received signal 15; terminating.
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: libpod-48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.scope: Deactivated successfully.
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: libpod-48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.scope: Consumed 34.228s CPU time.
Dec 06 09:19:34 np0005548788.localdomain podman[108510]: 2025-12-06 09:19:34.901943721 +0000 UTC m=+0.186442903 container died 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.timer: Deactivated successfully.
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: Stopping /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee...
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Deactivated successfully.
Dec 06 09:19:34 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.
Dec 06 09:19:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39024 DF PROTO=TCP SPT=47960 DPT=9102 SEQ=3885708257 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D37FF00000000001030307) 
Dec 06 09:19:35 np0005548788.localdomain podman[108510]: 2025-12-06 09:19:35.030866249 +0000 UTC m=+0.315365441 container cleanup 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:19:35 np0005548788.localdomain podman[108510]: nova_migration_target
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.timer: Failed to open /run/systemd/transient/48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.timer: No such file or directory
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Failed to open /run/systemd/transient/48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: No such file or directory
Dec 06 09:19:35 np0005548788.localdomain podman[108544]: 2025-12-06 09:19:35.046886095 +0000 UTC m=+0.131321384 container cleanup 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: libpod-conmon-48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.scope: Deactivated successfully.
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.timer: Failed to open /run/systemd/transient/48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.timer: No such file or directory
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: Failed to open /run/systemd/transient/48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee.service: No such file or directory
Dec 06 09:19:35 np0005548788.localdomain podman[108559]: 2025-12-06 09:19:35.14668269 +0000 UTC m=+0.069801305 container cleanup 48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Dec 06 09:19:35 np0005548788.localdomain podman[108559]: nova_migration_target
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: Stopped nova_migration_target container.
Dec 06 09:19:35 np0005548788.localdomain sudo[108467]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:35 np0005548788.localdomain sudo[108660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhgftrouzjantbclmoravwogqxezoqub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012775.3331602-113-140628615720048/AnsiballZ_systemd_service.py
Dec 06 09:19:35 np0005548788.localdomain sudo[108660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-20be973d9ceea79ef95f10e6d248e592805035801284dea3de186096ff60ff28-merged.mount: Deactivated successfully.
Dec 06 09:19:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48f0e9749a1ada0d56fb66c810680f8e16e5bcf66c85bad8d96f567a6db69fee-userdata-shm.mount: Deactivated successfully.
Dec 06 09:19:35 np0005548788.localdomain python3.9[108662]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:19:36 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:19:36 np0005548788.localdomain systemd-sysv-generator[108694]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:19:36 np0005548788.localdomain systemd-rc-local-generator[108690]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:19:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:19:36 np0005548788.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Dec 06 09:19:36 np0005548788.localdomain systemd[1]: libpod-0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439.scope: Deactivated successfully.
Dec 06 09:19:36 np0005548788.localdomain podman[108703]: 2025-12-06 09:19:36.459657937 +0000 UTC m=+0.083471829 container died 0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, container_name=nova_virtlogd_wrapper, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, version=17.1.12)
Dec 06 09:19:36 np0005548788.localdomain podman[108703]: 2025-12-06 09:19:36.495953223 +0000 UTC m=+0.119767025 container cleanup 0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=)
Dec 06 09:19:36 np0005548788.localdomain podman[108703]: nova_virtlogd_wrapper
Dec 06 09:19:36 np0005548788.localdomain podman[108715]: 2025-12-06 09:19:36.545982254 +0000 UTC m=+0.075956967 container cleanup 0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_virtlogd_wrapper, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Dec 06 09:19:36 np0005548788.localdomain systemd[1]: tmp-crun.tSwiWj.mount: Deactivated successfully.
Dec 06 09:19:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763-merged.mount: Deactivated successfully.
Dec 06 09:19:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439-userdata-shm.mount: Deactivated successfully.
Dec 06 09:19:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38793 DF PROTO=TCP SPT=35860 DPT=9101 SEQ=1100404738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D38B700000000001030307) 
Dec 06 09:19:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6043 DF PROTO=TCP SPT=33804 DPT=9105 SEQ=4287723781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D395F10000000001030307) 
Dec 06 09:19:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32445 DF PROTO=TCP SPT=38424 DPT=9100 SEQ=403846457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D3A1800000000001030307) 
Dec 06 09:19:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32447 DF PROTO=TCP SPT=38424 DPT=9100 SEQ=403846457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D3AD700000000001030307) 
Dec 06 09:19:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51924 DF PROTO=TCP SPT=52892 DPT=9882 SEQ=593484320 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D3B92A0000000001030307) 
Dec 06 09:19:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20558 DF PROTO=TCP SPT=42218 DPT=9102 SEQ=288038372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D3C4710000000001030307) 
Dec 06 09:19:54 np0005548788.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:19:54 np0005548788.localdomain recover_tripleo_nova_virtqemud[108733]: 62021
Dec 06 09:19:54 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:19:54 np0005548788.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:19:56 np0005548788.localdomain sshd[108734]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20559 DF PROTO=TCP SPT=42218 DPT=9102 SEQ=288038372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D3D4300000000001030307) 
Dec 06 09:19:57 np0005548788.localdomain sshd[108734]: Received disconnect from 179.43.189.36 port 58906:11: Bye Bye [preauth]
Dec 06 09:19:57 np0005548788.localdomain sshd[108734]: Disconnected from authenticating user root 179.43.189.36 port 58906 [preauth]
Dec 06 09:19:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:19:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:19:58 np0005548788.localdomain podman[108736]: 2025-12-06 09:19:58.012591178 +0000 UTC m=+0.085993917 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 09:19:58 np0005548788.localdomain podman[108736]: 2025-12-06 09:19:58.055036764 +0000 UTC m=+0.128439543 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, tcib_managed=true, release=1761123044, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:19:58 np0005548788.localdomain podman[108736]: unhealthy
Dec 06 09:19:58 np0005548788.localdomain podman[108737]: 2025-12-06 09:19:58.06810934 +0000 UTC m=+0.136081101 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, version=17.1.12, name=rhosp17/openstack-ovn-controller, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Dec 06 09:19:58 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:19:58 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:19:58 np0005548788.localdomain podman[108737]: 2025-12-06 09:19:58.085021285 +0000 UTC m=+0.152992996 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:19:58 np0005548788.localdomain podman[108737]: unhealthy
Dec 06 09:19:58 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:19:58 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:19:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32449 DF PROTO=TCP SPT=38424 DPT=9100 SEQ=403846457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D3DDF00000000001030307) 
Dec 06 09:20:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20560 DF PROTO=TCP SPT=42218 DPT=9102 SEQ=288038372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D3F3F00000000001030307) 
Dec 06 09:20:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4551 DF PROTO=TCP SPT=57016 DPT=9101 SEQ=3372344824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D3F48E0000000001030307) 
Dec 06 09:20:05 np0005548788.localdomain sudo[108777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:20:05 np0005548788.localdomain sudo[108777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:20:05 np0005548788.localdomain sudo[108777]: pam_unix(sudo:session): session closed for user root
Dec 06 09:20:05 np0005548788.localdomain sudo[108792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:20:05 np0005548788.localdomain sudo[108792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:20:05 np0005548788.localdomain sudo[108792]: pam_unix(sudo:session): session closed for user root
Dec 06 09:20:06 np0005548788.localdomain sudo[108840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:20:06 np0005548788.localdomain sudo[108840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:20:06 np0005548788.localdomain sudo[108840]: pam_unix(sudo:session): session closed for user root
Dec 06 09:20:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4553 DF PROTO=TCP SPT=57016 DPT=9101 SEQ=3372344824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D400B00000000001030307) 
Dec 06 09:20:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:20:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 5302 writes, 23K keys, 5302 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5302 writes, 773 syncs, 6.86 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:20:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3812 DF PROTO=TCP SPT=35230 DPT=9105 SEQ=3730826776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D40C190000000001030307) 
Dec 06 09:20:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27449 DF PROTO=TCP SPT=38536 DPT=9100 SEQ=3140982931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D416AF0000000001030307) 
Dec 06 09:20:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:20:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.2 total, 600.0 interval
                                                          Cumulative writes: 5340 writes, 23K keys, 5340 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5340 writes, 664 syncs, 8.04 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:20:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27451 DF PROTO=TCP SPT=38536 DPT=9100 SEQ=3140982931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D422B00000000001030307) 
Dec 06 09:20:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17997 DF PROTO=TCP SPT=34546 DPT=9882 SEQ=3305761797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D42E5A0000000001030307) 
Dec 06 09:20:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58251 DF PROTO=TCP SPT=48670 DPT=9102 SEQ=2277125821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D439B00000000001030307) 
Dec 06 09:20:25 np0005548788.localdomain sshd[108855]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:20:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58252 DF PROTO=TCP SPT=48670 DPT=9102 SEQ=2277125821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D449700000000001030307) 
Dec 06 09:20:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:20:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:20:28 np0005548788.localdomain podman[108858]: 2025-12-06 09:20:28.247102166 +0000 UTC m=+0.069239078 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:20:28 np0005548788.localdomain podman[108857]: 2025-12-06 09:20:28.316407775 +0000 UTC m=+0.137319369 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Dec 06 09:20:28 np0005548788.localdomain podman[108858]: 2025-12-06 09:20:28.344846167 +0000 UTC m=+0.166983129 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, architecture=x86_64)
Dec 06 09:20:28 np0005548788.localdomain podman[108858]: unhealthy
Dec 06 09:20:28 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:20:28 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:20:28 np0005548788.localdomain podman[108857]: 2025-12-06 09:20:28.360601185 +0000 UTC m=+0.181512769 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 09:20:28 np0005548788.localdomain podman[108857]: unhealthy
Dec 06 09:20:28 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:20:28 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:20:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27453 DF PROTO=TCP SPT=38536 DPT=9100 SEQ=3140982931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D451F00000000001030307) 
Dec 06 09:20:31 np0005548788.localdomain sshd[108897]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:20:33 np0005548788.localdomain sshd[108897]: Received disconnect from 36.50.177.119 port 56890:11: Bye Bye [preauth]
Dec 06 09:20:33 np0005548788.localdomain sshd[108897]: Disconnected from authenticating user root 36.50.177.119 port 56890 [preauth]
Dec 06 09:20:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30988 DF PROTO=TCP SPT=38656 DPT=9101 SEQ=3087348791 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D469BE0000000001030307) 
Dec 06 09:20:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18001 DF PROTO=TCP SPT=34546 DPT=9882 SEQ=3305761797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D469F00000000001030307) 
Dec 06 09:20:37 np0005548788.localdomain sshd[108899]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:20:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30990 DF PROTO=TCP SPT=38656 DPT=9101 SEQ=3087348791 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D475B00000000001030307) 
Dec 06 09:20:39 np0005548788.localdomain sshd[108899]: Received disconnect from 45.119.84.54 port 40840:11: Bye Bye [preauth]
Dec 06 09:20:39 np0005548788.localdomain sshd[108899]: Disconnected from authenticating user root 45.119.84.54 port 40840 [preauth]
Dec 06 09:20:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32110 DF PROTO=TCP SPT=41034 DPT=9105 SEQ=2389591724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D47FF00000000001030307) 
Dec 06 09:20:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50006 DF PROTO=TCP SPT=35476 DPT=9100 SEQ=463091152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D48BE00000000001030307) 
Dec 06 09:20:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50008 DF PROTO=TCP SPT=35476 DPT=9100 SEQ=463091152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D497F00000000001030307) 
Dec 06 09:20:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35338 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2428772226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D4A38A0000000001030307) 
Dec 06 09:20:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4280 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=4107328965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D4AEF00000000001030307) 
Dec 06 09:20:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4281 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=4107328965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D4BEB00000000001030307) 
Dec 06 09:20:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:20:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:20:58 np0005548788.localdomain podman[108901]: 2025-12-06 09:20:58.520792436 +0000 UTC m=+0.093034720 container health_status 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:20:58 np0005548788.localdomain podman[108902]: 2025-12-06 09:20:58.563294635 +0000 UTC m=+0.132632469 container health_status 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 09:20:58 np0005548788.localdomain podman[108901]: 2025-12-06 09:20:58.589946422 +0000 UTC m=+0.162188726 container exec_died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 06 09:20:58 np0005548788.localdomain podman[108901]: unhealthy
Dec 06 09:20:58 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:20:58 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed with result 'exit-code'.
Dec 06 09:20:58 np0005548788.localdomain podman[108902]: 2025-12-06 09:20:58.605035421 +0000 UTC m=+0.174373255 container exec_died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, version=17.1.12, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:20:58 np0005548788.localdomain podman[108902]: unhealthy
Dec 06 09:20:58 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:20:58 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed with result 'exit-code'.
Dec 06 09:20:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50010 DF PROTO=TCP SPT=35476 DPT=9100 SEQ=463091152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D4C7F00000000001030307) 
Dec 06 09:21:00 np0005548788.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Dec 06 09:21:00 np0005548788.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61195 (conmon) with signal SIGKILL.
Dec 06 09:21:00 np0005548788.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Dec 06 09:21:00 np0005548788.localdomain systemd[1]: libpod-conmon-0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439.scope: Deactivated successfully.
Dec 06 09:21:00 np0005548788.localdomain podman[108952]: error opening file `/run/crun/0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439/status`: No such file or directory
Dec 06 09:21:00 np0005548788.localdomain podman[108941]: 2025-12-06 09:21:00.747934186 +0000 UTC m=+0.069163838 container cleanup 0486a37789a16a3ca0df5409618b97bb1cae82efb5329ca64af848b785e33439 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_virtlogd_wrapper, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container)
Dec 06 09:21:00 np0005548788.localdomain podman[108941]: nova_virtlogd_wrapper
Dec 06 09:21:00 np0005548788.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Dec 06 09:21:00 np0005548788.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Dec 06 09:21:00 np0005548788.localdomain sudo[108660]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:01 np0005548788.localdomain sudo[109043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofnkagtipvjugcmkjmyowsasgasiwljl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012860.9216816-113-60471641642330/AnsiballZ_systemd_service.py
Dec 06 09:21:01 np0005548788.localdomain sudo[109043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:01 np0005548788.localdomain python3.9[109045]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:01 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:01 np0005548788.localdomain systemd-rc-local-generator[109071]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:01 np0005548788.localdomain systemd-sysv-generator[109078]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:01 np0005548788.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Dec 06 09:21:02 np0005548788.localdomain systemd[1]: libpod-46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0.scope: Deactivated successfully.
Dec 06 09:21:02 np0005548788.localdomain systemd[1]: libpod-46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0.scope: Consumed 1.511s CPU time.
Dec 06 09:21:02 np0005548788.localdomain podman[109086]: 2025-12-06 09:21:02.022704411 +0000 UTC m=+0.084178894 container died 46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtnodedevd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible)
Dec 06 09:21:02 np0005548788.localdomain podman[109086]: 2025-12-06 09:21:02.066518351 +0000 UTC m=+0.127992834 container cleanup 46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtnodedevd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:21:02 np0005548788.localdomain podman[109086]: nova_virtnodedevd
Dec 06 09:21:02 np0005548788.localdomain podman[109100]: 2025-12-06 09:21:02.108880826 +0000 UTC m=+0.070065467 container cleanup 46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 06 09:21:02 np0005548788.localdomain systemd[1]: libpod-conmon-46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0.scope: Deactivated successfully.
Dec 06 09:21:02 np0005548788.localdomain podman[109129]: error opening file `/run/crun/46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0/status`: No such file or directory
Dec 06 09:21:02 np0005548788.localdomain podman[109117]: 2025-12-06 09:21:02.223233336 +0000 UTC m=+0.074505664 container cleanup 46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_virtnodedevd, batch=17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:21:02 np0005548788.localdomain podman[109117]: nova_virtnodedevd
Dec 06 09:21:02 np0005548788.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Dec 06 09:21:02 np0005548788.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Dec 06 09:21:02 np0005548788.localdomain sudo[109043]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:02 np0005548788.localdomain sudo[109220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imzropjwabtxgvonfxzoodcmflztwwyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012862.4098153-113-250376236997366/AnsiballZ_systemd_service.py
Dec 06 09:21:02 np0005548788.localdomain sudo[109220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:02 np0005548788.localdomain python3.9[109222]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2-merged.mount: Deactivated successfully.
Dec 06 09:21:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:04 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:04 np0005548788.localdomain systemd-rc-local-generator[109249]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:04 np0005548788.localdomain systemd-sysv-generator[109254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:04 np0005548788.localdomain systemd[1]: Stopping nova_virtproxyd container...
Dec 06 09:21:04 np0005548788.localdomain systemd[1]: libpod-eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9.scope: Deactivated successfully.
Dec 06 09:21:04 np0005548788.localdomain podman[109263]: 2025-12-06 09:21:04.47947943 +0000 UTC m=+0.088691605 container died eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, architecture=x86_64, config_id=tripleo_step3, version=17.1.12, container_name=nova_virtproxyd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=)
Dec 06 09:21:04 np0005548788.localdomain systemd[1]: tmp-crun.8tzMg1.mount: Deactivated successfully.
Dec 06 09:21:04 np0005548788.localdomain podman[109263]: 2025-12-06 09:21:04.533047673 +0000 UTC m=+0.142259818 container cleanup eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtproxyd, config_id=tripleo_step3, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64)
Dec 06 09:21:04 np0005548788.localdomain podman[109263]: nova_virtproxyd
Dec 06 09:21:04 np0005548788.localdomain podman[109276]: 2025-12-06 09:21:04.556583993 +0000 UTC m=+0.063408379 container cleanup eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtproxyd)
Dec 06 09:21:04 np0005548788.localdomain systemd[1]: libpod-conmon-eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9.scope: Deactivated successfully.
Dec 06 09:21:04 np0005548788.localdomain podman[109305]: error opening file `/run/crun/eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9/status`: No such file or directory
Dec 06 09:21:04 np0005548788.localdomain podman[109292]: 2025-12-06 09:21:04.653969377 +0000 UTC m=+0.063472142 container cleanup eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt)
Dec 06 09:21:04 np0005548788.localdomain podman[109292]: nova_virtproxyd
Dec 06 09:21:04 np0005548788.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Dec 06 09:21:04 np0005548788.localdomain systemd[1]: Stopped nova_virtproxyd container.
Dec 06 09:21:04 np0005548788.localdomain sudo[109220]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49695 DF PROTO=TCP SPT=33114 DPT=9101 SEQ=149207755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D4DEEE0000000001030307) 
Dec 06 09:21:05 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35342 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2428772226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D4DFF00000000001030307) 
Dec 06 09:21:05 np0005548788.localdomain sudo[109396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghumnixktanftmzskcfluakyimnaypps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012864.8310988-113-13660034149007/AnsiballZ_systemd_service.py
Dec 06 09:21:05 np0005548788.localdomain sudo[109396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:05 np0005548788.localdomain python3.9[109398]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7-merged.mount: Deactivated successfully.
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb2a6d636064d043d3933e3573a3d2d22bdb88a31b085265a957c163d03536e9-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:05 np0005548788.localdomain systemd-rc-local-generator[109428]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:05 np0005548788.localdomain systemd-sysv-generator[109431]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: Stopping nova_virtqemud container...
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: libpod-77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732.scope: Deactivated successfully.
Dec 06 09:21:05 np0005548788.localdomain systemd[1]: libpod-77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732.scope: Consumed 2.260s CPU time.
Dec 06 09:21:05 np0005548788.localdomain podman[109439]: 2025-12-06 09:21:05.910252397 +0000 UTC m=+0.084200925 container died 77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, release=1761123044, container_name=nova_virtqemud, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, version=17.1.12)
Dec 06 09:21:05 np0005548788.localdomain podman[109439]: 2025-12-06 09:21:05.942130467 +0000 UTC m=+0.116078955 container cleanup 77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:21:05 np0005548788.localdomain podman[109439]: nova_virtqemud
Dec 06 09:21:05 np0005548788.localdomain podman[109453]: 2025-12-06 09:21:05.996028031 +0000 UTC m=+0.070577342 container cleanup 77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Dec 06 09:21:06 np0005548788.localdomain systemd[1]: libpod-conmon-77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732.scope: Deactivated successfully.
Dec 06 09:21:06 np0005548788.localdomain podman[109481]: error opening file `/run/crun/77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732/status`: No such file or directory
Dec 06 09:21:06 np0005548788.localdomain podman[109469]: 2025-12-06 09:21:06.099233045 +0000 UTC m=+0.070260473 container cleanup 77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.expose-services=, container_name=nova_virtqemud, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step3)
Dec 06 09:21:06 np0005548788.localdomain podman[109469]: nova_virtqemud
Dec 06 09:21:06 np0005548788.localdomain systemd[1]: tripleo_nova_virtqemud.service: Deactivated successfully.
Dec 06 09:21:06 np0005548788.localdomain systemd[1]: Stopped nova_virtqemud container.
Dec 06 09:21:06 np0005548788.localdomain sudo[109396]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437-merged.mount: Deactivated successfully.
Dec 06 09:21:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:06 np0005548788.localdomain sudo[109572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uolcycbwuzksodzzjntzmvywkmablhzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012866.2767775-113-278438844993776/AnsiballZ_systemd_service.py
Dec 06 09:21:06 np0005548788.localdomain sudo[109572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:06 np0005548788.localdomain python3.9[109574]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:06 np0005548788.localdomain sudo[109575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:21:06 np0005548788.localdomain sudo[109575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:21:06 np0005548788.localdomain sudo[109575]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:06 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:06 np0005548788.localdomain sudo[109593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:21:06 np0005548788.localdomain systemd-sysv-generator[109634]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:06 np0005548788.localdomain systemd-rc-local-generator[109627]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:07 np0005548788.localdomain sudo[109593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:21:07 np0005548788.localdomain sudo[109572]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:07 np0005548788.localdomain sudo[109745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olbjjeotvstwzvzoihfnhjachqgitvyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012867.3179517-113-149984143397239/AnsiballZ_systemd_service.py
Dec 06 09:21:07 np0005548788.localdomain sudo[109745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:07 np0005548788.localdomain sudo[109593]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49697 DF PROTO=TCP SPT=33114 DPT=9101 SEQ=149207755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D4EAF00000000001030307) 
Dec 06 09:21:07 np0005548788.localdomain python3.9[109747]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:07 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:08 np0005548788.localdomain systemd-sysv-generator[109796]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:08 np0005548788.localdomain systemd-rc-local-generator[109792]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:08 np0005548788.localdomain systemd[1]: Stopping nova_virtsecretd container...
Dec 06 09:21:08 np0005548788.localdomain systemd[1]: libpod-e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199.scope: Deactivated successfully.
Dec 06 09:21:08 np0005548788.localdomain podman[109804]: 2025-12-06 09:21:08.406003487 +0000 UTC m=+0.062216692 container died e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:21:08 np0005548788.localdomain systemd[1]: tmp-crun.nUpL2B.mount: Deactivated successfully.
Dec 06 09:21:08 np0005548788.localdomain podman[109804]: 2025-12-06 09:21:08.450746196 +0000 UTC m=+0.106959411 container cleanup e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtsecretd, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Dec 06 09:21:08 np0005548788.localdomain podman[109804]: nova_virtsecretd
Dec 06 09:21:08 np0005548788.localdomain podman[109817]: 2025-12-06 09:21:08.477697033 +0000 UTC m=+0.061369506 container cleanup e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=nova_virtsecretd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt)
Dec 06 09:21:08 np0005548788.localdomain systemd[1]: libpod-conmon-e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199.scope: Deactivated successfully.
Dec 06 09:21:08 np0005548788.localdomain podman[109844]: error opening file `/run/crun/e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199/status`: No such file or directory
Dec 06 09:21:08 np0005548788.localdomain podman[109832]: 2025-12-06 09:21:08.589388341 +0000 UTC m=+0.073427102 container cleanup e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, vcs-type=git, container_name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, name=rhosp17/openstack-nova-libvirt)
Dec 06 09:21:08 np0005548788.localdomain podman[109832]: nova_virtsecretd
Dec 06 09:21:08 np0005548788.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Dec 06 09:21:08 np0005548788.localdomain systemd[1]: Stopped nova_virtsecretd container.
Dec 06 09:21:08 np0005548788.localdomain sudo[109745]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:09 np0005548788.localdomain sudo[109935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fymwwasjmrkwifsscvyupzxozinezroy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012868.935783-113-147035667264784/AnsiballZ_systemd_service.py
Dec 06 09:21:09 np0005548788.localdomain sudo[109935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df-merged.mount: Deactivated successfully.
Dec 06 09:21:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:09 np0005548788.localdomain python3.9[109937]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:09 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:09 np0005548788.localdomain systemd-sysv-generator[109965]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:09 np0005548788.localdomain systemd-rc-local-generator[109962]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:09 np0005548788.localdomain systemd[1]: Stopping nova_virtstoraged container...
Dec 06 09:21:09 np0005548788.localdomain systemd[1]: libpod-e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1.scope: Deactivated successfully.
Dec 06 09:21:10 np0005548788.localdomain podman[109978]: 2025-12-06 09:21:09.999439555 +0000 UTC m=+0.082014417 container died e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-nova-libvirt, version=17.1.12, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, container_name=nova_virtstoraged, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:21:10 np0005548788.localdomain podman[109978]: 2025-12-06 09:21:10.035605017 +0000 UTC m=+0.118179849 container cleanup e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, container_name=nova_virtstoraged, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:21:10 np0005548788.localdomain podman[109978]: nova_virtstoraged
Dec 06 09:21:10 np0005548788.localdomain podman[109992]: 2025-12-06 09:21:10.08849757 +0000 UTC m=+0.070455718 container cleanup e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_virtstoraged, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:21:10 np0005548788.localdomain systemd[1]: libpod-conmon-e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1.scope: Deactivated successfully.
Dec 06 09:21:10 np0005548788.localdomain podman[110022]: error opening file `/run/crun/e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1/status`: No such file or directory
Dec 06 09:21:10 np0005548788.localdomain podman[110009]: 2025-12-06 09:21:10.206060689 +0000 UTC m=+0.079042424 container cleanup e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtstoraged, distribution-scope=public, io.openshift.expose-services=, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '558ed7a6d0c1bb3d92c212dc57d9717b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:21:10 np0005548788.localdomain podman[110009]: nova_virtstoraged
Dec 06 09:21:10 np0005548788.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Dec 06 09:21:10 np0005548788.localdomain systemd[1]: Stopped nova_virtstoraged container.
Dec 06 09:21:10 np0005548788.localdomain sudo[109935]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d-merged.mount: Deactivated successfully.
Dec 06 09:21:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6c41cc85e63de00dde6b03c03713b0a85c2705c6e7903d4fd908bb3c298e5c1-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:10 np0005548788.localdomain sudo[110083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:21:10 np0005548788.localdomain sudo[110083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:21:10 np0005548788.localdomain sudo[110083]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:10 np0005548788.localdomain sudo[110128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkfkjficyokvqjbsfsdkvmgkvmbfmhrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012870.389993-113-260187768319848/AnsiballZ_systemd_service.py
Dec 06 09:21:10 np0005548788.localdomain sudo[110128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37511 DF PROTO=TCP SPT=58774 DPT=9105 SEQ=1585469996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D4F5F10000000001030307) 
Dec 06 09:21:10 np0005548788.localdomain python3.9[110130]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:11 np0005548788.localdomain systemd-rc-local-generator[110159]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:11 np0005548788.localdomain systemd-sysv-generator[110163]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: Stopping ovn_controller container...
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: tmp-crun.JT0Gld.mount: Deactivated successfully.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: libpod-6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.scope: Deactivated successfully.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: libpod-6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.scope: Consumed 2.722s CPU time.
Dec 06 09:21:11 np0005548788.localdomain podman[110172]: 2025-12-06 09:21:11.503432736 +0000 UTC m=+0.130789862 container stop 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64)
Dec 06 09:21:11 np0005548788.localdomain podman[110172]: 2025-12-06 09:21:11.537468812 +0000 UTC m=+0.164825928 container died 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, tcib_managed=true, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.timer: Deactivated successfully.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed to open /run/systemd/transient/6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: No such file or directory
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d144d97f229d6ea4f1176db9ee5f38246c70badf684dca92813d3fa706be8063-merged.mount: Deactivated successfully.
Dec 06 09:21:11 np0005548788.localdomain podman[110172]: 2025-12-06 09:21:11.643572116 +0000 UTC m=+0.270929232 container cleanup 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-type=git, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 06 09:21:11 np0005548788.localdomain podman[110172]: ovn_controller
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.timer: Failed to open /run/systemd/transient/6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.timer: No such file or directory
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed to open /run/systemd/transient/6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: No such file or directory
Dec 06 09:21:11 np0005548788.localdomain podman[110186]: 2025-12-06 09:21:11.65689613 +0000 UTC m=+0.138495961 container cleanup 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, container_name=ovn_controller, batch=17.1_20251118.1)
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: libpod-conmon-6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.scope: Deactivated successfully.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.timer: Failed to open /run/systemd/transient/6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.timer: No such file or directory
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: Failed to open /run/systemd/transient/6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef.service: No such file or directory
Dec 06 09:21:11 np0005548788.localdomain podman[110199]: 2025-12-06 09:21:11.768409081 +0000 UTC m=+0.076859096 container cleanup 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:21:11 np0005548788.localdomain podman[110199]: ovn_controller
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Dec 06 09:21:11 np0005548788.localdomain systemd[1]: Stopped ovn_controller container.
Dec 06 09:21:11 np0005548788.localdomain sudo[110128]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:12 np0005548788.localdomain sudo[110299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmxwegxmpglfkcytuzmjtrhyoecttzrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012871.9642613-113-212364458706609/AnsiballZ_systemd_service.py
Dec 06 09:21:12 np0005548788.localdomain sudo[110299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:12 np0005548788.localdomain python3.9[110301]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:12 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:12 np0005548788.localdomain systemd-rc-local-generator[110321]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:12 np0005548788.localdomain systemd-sysv-generator[110327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:12 np0005548788.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Dec 06 09:21:13 np0005548788.localdomain systemd[1]: tmp-crun.OV3aBr.mount: Deactivated successfully.
Dec 06 09:21:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46002 DF PROTO=TCP SPT=44122 DPT=9100 SEQ=2496696652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5010F0000000001030307) 
Dec 06 09:21:13 np0005548788.localdomain systemd[1]: libpod-215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.scope: Deactivated successfully.
Dec 06 09:21:13 np0005548788.localdomain systemd[1]: libpod-215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.scope: Consumed 10.105s CPU time.
Dec 06 09:21:13 np0005548788.localdomain podman[110342]: 2025-12-06 09:21:13.911039438 +0000 UTC m=+0.930093565 container died 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:21:13 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.timer: Deactivated successfully.
Dec 06 09:21:13 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.
Dec 06 09:21:13 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed to open /run/systemd/transient/215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: No such file or directory
Dec 06 09:21:14 np0005548788.localdomain podman[110342]: 2025-12-06 09:21:14.020886019 +0000 UTC m=+1.039940146 container cleanup 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Dec 06 09:21:14 np0005548788.localdomain podman[110342]: ovn_metadata_agent
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.timer: Failed to open /run/systemd/transient/215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.timer: No such file or directory
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed to open /run/systemd/transient/215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: No such file or directory
Dec 06 09:21:14 np0005548788.localdomain podman[110354]: 2025-12-06 09:21:14.046064471 +0000 UTC m=+0.125711334 container cleanup 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-7aeebb59c2ae55208b2c5ff933ebd16346303bee7a4c212ebe435c707b85240b-merged.mount: Deactivated successfully.
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: libpod-conmon-215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.scope: Deactivated successfully.
Dec 06 09:21:14 np0005548788.localdomain podman[110383]: error opening file `/run/crun/215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86/status`: No such file or directory
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: tmp-crun.CWeiJs.mount: Deactivated successfully.
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.timer: Failed to open /run/systemd/transient/215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.timer: No such file or directory
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: Failed to open /run/systemd/transient/215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86.service: No such file or directory
Dec 06 09:21:14 np0005548788.localdomain podman[110371]: 2025-12-06 09:21:14.169451911 +0000 UTC m=+0.087323332 container cleanup 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, container_name=ovn_metadata_agent, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:21:14 np0005548788.localdomain podman[110371]: ovn_metadata_agent
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Deactivated successfully.
Dec 06 09:21:14 np0005548788.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Dec 06 09:21:14 np0005548788.localdomain sudo[110299]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:14 np0005548788.localdomain sudo[110474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihjxqjlbmyuvbmzhqwohidmllleesdbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012874.3463528-113-58862954341184/AnsiballZ_systemd_service.py
Dec 06 09:21:14 np0005548788.localdomain sudo[110474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:14 np0005548788.localdomain python3.9[110476]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:15 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:21:15 np0005548788.localdomain systemd-sysv-generator[110504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:15 np0005548788.localdomain systemd-rc-local-generator[110500]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:15 np0005548788.localdomain sudo[110474]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46004 DF PROTO=TCP SPT=44122 DPT=9100 SEQ=2496696652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D50D300000000001030307) 
Dec 06 09:21:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12917 DF PROTO=TCP SPT=45264 DPT=9882 SEQ=3836554014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D518BA0000000001030307) 
Dec 06 09:21:21 np0005548788.localdomain sshd[110528]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57151 DF PROTO=TCP SPT=33690 DPT=9102 SEQ=13270051 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D524310000000001030307) 
Dec 06 09:21:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57152 DF PROTO=TCP SPT=33690 DPT=9102 SEQ=13270051 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D533F10000000001030307) 
Dec 06 09:21:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46006 DF PROTO=TCP SPT=44122 DPT=9100 SEQ=2496696652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D53DF00000000001030307) 
Dec 06 09:21:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12921 DF PROTO=TCP SPT=45264 DPT=9882 SEQ=3836554014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D553F00000000001030307) 
Dec 06 09:21:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57153 DF PROTO=TCP SPT=33690 DPT=9102 SEQ=13270051 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D553F00000000001030307) 
Dec 06 09:21:35 np0005548788.localdomain rhsm-service[6597]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 09:21:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40513 DF PROTO=TCP SPT=54260 DPT=9101 SEQ=1260902631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D560310000000001030307) 
Dec 06 09:21:41 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64028 DF PROTO=TCP SPT=50666 DPT=9105 SEQ=697665613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D56FF00000000001030307) 
Dec 06 09:21:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12645 DF PROTO=TCP SPT=38048 DPT=9100 SEQ=3434547849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D576400000000001030307) 
Dec 06 09:21:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12647 DF PROTO=TCP SPT=38048 DPT=9100 SEQ=3434547849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D582310000000001030307) 
Dec 06 09:21:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54572 DF PROTO=TCP SPT=39598 DPT=9882 SEQ=3203860789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D58DEA0000000001030307) 
Dec 06 09:21:52 np0005548788.localdomain sshd[110708]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13810 DF PROTO=TCP SPT=58564 DPT=9102 SEQ=989923807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D599300000000001030307) 
Dec 06 09:21:54 np0005548788.localdomain sshd[110708]: Received disconnect from 36.50.177.119 port 52110:11: Bye Bye [preauth]
Dec 06 09:21:54 np0005548788.localdomain sshd[110708]: Disconnected from authenticating user root 36.50.177.119 port 52110 [preauth]
Dec 06 09:21:55 np0005548788.localdomain sshd[110710]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:56 np0005548788.localdomain sshd[110710]: Received disconnect from 148.227.3.232 port 49632:11: Bye Bye [preauth]
Dec 06 09:21:56 np0005548788.localdomain sshd[110710]: Disconnected from authenticating user root 148.227.3.232 port 49632 [preauth]
Dec 06 09:21:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13811 DF PROTO=TCP SPT=58564 DPT=9102 SEQ=989923807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5A8F10000000001030307) 
Dec 06 09:21:58 np0005548788.localdomain sshd[110712]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12649 DF PROTO=TCP SPT=38048 DPT=9100 SEQ=3434547849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5B1F00000000001030307) 
Dec 06 09:21:59 np0005548788.localdomain sshd[110712]: Received disconnect from 45.119.84.54 port 45686:11: Bye Bye [preauth]
Dec 06 09:21:59 np0005548788.localdomain sshd[110712]: Disconnected from authenticating user root 45.119.84.54 port 45686 [preauth]
Dec 06 09:22:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6151 DF PROTO=TCP SPT=54480 DPT=9101 SEQ=3486348593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5C94E0000000001030307) 
Dec 06 09:22:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54576 DF PROTO=TCP SPT=39598 DPT=9882 SEQ=3203860789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5C9F10000000001030307) 
Dec 06 09:22:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6153 DF PROTO=TCP SPT=54480 DPT=9101 SEQ=3486348593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5D5700000000001030307) 
Dec 06 09:22:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64031 DF PROTO=TCP SPT=50666 DPT=9105 SEQ=697665613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5DFF00000000001030307) 
Dec 06 09:22:10 np0005548788.localdomain sudo[110714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:22:10 np0005548788.localdomain sudo[110714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:10 np0005548788.localdomain sudo[110714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:10 np0005548788.localdomain sudo[110729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:22:10 np0005548788.localdomain sudo[110729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:11 np0005548788.localdomain sudo[110729]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:11 np0005548788.localdomain sudo[110764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:22:11 np0005548788.localdomain sudo[110764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:11 np0005548788.localdomain sudo[110764]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:11 np0005548788.localdomain sudo[110779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:22:11 np0005548788.localdomain sudo[110779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:12 np0005548788.localdomain sudo[110779]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:12 np0005548788.localdomain sudo[110824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:22:12 np0005548788.localdomain sudo[110824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:12 np0005548788.localdomain sudo[110824]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42276 DF PROTO=TCP SPT=35740 DPT=9100 SEQ=2482119836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5EB6F0000000001030307) 
Dec 06 09:22:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42278 DF PROTO=TCP SPT=35740 DPT=9100 SEQ=2482119836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D5F7700000000001030307) 
Dec 06 09:22:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15012 DF PROTO=TCP SPT=60020 DPT=9882 SEQ=46312001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6031A0000000001030307) 
Dec 06 09:22:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19569 DF PROTO=TCP SPT=40596 DPT=9102 SEQ=7544644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D60E700000000001030307) 
Dec 06 09:22:25 np0005548788.localdomain sshd[108855]: fatal: Timeout before authentication for 45.78.219.195 port 40322
Dec 06 09:22:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19570 DF PROTO=TCP SPT=40596 DPT=9102 SEQ=7544644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D61E300000000001030307) 
Dec 06 09:22:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42280 DF PROTO=TCP SPT=35740 DPT=9100 SEQ=2482119836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D627F00000000001030307) 
Dec 06 09:22:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19571 DF PROTO=TCP SPT=40596 DPT=9102 SEQ=7544644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D63DF00000000001030307) 
Dec 06 09:22:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3605 DF PROTO=TCP SPT=60328 DPT=9101 SEQ=1289053188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D63E7E0000000001030307) 
Dec 06 09:22:35 np0005548788.localdomain sshd[105304]: Received disconnect from 192.168.122.31 port 47316:11: disconnected by user
Dec 06 09:22:35 np0005548788.localdomain sshd[105304]: Disconnected from user zuul 192.168.122.31 port 47316
Dec 06 09:22:35 np0005548788.localdomain sshd[105301]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:22:35 np0005548788.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Dec 06 09:22:35 np0005548788.localdomain systemd[1]: session-37.scope: Consumed 19.691s CPU time.
Dec 06 09:22:35 np0005548788.localdomain systemd-logind[765]: Session 37 logged out. Waiting for processes to exit.
Dec 06 09:22:35 np0005548788.localdomain systemd-logind[765]: Removed session 37.
Dec 06 09:22:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3607 DF PROTO=TCP SPT=60328 DPT=9101 SEQ=1289053188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D64A700000000001030307) 
Dec 06 09:22:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51862 DF PROTO=TCP SPT=58552 DPT=9105 SEQ=3768230248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D655F10000000001030307) 
Dec 06 09:22:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38799 DF PROTO=TCP SPT=56402 DPT=9100 SEQ=2741015615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6609F0000000001030307) 
Dec 06 09:22:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38801 DF PROTO=TCP SPT=56402 DPT=9100 SEQ=2741015615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D66CB10000000001030307) 
Dec 06 09:22:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62768 DF PROTO=TCP SPT=52766 DPT=9882 SEQ=2804246970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6784A0000000001030307) 
Dec 06 09:22:51 np0005548788.localdomain sshd[110839]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:52 np0005548788.localdomain sshd[110840]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17376 DF PROTO=TCP SPT=41262 DPT=9102 SEQ=4225964896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D683B00000000001030307) 
Dec 06 09:22:53 np0005548788.localdomain sshd[110839]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 09:22:53 np0005548788.localdomain sshd[110839]: Connection closed by 87.236.176.178 port 58777
Dec 06 09:22:53 np0005548788.localdomain sshd[110842]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:53 np0005548788.localdomain sshd[110842]: Connection closed by 87.236.176.178 port 51631 [preauth]
Dec 06 09:22:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17377 DF PROTO=TCP SPT=41262 DPT=9102 SEQ=4225964896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D693700000000001030307) 
Dec 06 09:22:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38803 DF PROTO=TCP SPT=56402 DPT=9100 SEQ=2741015615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D69BF00000000001030307) 
Dec 06 09:23:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37694 DF PROTO=TCP SPT=51526 DPT=9101 SEQ=1398729786 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6B3AE0000000001030307) 
Dec 06 09:23:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17378 DF PROTO=TCP SPT=41262 DPT=9102 SEQ=4225964896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6B3F00000000001030307) 
Dec 06 09:23:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37696 DF PROTO=TCP SPT=51526 DPT=9101 SEQ=1398729786 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6BFB00000000001030307) 
Dec 06 09:23:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14569 DF PROTO=TCP SPT=42688 DPT=9105 SEQ=3287826769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6C9F00000000001030307) 
Dec 06 09:23:12 np0005548788.localdomain sshd[110844]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:23:12 np0005548788.localdomain sudo[110846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:23:12 np0005548788.localdomain sudo[110846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:23:12 np0005548788.localdomain sudo[110846]: pam_unix(sudo:session): session closed for user root
Dec 06 09:23:13 np0005548788.localdomain sudo[110861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:23:13 np0005548788.localdomain sudo[110861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:23:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57643 DF PROTO=TCP SPT=55604 DPT=9100 SEQ=3451175953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6D5CF0000000001030307) 
Dec 06 09:23:13 np0005548788.localdomain sudo[110861]: pam_unix(sudo:session): session closed for user root
Dec 06 09:23:14 np0005548788.localdomain sshd[110844]: Received disconnect from 36.50.177.119 port 51600:11: Bye Bye [preauth]
Dec 06 09:23:14 np0005548788.localdomain sshd[110844]: Disconnected from authenticating user root 36.50.177.119 port 51600 [preauth]
Dec 06 09:23:14 np0005548788.localdomain sudo[110907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:23:14 np0005548788.localdomain sudo[110907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:23:14 np0005548788.localdomain sudo[110907]: pam_unix(sudo:session): session closed for user root
Dec 06 09:23:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57645 DF PROTO=TCP SPT=55604 DPT=9100 SEQ=3451175953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6E1F10000000001030307) 
Dec 06 09:23:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11075 DF PROTO=TCP SPT=58850 DPT=9882 SEQ=898459588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6ED7A0000000001030307) 
Dec 06 09:23:21 np0005548788.localdomain sshd[110528]: fatal: Timeout before authentication for 101.47.142.76 port 53586
Dec 06 09:23:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10879 DF PROTO=TCP SPT=59382 DPT=9102 SEQ=1692200968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D6F8F00000000001030307) 
Dec 06 09:23:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10880 DF PROTO=TCP SPT=59382 DPT=9102 SEQ=1692200968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D708B00000000001030307) 
Dec 06 09:23:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57647 DF PROTO=TCP SPT=55604 DPT=9100 SEQ=3451175953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D711F00000000001030307) 
Dec 06 09:23:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39099 DF PROTO=TCP SPT=56456 DPT=9101 SEQ=3621174100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D728DE0000000001030307) 
Dec 06 09:23:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10881 DF PROTO=TCP SPT=59382 DPT=9102 SEQ=1692200968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D729F10000000001030307) 
Dec 06 09:23:37 np0005548788.localdomain sshd[110840]: Connection closed by 45.78.219.195 port 44558 [preauth]
Dec 06 09:23:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39101 DF PROTO=TCP SPT=56456 DPT=9101 SEQ=3621174100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D734F00000000001030307) 
Dec 06 09:23:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64611 DF PROTO=TCP SPT=42426 DPT=9105 SEQ=3501573909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D73FF10000000001030307) 
Dec 06 09:23:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12314 DF PROTO=TCP SPT=56540 DPT=9100 SEQ=180662547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D74B000000000001030307) 
Dec 06 09:23:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12316 DF PROTO=TCP SPT=56540 DPT=9100 SEQ=180662547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D756F10000000001030307) 
Dec 06 09:23:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30103 DF PROTO=TCP SPT=37600 DPT=9882 SEQ=3335599930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D762AA0000000001030307) 
Dec 06 09:23:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9532 DF PROTO=TCP SPT=33524 DPT=9102 SEQ=2081811818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D76DF00000000001030307) 
Dec 06 09:23:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9533 DF PROTO=TCP SPT=33524 DPT=9102 SEQ=2081811818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D77DB00000000001030307) 
Dec 06 09:23:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12318 DF PROTO=TCP SPT=56540 DPT=9100 SEQ=180662547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D787F10000000001030307) 
Dec 06 09:24:00 np0005548788.localdomain sshd[110922]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9534 DF PROTO=TCP SPT=33524 DPT=9102 SEQ=2081811818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D79DF00000000001030307) 
Dec 06 09:24:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30107 DF PROTO=TCP SPT=37600 DPT=9882 SEQ=3335599930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D79DF10000000001030307) 
Dec 06 09:24:06 np0005548788.localdomain sshd[110923]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:06 np0005548788.localdomain sshd[110923]: Accepted publickey for zuul from 192.168.122.31 port 56148 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:24:06 np0005548788.localdomain systemd-logind[765]: New session 38 of user zuul.
Dec 06 09:24:06 np0005548788.localdomain systemd[1]: Started Session 38 of User zuul.
Dec 06 09:24:06 np0005548788.localdomain sshd[110923]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:24:06 np0005548788.localdomain sudo[111002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezajwvalbyhhdosopkyzwyrtjrwbkyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013046.4710522-563-48951950121269/AnsiballZ_file.py
Dec 06 09:24:06 np0005548788.localdomain sudo[111002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:06 np0005548788.localdomain python3.9[111004]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:06 np0005548788.localdomain sudo[111002]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:07 np0005548788.localdomain sudo[111094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlaqummwuqqoiwfkohlyqtavfugahijy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013047.0714276-563-3643990470661/AnsiballZ_file.py
Dec 06 09:24:07 np0005548788.localdomain sudo[111094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:07 np0005548788.localdomain python3.9[111096]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:07 np0005548788.localdomain sudo[111094]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:07 np0005548788.localdomain sudo[111186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiwemthgexsibkwitikonzlgnkperpom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013047.6472614-563-210250056815316/AnsiballZ_file.py
Dec 06 09:24:07 np0005548788.localdomain sudo[111186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49000 DF PROTO=TCP SPT=58776 DPT=9101 SEQ=3542878979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D7AA300000000001030307) 
Dec 06 09:24:08 np0005548788.localdomain python3.9[111188]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:08 np0005548788.localdomain sudo[111186]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:08 np0005548788.localdomain sudo[111278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfkdxpsbngvbjuxgiaqmpyuathfzwftk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013048.242388-563-130407276427572/AnsiballZ_file.py
Dec 06 09:24:08 np0005548788.localdomain sudo[111278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:08 np0005548788.localdomain python3.9[111280]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:08 np0005548788.localdomain sudo[111278]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:09 np0005548788.localdomain sudo[111370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydhmpikonhhemxrqswdtjkodnrtwftrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013048.826171-563-24845069702037/AnsiballZ_file.py
Dec 06 09:24:09 np0005548788.localdomain sudo[111370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:09 np0005548788.localdomain python3.9[111372]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:09 np0005548788.localdomain sudo[111370]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:09 np0005548788.localdomain sudo[111462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejguijzrmtukqnjjckckoalymezlbvjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013049.3385642-563-261921960163148/AnsiballZ_file.py
Dec 06 09:24:09 np0005548788.localdomain sudo[111462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:09 np0005548788.localdomain python3.9[111464]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:09 np0005548788.localdomain sudo[111462]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:10 np0005548788.localdomain sudo[111554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otlfddinmftsnckavglorlsxdrmmnxgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013049.8986053-563-220311731755599/AnsiballZ_file.py
Dec 06 09:24:10 np0005548788.localdomain sudo[111554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:10 np0005548788.localdomain python3.9[111556]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:10 np0005548788.localdomain sudo[111554]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:10 np0005548788.localdomain sudo[111646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prtgvbhnnzfkeaslooobzzdomlkbwcbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013050.4413419-563-244637860735629/AnsiballZ_file.py
Dec 06 09:24:10 np0005548788.localdomain sudo[111646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:10 np0005548788.localdomain python3.9[111648]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:10 np0005548788.localdomain sudo[111646]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:11 np0005548788.localdomain sudo[111738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgkmbysanuygyoztrshqluifsltgcasg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013051.035937-563-116174265244746/AnsiballZ_file.py
Dec 06 09:24:11 np0005548788.localdomain sudo[111738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:11 np0005548788.localdomain python3.9[111740]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:11 np0005548788.localdomain sudo[111738]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:11 np0005548788.localdomain sudo[111830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvlwpqxfgqanehdozazxbjpuqmodnybz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013051.6178222-563-72407125954336/AnsiballZ_file.py
Dec 06 09:24:11 np0005548788.localdomain sudo[111830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:11 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35683 DF PROTO=TCP SPT=48340 DPT=9105 SEQ=832419676 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D7B9F10000000001030307) 
Dec 06 09:24:12 np0005548788.localdomain python3.9[111832]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:12 np0005548788.localdomain sudo[111830]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:12 np0005548788.localdomain sudo[111922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kawjesknfrefhgvsmdtzsdcccflwntmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013052.2063088-563-127233891266059/AnsiballZ_file.py
Dec 06 09:24:12 np0005548788.localdomain sudo[111922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:12 np0005548788.localdomain python3.9[111924]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:12 np0005548788.localdomain sudo[111922]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:13 np0005548788.localdomain sudo[112014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvrfrvzbtbdwchtjmafxjzrggrqwdduj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013052.818963-563-238940353491541/AnsiballZ_file.py
Dec 06 09:24:13 np0005548788.localdomain sudo[112014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:13 np0005548788.localdomain python3.9[112016]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:13 np0005548788.localdomain sudo[112014]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59739 DF PROTO=TCP SPT=53528 DPT=9100 SEQ=1249775720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D7C0300000000001030307) 
Dec 06 09:24:13 np0005548788.localdomain sudo[112106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agrajftlehlumsknxjyddbgqslnxtirz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013053.4152973-563-251922407832305/AnsiballZ_file.py
Dec 06 09:24:13 np0005548788.localdomain sudo[112106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:13 np0005548788.localdomain python3.9[112108]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:13 np0005548788.localdomain sudo[112106]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:14 np0005548788.localdomain sudo[112198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaumqkucuzjdyoiheirwyncwngqfjrms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013054.0295944-563-181073988303486/AnsiballZ_file.py
Dec 06 09:24:14 np0005548788.localdomain sudo[112198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:14 np0005548788.localdomain sudo[112201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:24:14 np0005548788.localdomain python3.9[112200]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:14 np0005548788.localdomain sudo[112201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:14 np0005548788.localdomain sudo[112201]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:14 np0005548788.localdomain sudo[112198]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:14 np0005548788.localdomain sudo[112216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:24:14 np0005548788.localdomain sudo[112216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:14 np0005548788.localdomain sudo[112320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpkbmaimoszsdvwfjwldvjfgomvwisbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013054.6781075-563-138266871432338/AnsiballZ_file.py
Dec 06 09:24:14 np0005548788.localdomain sudo[112320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:15 np0005548788.localdomain python3.9[112323]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:15 np0005548788.localdomain sudo[112320]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:15 np0005548788.localdomain podman[112453]: 2025-12-06 09:24:15.482516156 +0000 UTC m=+0.100335996 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.openshift.expose-services=, release=1763362218, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, RELEASE=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, GIT_CLEAN=True)
Dec 06 09:24:15 np0005548788.localdomain sudo[112502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usvumzyghcbftzdltmqzwlvthnsymtbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013055.2779677-563-37286339991070/AnsiballZ_file.py
Dec 06 09:24:15 np0005548788.localdomain sudo[112502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:15 np0005548788.localdomain podman[112453]: 2025-12-06 09:24:15.612578213 +0000 UTC m=+0.230397983 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main)
Dec 06 09:24:15 np0005548788.localdomain python3.9[112504]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:15 np0005548788.localdomain sudo[112502]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:15 np0005548788.localdomain sudo[112216]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:16 np0005548788.localdomain sudo[112614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:24:16 np0005548788.localdomain sudo[112614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:16 np0005548788.localdomain sudo[112614]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:16 np0005548788.localdomain sudo[112669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljocnbqkhzpukmlspepyyuiwlmrauoal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013055.8384352-563-165522614453402/AnsiballZ_file.py
Dec 06 09:24:16 np0005548788.localdomain sudo[112669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:16 np0005548788.localdomain sudo[112648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:24:16 np0005548788.localdomain sudo[112648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:16 np0005548788.localdomain python3.9[112674]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:16 np0005548788.localdomain sudo[112669]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59741 DF PROTO=TCP SPT=53528 DPT=9100 SEQ=1249775720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D7CC300000000001030307) 
Dec 06 09:24:16 np0005548788.localdomain sudo[112785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgarmtldbjltdbuegssrsoypswrzxszj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013056.3947825-563-275976744583982/AnsiballZ_file.py
Dec 06 09:24:16 np0005548788.localdomain sudo[112785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:16 np0005548788.localdomain sudo[112648]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:16 np0005548788.localdomain python3.9[112792]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:16 np0005548788.localdomain sudo[112785]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:17 np0005548788.localdomain sudo[112814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:24:17 np0005548788.localdomain sudo[112814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:17 np0005548788.localdomain sudo[112814]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:17 np0005548788.localdomain sudo[112904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwnltuoawfbycygfmmsvtjoytggptxwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013057.0669458-563-279220312615797/AnsiballZ_file.py
Dec 06 09:24:17 np0005548788.localdomain sudo[112904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:17 np0005548788.localdomain python3.9[112906]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:17 np0005548788.localdomain sudo[112904]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:17 np0005548788.localdomain sudo[112996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msrcwqyvzmgveapfhvqnmseajzulhcxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013057.6941323-563-176462984348697/AnsiballZ_file.py
Dec 06 09:24:17 np0005548788.localdomain sudo[112996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:18 np0005548788.localdomain python3.9[112998]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:18 np0005548788.localdomain sudo[112996]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:18 np0005548788.localdomain sudo[113088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phjsmstbvcoojxaawyzafzoceymbouef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013058.2766867-563-35234123724579/AnsiballZ_file.py
Dec 06 09:24:18 np0005548788.localdomain sudo[113088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:18 np0005548788.localdomain python3.9[113090]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:18 np0005548788.localdomain sudo[113088]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:19 np0005548788.localdomain sudo[113180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwulkxitipdufnbbmgfvwolavihztyet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013058.9797273-1013-87735226737665/AnsiballZ_file.py
Dec 06 09:24:19 np0005548788.localdomain sudo[113180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:19 np0005548788.localdomain python3.9[113182]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:19 np0005548788.localdomain sudo[113180]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13464 DF PROTO=TCP SPT=33784 DPT=9882 SEQ=3849620430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D7D7DA0000000001030307) 
Dec 06 09:24:19 np0005548788.localdomain sudo[113272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztcuzbyeikdtwcdpyhfzkzniyyjtkito ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013059.5833058-1013-61207078513247/AnsiballZ_file.py
Dec 06 09:24:19 np0005548788.localdomain sudo[113272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:19 np0005548788.localdomain sshd[113275]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:20 np0005548788.localdomain python3.9[113274]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:20 np0005548788.localdomain sudo[113272]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:20 np0005548788.localdomain sudo[113366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exlyynksoalbcdywmsiyqubqfngkjfou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013060.195016-1013-163523364018133/AnsiballZ_file.py
Dec 06 09:24:20 np0005548788.localdomain sudo[113366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:20 np0005548788.localdomain sshd[113275]: Received disconnect from 148.227.3.232 port 46152:11: Bye Bye [preauth]
Dec 06 09:24:20 np0005548788.localdomain sshd[113275]: Disconnected from authenticating user root 148.227.3.232 port 46152 [preauth]
Dec 06 09:24:20 np0005548788.localdomain python3.9[113368]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:20 np0005548788.localdomain sudo[113366]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:21 np0005548788.localdomain sudo[113458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmdswenlojsyshwdlomwyvkpnrrdvcai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013060.7866971-1013-150293868623695/AnsiballZ_file.py
Dec 06 09:24:21 np0005548788.localdomain sudo[113458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:21 np0005548788.localdomain python3.9[113460]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:21 np0005548788.localdomain sudo[113458]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:21 np0005548788.localdomain sudo[113550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqmevcxtdylvctcudjjnhuddwimjqkqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013061.4017763-1013-248773769998324/AnsiballZ_file.py
Dec 06 09:24:21 np0005548788.localdomain sudo[113550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:21 np0005548788.localdomain python3.9[113552]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:21 np0005548788.localdomain sudo[113550]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:22 np0005548788.localdomain sudo[113642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtjugbunwlrfhyvvubnatypuamfqxctz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013061.9973593-1013-262283652439238/AnsiballZ_file.py
Dec 06 09:24:22 np0005548788.localdomain sudo[113642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:22 np0005548788.localdomain python3.9[113644]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:22 np0005548788.localdomain sudo[113642]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50260 DF PROTO=TCP SPT=60148 DPT=9102 SEQ=2863349712 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D7E3310000000001030307) 
Dec 06 09:24:22 np0005548788.localdomain sudo[113734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozotvaajebyuwowfqwbjehjsgrqyqkcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013062.5943332-1013-140785296770652/AnsiballZ_file.py
Dec 06 09:24:22 np0005548788.localdomain sudo[113734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:23 np0005548788.localdomain python3.9[113736]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:23 np0005548788.localdomain sudo[113734]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:23 np0005548788.localdomain sudo[113826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyfcyozqathnqqgfwexiecjzcefebids ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013063.376436-1013-169767730653827/AnsiballZ_file.py
Dec 06 09:24:23 np0005548788.localdomain sudo[113826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:23 np0005548788.localdomain python3.9[113828]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:23 np0005548788.localdomain sudo[113826]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:24 np0005548788.localdomain sudo[113918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkevqonhkaegesdajufqsfeopevqrjak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013063.9118738-1013-34351520924660/AnsiballZ_file.py
Dec 06 09:24:24 np0005548788.localdomain sudo[113918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:24 np0005548788.localdomain python3.9[113920]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:24 np0005548788.localdomain sudo[113918]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:24 np0005548788.localdomain sudo[114010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeeadtfzctnfswtukaunzuogoiiotkjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013064.4926555-1013-209955096072454/AnsiballZ_file.py
Dec 06 09:24:24 np0005548788.localdomain sudo[114010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:24 np0005548788.localdomain python3.9[114012]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:25 np0005548788.localdomain sudo[114010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:25 np0005548788.localdomain sudo[114102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeziehipsxqlgcgdqazeghunjzpuhxbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013065.121921-1013-229526703039122/AnsiballZ_file.py
Dec 06 09:24:25 np0005548788.localdomain sudo[114102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:25 np0005548788.localdomain python3.9[114104]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:25 np0005548788.localdomain sudo[114102]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:26 np0005548788.localdomain sudo[114194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfhcwhssxsshcsxjglnpxnkdevhyebdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013065.7431912-1013-27534257448156/AnsiballZ_file.py
Dec 06 09:24:26 np0005548788.localdomain sudo[114194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:26 np0005548788.localdomain python3.9[114196]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:26 np0005548788.localdomain sudo[114194]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50261 DF PROTO=TCP SPT=60148 DPT=9102 SEQ=2863349712 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D7F2F00000000001030307) 
Dec 06 09:24:26 np0005548788.localdomain sudo[114286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmxgrwjjebvwdqorajtxhcindfpwembk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013066.3479028-1013-131860278116545/AnsiballZ_file.py
Dec 06 09:24:26 np0005548788.localdomain sudo[114286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:26 np0005548788.localdomain python3.9[114288]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:26 np0005548788.localdomain sudo[114286]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:27 np0005548788.localdomain sudo[114378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evpakixttmesdmefdvnxjkgmjpcpziec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013066.9205823-1013-121623711798320/AnsiballZ_file.py
Dec 06 09:24:27 np0005548788.localdomain sudo[114378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:27 np0005548788.localdomain python3.9[114380]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:27 np0005548788.localdomain sudo[114378]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:27 np0005548788.localdomain sudo[114470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vngjwckbbkymsdbtlyycyugwkgnpynoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013067.543314-1013-21409260840416/AnsiballZ_file.py
Dec 06 09:24:27 np0005548788.localdomain sudo[114470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:27 np0005548788.localdomain python3.9[114472]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:27 np0005548788.localdomain sudo[114470]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:28 np0005548788.localdomain sudo[114562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzohrpadsrlahtcbeltkjavwbutehmfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013068.1197562-1013-155809345205283/AnsiballZ_file.py
Dec 06 09:24:28 np0005548788.localdomain sudo[114562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:28 np0005548788.localdomain python3.9[114564]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:28 np0005548788.localdomain sudo[114562]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59743 DF PROTO=TCP SPT=53528 DPT=9100 SEQ=1249775720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D7FBF00000000001030307) 
Dec 06 09:24:28 np0005548788.localdomain sudo[114654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjmgddkyzwggqupdzhbcxgxnjsrimwxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013068.7144072-1013-117840362025784/AnsiballZ_file.py
Dec 06 09:24:28 np0005548788.localdomain sudo[114654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:29 np0005548788.localdomain python3.9[114656]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:29 np0005548788.localdomain sudo[114654]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:29 np0005548788.localdomain sudo[114746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utqdcdgkfjtxjfxrdmnoejloacbkthtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013069.682835-1013-51846125493981/AnsiballZ_file.py
Dec 06 09:24:29 np0005548788.localdomain sudo[114746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:30 np0005548788.localdomain python3.9[114748]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:30 np0005548788.localdomain sudo[114746]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:30 np0005548788.localdomain sudo[114838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxdsrertgyjsepszdoshqiygapbhwijv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013070.1916585-1013-270124341295159/AnsiballZ_file.py
Dec 06 09:24:30 np0005548788.localdomain sudo[114838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:30 np0005548788.localdomain python3.9[114840]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:30 np0005548788.localdomain sudo[114838]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:31 np0005548788.localdomain sshd[114841]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:31 np0005548788.localdomain sudo[114932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obxqdowuvdphldhnymouomjloboexigi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013071.3958974-1013-196873978891499/AnsiballZ_file.py
Dec 06 09:24:31 np0005548788.localdomain sudo[114932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:31 np0005548788.localdomain python3.9[114934]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:31 np0005548788.localdomain sudo[114932]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:32 np0005548788.localdomain sudo[115024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnqpwntmpkirdfcqihpkqbbljpgaoyoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013071.9657896-1013-170687200693836/AnsiballZ_file.py
Dec 06 09:24:32 np0005548788.localdomain sudo[115024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:32 np0005548788.localdomain python3.9[115026]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:32 np0005548788.localdomain sudo[115024]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:33 np0005548788.localdomain sshd[114841]: Received disconnect from 36.50.177.119 port 59406:11: Bye Bye [preauth]
Dec 06 09:24:33 np0005548788.localdomain sshd[114841]: Disconnected from authenticating user root 36.50.177.119 port 59406 [preauth]
Dec 06 09:24:33 np0005548788.localdomain sudo[115116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smeztsxndesgczjjttvbgvmgzbdttnlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013072.8689542-1460-59573201064509/AnsiballZ_command.py
Dec 06 09:24:33 np0005548788.localdomain sudo[115116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:33 np0005548788.localdomain python3.9[115118]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:33 np0005548788.localdomain sudo[115116]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:34 np0005548788.localdomain python3.9[115210]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:24:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6305 DF PROTO=TCP SPT=58956 DPT=9101 SEQ=2174550856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D813400000000001030307) 
Dec 06 09:24:34 np0005548788.localdomain sudo[115300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evoivxkutqwqpqopyhylxdphbopgepiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013074.4945903-1514-254101422050303/AnsiballZ_systemd_service.py
Dec 06 09:24:34 np0005548788.localdomain sudo[115300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13468 DF PROTO=TCP SPT=33784 DPT=9882 SEQ=3849620430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D813F00000000001030307) 
Dec 06 09:24:35 np0005548788.localdomain python3.9[115302]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:24:35 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:24:35 np0005548788.localdomain systemd-rc-local-generator[115323]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:24:35 np0005548788.localdomain systemd-sysv-generator[115329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:24:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:24:37 np0005548788.localdomain sudo[115300]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:37 np0005548788.localdomain sudo[115427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkgefeuskkoiykwwfbdbmrieybsvxxjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013077.213517-1538-147407389542889/AnsiballZ_command.py
Dec 06 09:24:37 np0005548788.localdomain sudo[115427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:37 np0005548788.localdomain python3.9[115429]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:37 np0005548788.localdomain sudo[115427]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6307 DF PROTO=TCP SPT=58956 DPT=9101 SEQ=2174550856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D81F310000000001030307) 
Dec 06 09:24:38 np0005548788.localdomain sudo[115520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhdqfqduptuuweuvbxndswwoxisptbmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013077.7635756-1538-110234915267369/AnsiballZ_command.py
Dec 06 09:24:38 np0005548788.localdomain sudo[115520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:38 np0005548788.localdomain python3.9[115522]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:38 np0005548788.localdomain sudo[115520]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:38 np0005548788.localdomain sudo[115613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycducdyhxfeefwzjgavhtsmxatlpreap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013078.3716009-1538-132251826834891/AnsiballZ_command.py
Dec 06 09:24:38 np0005548788.localdomain sudo[115613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:38 np0005548788.localdomain python3.9[115615]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:38 np0005548788.localdomain sudo[115613]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:39 np0005548788.localdomain sudo[115706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvqgqcbcpffcighaknkowynndfmajoix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013078.9763834-1538-63895120862936/AnsiballZ_command.py
Dec 06 09:24:39 np0005548788.localdomain sudo[115706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:39 np0005548788.localdomain python3.9[115708]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:39 np0005548788.localdomain sudo[115706]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:39 np0005548788.localdomain sudo[115799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfuylvxmwyomkvosjvaydnqmlxocmrho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013079.555731-1538-98219571105963/AnsiballZ_command.py
Dec 06 09:24:39 np0005548788.localdomain sudo[115799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:40 np0005548788.localdomain python3.9[115801]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35686 DF PROTO=TCP SPT=48340 DPT=9105 SEQ=832419676 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D829F00000000001030307) 
Dec 06 09:24:41 np0005548788.localdomain sudo[115799]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:41 np0005548788.localdomain sudo[115892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvxtwjgophvustbakmccsepcnhjmktgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013081.184438-1538-31015385313120/AnsiballZ_command.py
Dec 06 09:24:41 np0005548788.localdomain sudo[115892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:41 np0005548788.localdomain python3.9[115894]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:41 np0005548788.localdomain sudo[115892]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:42 np0005548788.localdomain sudo[115985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukawrxgcqnvbcozntrpwdkmnlbexiokb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013081.7669291-1538-87445381011827/AnsiballZ_command.py
Dec 06 09:24:42 np0005548788.localdomain sudo[115985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:42 np0005548788.localdomain python3.9[115987]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:42 np0005548788.localdomain sudo[115985]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:42 np0005548788.localdomain sudo[116078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgztjjvimizqjvhgmxbpynzheumubvtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013082.425282-1538-20788885981715/AnsiballZ_command.py
Dec 06 09:24:42 np0005548788.localdomain sudo[116078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:42 np0005548788.localdomain python3.9[116080]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:42 np0005548788.localdomain sudo[116078]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:43 np0005548788.localdomain sudo[116171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntadatpungpudwbzbautkiyhxqbfmwdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013082.997893-1538-153059284968957/AnsiballZ_command.py
Dec 06 09:24:43 np0005548788.localdomain sudo[116171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:43 np0005548788.localdomain python3.9[116173]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:43 np0005548788.localdomain sudo[116171]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34188 DF PROTO=TCP SPT=47096 DPT=9100 SEQ=3355517867 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8355F0000000001030307) 
Dec 06 09:24:43 np0005548788.localdomain sudo[116264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytthonwqxcfvfdyigghcpgefjznsgvtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013083.5821888-1538-78714437727924/AnsiballZ_command.py
Dec 06 09:24:43 np0005548788.localdomain sudo[116264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:44 np0005548788.localdomain python3.9[116266]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:44 np0005548788.localdomain sudo[116264]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:44 np0005548788.localdomain sudo[116357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqzfdimvsrkkengxpuczylhenrdbwqzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013084.1810837-1538-2679894525555/AnsiballZ_command.py
Dec 06 09:24:44 np0005548788.localdomain sudo[116357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:44 np0005548788.localdomain python3.9[116359]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:44 np0005548788.localdomain sudo[116357]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:45 np0005548788.localdomain sudo[116450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntygwxjdftauzaoaucrejmhtkskbnyuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013084.7832851-1538-64511539330349/AnsiballZ_command.py
Dec 06 09:24:45 np0005548788.localdomain sudo[116450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:45 np0005548788.localdomain python3.9[116452]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:45 np0005548788.localdomain sudo[116450]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:45 np0005548788.localdomain sudo[116543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biqyqmxnjcvhwniuhvisrffrahdkwdqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013085.322015-1538-240935897660399/AnsiballZ_command.py
Dec 06 09:24:45 np0005548788.localdomain sudo[116543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:45 np0005548788.localdomain python3.9[116545]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:45 np0005548788.localdomain sudo[116543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:46 np0005548788.localdomain sshd[116587]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:46 np0005548788.localdomain sudo[116638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzakbupiervbbguunjhpvkpvvqemcfdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013085.9892683-1538-108434751037217/AnsiballZ_command.py
Dec 06 09:24:46 np0005548788.localdomain sudo[116638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:46 np0005548788.localdomain python3.9[116640]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:46 np0005548788.localdomain sudo[116638]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34190 DF PROTO=TCP SPT=47096 DPT=9100 SEQ=3355517867 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D841700000000001030307) 
Dec 06 09:24:46 np0005548788.localdomain sudo[116731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elyppnshkkhsamcocknydhjofezkkqxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013086.6074593-1538-201677981808713/AnsiballZ_command.py
Dec 06 09:24:46 np0005548788.localdomain sudo[116731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:47 np0005548788.localdomain python3.9[116733]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:47 np0005548788.localdomain sudo[116731]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:47 np0005548788.localdomain sudo[116824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyhsbhwouurcjxrasxqqrptveyqqyfzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013087.2177174-1538-219083690689788/AnsiballZ_command.py
Dec 06 09:24:47 np0005548788.localdomain sudo[116824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:47 np0005548788.localdomain python3.9[116826]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:47 np0005548788.localdomain sudo[116824]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:47 np0005548788.localdomain sshd[116587]: Received disconnect from 45.119.84.54 port 54982:11: Bye Bye [preauth]
Dec 06 09:24:47 np0005548788.localdomain sshd[116587]: Disconnected from authenticating user root 45.119.84.54 port 54982 [preauth]
Dec 06 09:24:48 np0005548788.localdomain sudo[116917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngxeqnxamhtvbkfsitqhyiargcrnkwxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013087.784-1538-296545336576/AnsiballZ_command.py
Dec 06 09:24:48 np0005548788.localdomain sudo[116917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:48 np0005548788.localdomain python3.9[116919]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:48 np0005548788.localdomain sudo[116917]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:48 np0005548788.localdomain sudo[117010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvnbtvxghplapzeqiigwgcnppnoduphu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013088.360368-1538-129808797162921/AnsiballZ_command.py
Dec 06 09:24:48 np0005548788.localdomain sudo[117010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:48 np0005548788.localdomain python3.9[117012]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:48 np0005548788.localdomain sudo[117010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:49 np0005548788.localdomain sudo[117103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raorbicupztrrezptowndvjvwmxwmhou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013088.9344149-1538-124663975448655/AnsiballZ_command.py
Dec 06 09:24:49 np0005548788.localdomain sudo[117103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:49 np0005548788.localdomain python3.9[117105]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:49 np0005548788.localdomain sudo[117103]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64489 DF PROTO=TCP SPT=48450 DPT=9882 SEQ=3100416934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D84D0A0000000001030307) 
Dec 06 09:24:49 np0005548788.localdomain sudo[117196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iksxamiksllcdjxkrbzdwoizxlhueyln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013089.500231-1538-216382764567935/AnsiballZ_command.py
Dec 06 09:24:49 np0005548788.localdomain sudo[117196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:49 np0005548788.localdomain python3.9[117198]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:49 np0005548788.localdomain sudo[117196]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:50 np0005548788.localdomain sudo[117289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzlqfeaiucunymuqcncxwkaankqsxplc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013090.1119492-1538-231179183524857/AnsiballZ_command.py
Dec 06 09:24:50 np0005548788.localdomain sudo[117289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:50 np0005548788.localdomain python3.9[117291]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:50 np0005548788.localdomain sudo[117289]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:50 np0005548788.localdomain sshd[110923]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:24:50 np0005548788.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Dec 06 09:24:50 np0005548788.localdomain systemd[1]: session-38.scope: Consumed 31.149s CPU time.
Dec 06 09:24:50 np0005548788.localdomain systemd-logind[765]: Session 38 logged out. Waiting for processes to exit.
Dec 06 09:24:50 np0005548788.localdomain systemd-logind[765]: Removed session 38.
Dec 06 09:24:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28868 DF PROTO=TCP SPT=44220 DPT=9102 SEQ=3153299255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D858700000000001030307) 
Dec 06 09:24:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28869 DF PROTO=TCP SPT=44220 DPT=9102 SEQ=3153299255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D868300000000001030307) 
Dec 06 09:24:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34192 DF PROTO=TCP SPT=47096 DPT=9100 SEQ=3355517867 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D871F00000000001030307) 
Dec 06 09:25:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28870 DF PROTO=TCP SPT=44220 DPT=9102 SEQ=3153299255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D887F00000000001030307) 
Dec 06 09:25:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59434 DF PROTO=TCP SPT=41540 DPT=9101 SEQ=1230804529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8886E0000000001030307) 
Dec 06 09:25:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59436 DF PROTO=TCP SPT=41540 DPT=9101 SEQ=1230804529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D894700000000001030307) 
Dec 06 09:25:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50546 DF PROTO=TCP SPT=56002 DPT=9105 SEQ=2363774649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D89FF10000000001030307) 
Dec 06 09:25:11 np0005548788.localdomain sshd[117307]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:11 np0005548788.localdomain sshd[117307]: Accepted publickey for zuul from 192.168.122.30 port 48906 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:25:11 np0005548788.localdomain systemd-logind[765]: New session 39 of user zuul.
Dec 06 09:25:11 np0005548788.localdomain systemd[1]: Started Session 39 of User zuul.
Dec 06 09:25:11 np0005548788.localdomain sshd[117307]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:25:12 np0005548788.localdomain python3.9[117400]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 06 09:25:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6747 DF PROTO=TCP SPT=37028 DPT=9100 SEQ=2786532120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8AA8F0000000001030307) 
Dec 06 09:25:13 np0005548788.localdomain python3.9[117504]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:25:14 np0005548788.localdomain sudo[117594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzcqwvimuounaaarkwswkpigszduqozs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013113.952791-93-273003049840458/AnsiballZ_command.py
Dec 06 09:25:14 np0005548788.localdomain sudo[117594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:14 np0005548788.localdomain python3.9[117596]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:25:14 np0005548788.localdomain sudo[117594]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:15 np0005548788.localdomain sudo[117687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sssstijxfzihgheghlnmkvglhblfpybg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013114.878458-129-196117933838125/AnsiballZ_stat.py
Dec 06 09:25:15 np0005548788.localdomain sudo[117687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:15 np0005548788.localdomain python3.9[117689]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:25:15 np0005548788.localdomain sudo[117687]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:16 np0005548788.localdomain sudo[117779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzlaybbncjydolitkbndvdjlqwrfttyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013115.6461046-153-4771881931168/AnsiballZ_file.py
Dec 06 09:25:16 np0005548788.localdomain sudo[117779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:16 np0005548788.localdomain python3.9[117781]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:25:16 np0005548788.localdomain sudo[117779]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6749 DF PROTO=TCP SPT=37028 DPT=9100 SEQ=2786532120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8B6B00000000001030307) 
Dec 06 09:25:16 np0005548788.localdomain sudo[117871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqzllwmptyyqjpongtqcaakbrhmscgdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013116.4118974-177-203407261267335/AnsiballZ_stat.py
Dec 06 09:25:16 np0005548788.localdomain sudo[117871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:17 np0005548788.localdomain python3.9[117873]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:25:17 np0005548788.localdomain sudo[117871]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:17 np0005548788.localdomain sudo[117901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:25:17 np0005548788.localdomain sudo[117901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:25:17 np0005548788.localdomain sudo[117901]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:17 np0005548788.localdomain sudo[117916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:25:17 np0005548788.localdomain sudo[117916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:25:17 np0005548788.localdomain sudo[117974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilihnamtcjmwiyuwcnammgfjodfqkodd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013116.4118974-177-203407261267335/AnsiballZ_copy.py
Dec 06 09:25:17 np0005548788.localdomain sudo[117974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:17 np0005548788.localdomain python3.9[117976]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013116.4118974-177-203407261267335/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:25:17 np0005548788.localdomain sudo[117974]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:18 np0005548788.localdomain sudo[117916]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:18 np0005548788.localdomain sudo[118097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swbgghnfotmpegbsqlgwgvtdikdsokrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013118.015208-222-13810763098033/AnsiballZ_setup.py
Dec 06 09:25:18 np0005548788.localdomain sudo[118097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:18 np0005548788.localdomain python3.9[118099]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:25:18 np0005548788.localdomain sudo[118097]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:18 np0005548788.localdomain sudo[118104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:25:18 np0005548788.localdomain sudo[118104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:25:18 np0005548788.localdomain sudo[118104]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:19 np0005548788.localdomain sudo[118208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvzlpgjoxwyyunmacpwtlfsbcixeiwef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013119.046799-246-52861695266096/AnsiballZ_file.py
Dec 06 09:25:19 np0005548788.localdomain sudo[118208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:19 np0005548788.localdomain sshd[118211]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:19 np0005548788.localdomain python3.9[118210]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:25:19 np0005548788.localdomain sudo[118208]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57661 DF PROTO=TCP SPT=60398 DPT=9882 SEQ=2396478009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8C23A0000000001030307) 
Dec 06 09:25:19 np0005548788.localdomain sudo[118302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urfugjxsxjzboraasdtnhqncwexwwtpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013119.7145805-273-50478174308087/AnsiballZ_file.py
Dec 06 09:25:19 np0005548788.localdomain sudo[118302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:20 np0005548788.localdomain python3.9[118304]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:25:20 np0005548788.localdomain sudo[118302]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:21 np0005548788.localdomain python3.9[118394]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:25:21 np0005548788.localdomain network[118411]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:25:21 np0005548788.localdomain network[118412]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:25:21 np0005548788.localdomain network[118413]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:25:22 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:25:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49797 DF PROTO=TCP SPT=41460 DPT=9102 SEQ=2773196409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8CDB10000000001030307) 
Dec 06 09:25:25 np0005548788.localdomain python3.9[118611]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:25:25 np0005548788.localdomain python3.9[118701]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:25:26 np0005548788.localdomain sshd[118211]: Connection closed by 45.78.219.195 port 48472 [preauth]
Dec 06 09:25:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49798 DF PROTO=TCP SPT=41460 DPT=9102 SEQ=2773196409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8DD710000000001030307) 
Dec 06 09:25:26 np0005548788.localdomain sudo[118795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdyqbhkpozhazbxuusuyydnxflpmgfqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013126.3629987-375-24154586965731/AnsiballZ_command.py
Dec 06 09:25:26 np0005548788.localdomain sudo[118795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:26 np0005548788.localdomain python3.9[118797]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required with FIPS enabled until trunk.rdoproject.org
                                                            # is no longer served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it pulls in a newer openssl that is
                                                            # not compatible with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            # FIXME: perform dnf upgrade for other packages in EDPM ansible;
                                                            # here we only ensure that decontainerized libvirt can start
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:25:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6751 DF PROTO=TCP SPT=37028 DPT=9100 SEQ=2786532120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8E5F10000000001030307) 
Dec 06 09:25:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51406 DF PROTO=TCP SPT=50842 DPT=9101 SEQ=4099493542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8FD9E0000000001030307) 
Dec 06 09:25:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49799 DF PROTO=TCP SPT=41460 DPT=9102 SEQ=2773196409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D8FDF00000000001030307) 
Dec 06 09:25:36 np0005548788.localdomain sshd[45697]: Received signal 15; terminating.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: sshd.service: Unit process 110922 (sshd) remains running after unit stopped.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: sshd.service: Consumed 7.650s CPU time, no IO.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 06 09:25:36 np0005548788.localdomain sshd[118840]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:36 np0005548788.localdomain sshd[118840]: Server listening on 0.0.0.0 port 22.
Dec 06 09:25:36 np0005548788.localdomain sshd[118840]: Server listening on :: port 22.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:25:36 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:25:37 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:25:37 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:25:37 np0005548788.localdomain systemd[1]: run-radeb7608392142859597aa0502f8b580.service: Deactivated successfully.
Dec 06 09:25:37 np0005548788.localdomain systemd[1]: run-r217537ae027140ffa528be5a5b8d273e.service: Deactivated successfully.
Dec 06 09:25:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51408 DF PROTO=TCP SPT=50842 DPT=9101 SEQ=4099493542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D909B00000000001030307) 
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 06 09:25:38 np0005548788.localdomain sshd[118840]: Received signal 15; terminating.
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: sshd.service: Unit process 110922 (sshd) remains running after unit stopped.
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 06 09:25:38 np0005548788.localdomain sshd[119013]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:38 np0005548788.localdomain sshd[119013]: Server listening on 0.0.0.0 port 22.
Dec 06 09:25:38 np0005548788.localdomain sshd[119013]: Server listening on :: port 22.
Dec 06 09:25:38 np0005548788.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 06 09:25:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28374 DF PROTO=TCP SPT=48664 DPT=9105 SEQ=3080310155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D913F10000000001030307) 
Dec 06 09:25:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52535 DF PROTO=TCP SPT=57294 DPT=9100 SEQ=2661614380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D91FBF0000000001030307) 
Dec 06 09:25:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52537 DF PROTO=TCP SPT=57294 DPT=9100 SEQ=2661614380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D92BB00000000001030307) 
Dec 06 09:25:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3142 DF PROTO=TCP SPT=39234 DPT=9882 SEQ=183942506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D937690000000001030307) 
Dec 06 09:25:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39730 DF PROTO=TCP SPT=55624 DPT=9102 SEQ=1405291185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D942B10000000001030307) 
Dec 06 09:25:52 np0005548788.localdomain sshd[119114]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:54 np0005548788.localdomain sshd[119114]: Received disconnect from 36.50.177.119 port 44310:11: Bye Bye [preauth]
Dec 06 09:25:54 np0005548788.localdomain sshd[119114]: Disconnected from authenticating user root 36.50.177.119 port 44310 [preauth]
Dec 06 09:25:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39731 DF PROTO=TCP SPT=55624 DPT=9102 SEQ=1405291185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D952700000000001030307) 
Dec 06 09:25:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52539 DF PROTO=TCP SPT=57294 DPT=9100 SEQ=2661614380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D95BF00000000001030307) 
Dec 06 09:26:00 np0005548788.localdomain sshd[110922]: fatal: Timeout before authentication for 101.47.142.76 port 37364
Dec 06 09:26:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39732 DF PROTO=TCP SPT=55624 DPT=9102 SEQ=1405291185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D971F00000000001030307) 
Dec 06 09:26:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9112 DF PROTO=TCP SPT=36482 DPT=9101 SEQ=650129405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D972CF0000000001030307) 
Dec 06 09:26:07 np0005548788.localdomain sshd[119151]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:26:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9114 DF PROTO=TCP SPT=36482 DPT=9101 SEQ=650129405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D97EF00000000001030307) 
Dec 06 09:26:09 np0005548788.localdomain sshd[119151]: Received disconnect from 45.119.84.54 port 56728:11: Bye Bye [preauth]
Dec 06 09:26:09 np0005548788.localdomain sshd[119151]: Disconnected from authenticating user root 45.119.84.54 port 56728 [preauth]
Dec 06 09:26:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12911 DF PROTO=TCP SPT=41972 DPT=9105 SEQ=1347129539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D989F10000000001030307) 
Dec 06 09:26:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2536 DF PROTO=TCP SPT=40324 DPT=9100 SEQ=3804361494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D994F00000000001030307) 
Dec 06 09:26:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2538 DF PROTO=TCP SPT=40324 DPT=9100 SEQ=3804361494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D9A0F00000000001030307) 
Dec 06 09:26:19 np0005548788.localdomain sudo[119154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:26:19 np0005548788.localdomain sudo[119154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:26:19 np0005548788.localdomain sudo[119154]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:19 np0005548788.localdomain sudo[119169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:26:19 np0005548788.localdomain sudo[119169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:26:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17565 DF PROTO=TCP SPT=41922 DPT=9882 SEQ=878954008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D9AC9A0000000001030307) 
Dec 06 09:26:19 np0005548788.localdomain sudo[119169]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:20 np0005548788.localdomain sudo[119216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:26:20 np0005548788.localdomain sudo[119216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:26:20 np0005548788.localdomain sudo[119216]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54987 DF PROTO=TCP SPT=54582 DPT=9102 SEQ=4117883098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D9B7F00000000001030307) 
Dec 06 09:26:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54988 DF PROTO=TCP SPT=54582 DPT=9102 SEQ=4117883098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D9C7B00000000001030307) 
Dec 06 09:26:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2540 DF PROTO=TCP SPT=40324 DPT=9100 SEQ=3804361494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D9D1F00000000001030307) 
Dec 06 09:26:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54989 DF PROTO=TCP SPT=54582 DPT=9102 SEQ=4117883098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D9E7F00000000001030307) 
Dec 06 09:26:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17569 DF PROTO=TCP SPT=41922 DPT=9882 SEQ=878954008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D9E7F00000000001030307) 
Dec 06 09:26:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26779 DF PROTO=TCP SPT=40024 DPT=9101 SEQ=2640721877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5D9F3F00000000001030307) 
Dec 06 09:26:41 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27762 DF PROTO=TCP SPT=46178 DPT=9105 SEQ=2719702939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA03B00000000001030307) 
Dec 06 09:26:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59026 DF PROTO=TCP SPT=41036 DPT=9100 SEQ=2489842493 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA0A200000000001030307) 
Dec 06 09:26:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59028 DF PROTO=TCP SPT=41036 DPT=9100 SEQ=2489842493 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA16310000000001030307) 
Dec 06 09:26:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60110 DF PROTO=TCP SPT=60806 DPT=9882 SEQ=1382884013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA21CA0000000001030307) 
Dec 06 09:26:51 np0005548788.localdomain kernel: SELinux:  Converting 2741 SID table entries...
Dec 06 09:26:51 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:26:51 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:26:51 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:26:51 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:26:51 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:26:51 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:26:51 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:26:51 np0005548788.localdomain sshd[119570]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:26:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65162 DF PROTO=TCP SPT=56142 DPT=9102 SEQ=666720505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA2D300000000001030307) 
Dec 06 09:26:52 np0005548788.localdomain sudo[118795]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:52 np0005548788.localdomain sshd[119570]: Received disconnect from 148.227.3.232 port 49576:11: Bye Bye [preauth]
Dec 06 09:26:52 np0005548788.localdomain sshd[119570]: Disconnected from authenticating user root 148.227.3.232 port 49576 [preauth]
Dec 06 09:26:53 np0005548788.localdomain sudo[119662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmfbgswibgoootofjoemdcokbiwknvsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013213.00594-402-83152972004161/AnsiballZ_file.py
Dec 06 09:26:53 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Dec 06 09:26:53 np0005548788.localdomain sudo[119662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:53 np0005548788.localdomain python3.9[119664]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:26:53 np0005548788.localdomain sudo[119662]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:53 np0005548788.localdomain sudo[119754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcvmhwlzvvcoybiqlssvejohyuqkgsfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013213.674553-426-278162347918264/AnsiballZ_stat.py
Dec 06 09:26:53 np0005548788.localdomain sudo[119754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:54 np0005548788.localdomain python3.9[119756]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:26:54 np0005548788.localdomain sudo[119754]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:54 np0005548788.localdomain sudo[119827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvcskeuftkysaboozkbtlswxlumgalax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013213.674553-426-278162347918264/AnsiballZ_copy.py
Dec 06 09:26:54 np0005548788.localdomain sudo[119827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:54 np0005548788.localdomain python3.9[119829]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013213.674553-426-278162347918264/.source.fact _original_basename=.wzjactlu follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:26:54 np0005548788.localdomain sudo[119827]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:55 np0005548788.localdomain python3.9[119919]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:26:56 np0005548788.localdomain sudo[120015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzbkfeotaedbanebueprmdalqvjurtzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013216.1989527-501-178437506917564/AnsiballZ_setup.py
Dec 06 09:26:56 np0005548788.localdomain sudo[120015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65163 DF PROTO=TCP SPT=56142 DPT=9102 SEQ=666720505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA3CF10000000001030307) 
Dec 06 09:26:56 np0005548788.localdomain python3.9[120017]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:26:57 np0005548788.localdomain sudo[120015]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:57 np0005548788.localdomain sudo[120069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvqezpobfkwlqbvrtrcxivairrgytbmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013216.1989527-501-178437506917564/AnsiballZ_dnf.py
Dec 06 09:26:57 np0005548788.localdomain sudo[120069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:57 np0005548788.localdomain python3.9[120071]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:26:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59030 DF PROTO=TCP SPT=41036 DPT=9100 SEQ=2489842493 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA45F00000000001030307) 
Dec 06 09:27:01 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:27:01 np0005548788.localdomain systemd-sysv-generator[120110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:27:01 np0005548788.localdomain systemd-rc-local-generator[120104]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:27:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:27:01 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:27:02 np0005548788.localdomain sudo[120069]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:03 np0005548788.localdomain sudo[120208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbzrqhsnlroylqthyioplwanohbeowmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013223.1459498-537-93991131197914/AnsiballZ_command.py
Dec 06 09:27:03 np0005548788.localdomain sudo[120208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:03 np0005548788.localdomain python3.9[120210]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:27:04 np0005548788.localdomain sudo[120208]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49434 DF PROTO=TCP SPT=54904 DPT=9101 SEQ=443636623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA5D2E0000000001030307) 
Dec 06 09:27:05 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60114 DF PROTO=TCP SPT=60806 DPT=9882 SEQ=1382884013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA5DF00000000001030307) 
Dec 06 09:27:05 np0005548788.localdomain sudo[120447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvryokqtfgfbgdjiopazhuopaznanblk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013225.270729-561-14932792388557/AnsiballZ_selinux.py
Dec 06 09:27:05 np0005548788.localdomain sudo[120447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:06 np0005548788.localdomain python3.9[120449]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 06 09:27:06 np0005548788.localdomain sudo[120447]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:06 np0005548788.localdomain sudo[120539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udjwwgalrdbrbdtbqcfencaxpndpspuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013226.5736566-594-41101578288736/AnsiballZ_command.py
Dec 06 09:27:06 np0005548788.localdomain sudo[120539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:06 np0005548788.localdomain python3.9[120541]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 06 09:27:07 np0005548788.localdomain sudo[120539]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49436 DF PROTO=TCP SPT=54904 DPT=9101 SEQ=443636623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA69300000000001030307) 
Dec 06 09:27:07 np0005548788.localdomain sudo[120632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqyvabeeomilltaoqrxjcldtwvfsllvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013227.688469-618-165098391771681/AnsiballZ_file.py
Dec 06 09:27:07 np0005548788.localdomain sudo[120632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:08 np0005548788.localdomain python3.9[120634]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:27:08 np0005548788.localdomain sudo[120632]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:08 np0005548788.localdomain sudo[120724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbogbekkysryvbkorrwxxgenhaphyrac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013228.3966722-642-88049828624871/AnsiballZ_mount.py
Dec 06 09:27:08 np0005548788.localdomain sudo[120724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:09 np0005548788.localdomain python3.9[120726]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 06 09:27:09 np0005548788.localdomain sudo[120724]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:10 np0005548788.localdomain sudo[120816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itcjhxrsbdgywfllgshwjzcfhvwsrdqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013229.9612987-726-153674527238268/AnsiballZ_file.py
Dec 06 09:27:10 np0005548788.localdomain sudo[120816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:10 np0005548788.localdomain python3.9[120818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:27:10 np0005548788.localdomain sudo[120816]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27765 DF PROTO=TCP SPT=46178 DPT=9105 SEQ=2719702939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA73F00000000001030307) 
Dec 06 09:27:10 np0005548788.localdomain sudo[120908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcndgotziaygrqtyntnlavzpxhuayakj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013230.6337285-750-15740440737123/AnsiballZ_stat.py
Dec 06 09:27:10 np0005548788.localdomain sudo[120908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:11 np0005548788.localdomain python3.9[120910]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:27:11 np0005548788.localdomain sudo[120908]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:11 np0005548788.localdomain sudo[120981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orlxarznkjuafzievgxhngairnwsvfum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013230.6337285-750-15740440737123/AnsiballZ_copy.py
Dec 06 09:27:11 np0005548788.localdomain sudo[120981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:11 np0005548788.localdomain python3.9[120983]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013230.6337285-750-15740440737123/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:27:11 np0005548788.localdomain sudo[120981]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:12 np0005548788.localdomain sudo[121073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rczfgzrrthmqsfhvxyqheqgavvefttsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013232.4113295-822-79468297066272/AnsiballZ_stat.py
Dec 06 09:27:12 np0005548788.localdomain sudo[121073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:12 np0005548788.localdomain python3.9[121075]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:27:12 np0005548788.localdomain sudo[121073]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58377 DF PROTO=TCP SPT=36764 DPT=9100 SEQ=35052410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA7F4F0000000001030307) 
Dec 06 09:27:14 np0005548788.localdomain sudo[121167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwuhhrtymxbwjgjjnczzuswepdpojpiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013233.561992-861-255788952785096/AnsiballZ_getent.py
Dec 06 09:27:14 np0005548788.localdomain sudo[121167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:14 np0005548788.localdomain python3.9[121169]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 06 09:27:14 np0005548788.localdomain sudo[121167]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:14 np0005548788.localdomain sshd[121185]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:15 np0005548788.localdomain sudo[121262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bchdevxghnkwjonaihkrfuyzduftidnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013234.9735265-891-96189069673114/AnsiballZ_getent.py
Dec 06 09:27:15 np0005548788.localdomain sudo[121262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:15 np0005548788.localdomain python3.9[121264]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 06 09:27:15 np0005548788.localdomain sudo[121262]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:16 np0005548788.localdomain sudo[121355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njhfdokmhmdfqsndwmtcynyxugpknndo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013236.0429604-915-221166163373447/AnsiballZ_group.py
Dec 06 09:27:16 np0005548788.localdomain sudo[121355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:16 np0005548788.localdomain python3.9[121357]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:27:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58379 DF PROTO=TCP SPT=36764 DPT=9100 SEQ=35052410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA8B710000000001030307) 
Dec 06 09:27:16 np0005548788.localdomain groupmod[121358]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Dec 06 09:27:16 np0005548788.localdomain groupmod[121358]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Dec 06 09:27:16 np0005548788.localdomain sudo[121355]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:16 np0005548788.localdomain sshd[121185]: Received disconnect from 36.50.177.119 port 37700:11: Bye Bye [preauth]
Dec 06 09:27:16 np0005548788.localdomain sshd[121185]: Disconnected from authenticating user root 36.50.177.119 port 37700 [preauth]
Dec 06 09:27:17 np0005548788.localdomain sudo[121453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvfjjrhjdmumlnlwvqxboqqgmhnpbmnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013236.9560006-942-52057661449485/AnsiballZ_file.py
Dec 06 09:27:17 np0005548788.localdomain sudo[121453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:17 np0005548788.localdomain python3.9[121455]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 06 09:27:17 np0005548788.localdomain sudo[121453]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:18 np0005548788.localdomain sudo[121545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wteqvabpmljpxyncfasqowgaebsnawzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013237.8726768-975-242488960755734/AnsiballZ_dnf.py
Dec 06 09:27:18 np0005548788.localdomain sudo[121545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:18 np0005548788.localdomain python3.9[121547]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:27:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20251 DF PROTO=TCP SPT=37542 DPT=9102 SEQ=118590925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DA96570000000001030307) 
Dec 06 09:27:20 np0005548788.localdomain sudo[121550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:27:20 np0005548788.localdomain sudo[121550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:27:20 np0005548788.localdomain sudo[121550]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:20 np0005548788.localdomain sudo[121565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:27:20 np0005548788.localdomain sudo[121565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:27:21 np0005548788.localdomain sudo[121565]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:21 np0005548788.localdomain sudo[121545]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:21 np0005548788.localdomain sudo[121701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzlwyyecvemvjqvkmmqksszdyxyysfqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013241.624052-999-180650466499162/AnsiballZ_file.py
Dec 06 09:27:21 np0005548788.localdomain sudo[121701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:21 np0005548788.localdomain sudo[121704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:27:21 np0005548788.localdomain sudo[121704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:27:21 np0005548788.localdomain sudo[121704]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:22 np0005548788.localdomain python3.9[121703]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:27:22 np0005548788.localdomain sudo[121701]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20253 DF PROTO=TCP SPT=37542 DPT=9102 SEQ=118590925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DAA2700000000001030307) 
Dec 06 09:27:22 np0005548788.localdomain sudo[121808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqreanuqwwggjamzpqwrpkpwblsmjnmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013242.2821205-1023-90693681557721/AnsiballZ_stat.py
Dec 06 09:27:22 np0005548788.localdomain sudo[121808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:22 np0005548788.localdomain python3.9[121810]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:27:22 np0005548788.localdomain sudo[121808]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:23 np0005548788.localdomain sudo[121881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwcyvmaccagtlrsntxnpgthghuxkwhdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013242.2821205-1023-90693681557721/AnsiballZ_copy.py
Dec 06 09:27:23 np0005548788.localdomain sudo[121881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:23 np0005548788.localdomain python3.9[121883]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013242.2821205-1023-90693681557721/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:27:23 np0005548788.localdomain sudo[121881]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20254 DF PROTO=TCP SPT=37542 DPT=9102 SEQ=118590925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DAB2300000000001030307) 
Dec 06 09:27:28 np0005548788.localdomain sudo[121973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwncbslyyxpgvpdldcblqogvdwtbjmaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013248.012338-1068-276997798270379/AnsiballZ_systemd.py
Dec 06 09:27:28 np0005548788.localdomain sudo[121973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:28 np0005548788.localdomain python3.9[121975]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:27:28 np0005548788.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 09:27:28 np0005548788.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 06 09:27:28 np0005548788.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 06 09:27:28 np0005548788.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 06 09:27:28 np0005548788.localdomain systemd-modules-load[121979]: Module 'msr' is built in
Dec 06 09:27:28 np0005548788.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 06 09:27:29 np0005548788.localdomain sudo[121973]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58381 DF PROTO=TCP SPT=36764 DPT=9100 SEQ=35052410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DABBF00000000001030307) 
Dec 06 09:27:29 np0005548788.localdomain sshd[121995]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:30 np0005548788.localdomain sudo[122072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkdjhftxdjoupppdrjugdpmtomqteizq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013249.987948-1092-270417842634476/AnsiballZ_stat.py
Dec 06 09:27:30 np0005548788.localdomain sudo[122072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:30 np0005548788.localdomain sshd[121995]: Received disconnect from 45.119.84.54 port 33298:11: Bye Bye [preauth]
Dec 06 09:27:30 np0005548788.localdomain sshd[121995]: Disconnected from authenticating user root 45.119.84.54 port 33298 [preauth]
Dec 06 09:27:33 np0005548788.localdomain python3.9[122074]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:27:33 np0005548788.localdomain sudo[122072]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:33 np0005548788.localdomain sudo[122146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pleylwmjhbtjdtrwisesplixwahbxnol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013249.987948-1092-270417842634476/AnsiballZ_copy.py
Dec 06 09:27:33 np0005548788.localdomain sudo[122146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20255 DF PROTO=TCP SPT=37542 DPT=9102 SEQ=118590925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DAD1F00000000001030307) 
Dec 06 09:27:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8069 DF PROTO=TCP SPT=37596 DPT=9101 SEQ=2245759777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DAD25E0000000001030307) 
Dec 06 09:27:35 np0005548788.localdomain python3.9[122148]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013249.987948-1092-270417842634476/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:27:35 np0005548788.localdomain sudo[122146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:36 np0005548788.localdomain sudo[122238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzcrketrecnaofkohtajbfywsnuhsioi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013255.8632886-1146-89515852259256/AnsiballZ_dnf.py
Dec 06 09:27:36 np0005548788.localdomain sudo[122238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:36 np0005548788.localdomain python3.9[122240]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:27:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8071 DF PROTO=TCP SPT=37596 DPT=9101 SEQ=2245759777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DADE700000000001030307) 
Dec 06 09:27:39 np0005548788.localdomain sudo[122238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49336 DF PROTO=TCP SPT=57602 DPT=9105 SEQ=1365862696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DAE9F10000000001030307) 
Dec 06 09:27:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42246 DF PROTO=TCP SPT=34432 DPT=9100 SEQ=3576815518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DAF4800000000001030307) 
Dec 06 09:27:45 np0005548788.localdomain python3.9[122332]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:27:45 np0005548788.localdomain python3.9[122424]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 06 09:27:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42248 DF PROTO=TCP SPT=34432 DPT=9100 SEQ=3576815518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB00700000000001030307) 
Dec 06 09:27:47 np0005548788.localdomain python3.9[122514]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:27:47 np0005548788.localdomain sshd[122529]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:48 np0005548788.localdomain sudo[122606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcvrffitglakkodvanbrhrostvmlhsdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013268.2253542-1269-160128550482044/AnsiballZ_systemd.py
Dec 06 09:27:48 np0005548788.localdomain sudo[122606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:48 np0005548788.localdomain python3.9[122608]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:27:48 np0005548788.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 06 09:27:48 np0005548788.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 06 09:27:48 np0005548788.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 06 09:27:48 np0005548788.localdomain systemd[1]: tuned.service: Consumed 1.908s CPU time, no IO.
Dec 06 09:27:48 np0005548788.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 09:27:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27144 DF PROTO=TCP SPT=37682 DPT=9882 SEQ=3726360289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB0C2A0000000001030307) 
Dec 06 09:27:50 np0005548788.localdomain sshd[122529]: Received disconnect from 45.78.219.195 port 41468:11: Bye Bye [preauth]
Dec 06 09:27:50 np0005548788.localdomain sshd[122529]: Disconnected from authenticating user root 45.78.219.195 port 41468 [preauth]
Dec 06 09:27:50 np0005548788.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 09:27:50 np0005548788.localdomain sudo[122606]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34390 DF PROTO=TCP SPT=52080 DPT=9102 SEQ=4159781177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB17710000000001030307) 
Dec 06 09:27:52 np0005548788.localdomain python3.9[122710]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 06 09:27:56 np0005548788.localdomain sudo[122800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kreyblqwwxkxagbgibljhtawvqonlyao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013276.0058193-1440-78968465601507/AnsiballZ_systemd.py
Dec 06 09:27:56 np0005548788.localdomain sudo[122800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34391 DF PROTO=TCP SPT=52080 DPT=9102 SEQ=4159781177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB27300000000001030307) 
Dec 06 09:27:56 np0005548788.localdomain python3.9[122802]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:27:57 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:27:57 np0005548788.localdomain systemd-rc-local-generator[122828]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:27:57 np0005548788.localdomain systemd-sysv-generator[122832]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:27:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:27:58 np0005548788.localdomain sudo[122800]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:58 np0005548788.localdomain sudo[122930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtsmuoaabanhxariyviekeynwrzqdulh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013278.1313908-1440-55011314669556/AnsiballZ_systemd.py
Dec 06 09:27:58 np0005548788.localdomain sudo[122930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:58 np0005548788.localdomain python3.9[122932]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:27:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42250 DF PROTO=TCP SPT=34432 DPT=9100 SEQ=3576815518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB2FF00000000001030307) 
Dec 06 09:27:58 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:27:58 np0005548788.localdomain systemd-rc-local-generator[122962]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:27:58 np0005548788.localdomain systemd-sysv-generator[122965]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:27:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:27:59 np0005548788.localdomain sudo[122930]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:59 np0005548788.localdomain sudo[123061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxnuvtgvwxjiipgbpudvtntgsomaznld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013279.3637388-1488-157026244560381/AnsiballZ_command.py
Dec 06 09:27:59 np0005548788.localdomain sudo[123061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:59 np0005548788.localdomain python3.9[123063]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:27:59 np0005548788.localdomain sudo[123061]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:00 np0005548788.localdomain sudo[123154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikzhruglnbutxezmdfcuwfzdtuupcndo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013280.0640159-1512-17412297250697/AnsiballZ_command.py
Dec 06 09:28:00 np0005548788.localdomain sudo[123154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:00 np0005548788.localdomain python3.9[123156]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:00 np0005548788.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Dec 06 09:28:00 np0005548788.localdomain sudo[123154]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:00 np0005548788.localdomain sudo[123247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vngsjlodylntkwhzqccbwzpzqcimelqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013280.7453926-1536-69912358707842/AnsiballZ_command.py
Dec 06 09:28:00 np0005548788.localdomain sudo[123247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:01 np0005548788.localdomain python3.9[123249]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:02 np0005548788.localdomain sudo[123247]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:03 np0005548788.localdomain sudo[123346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oukxaaviocxbxhkxkiigsyrkmfosmkdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013283.1367755-1560-275647223944947/AnsiballZ_command.py
Dec 06 09:28:03 np0005548788.localdomain sudo[123346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:03 np0005548788.localdomain python3.9[123348]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:03 np0005548788.localdomain sudo[123346]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:04 np0005548788.localdomain sudo[123439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqvucgwpeokumfqwczbdsjhfoyxhegsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013283.8078592-1584-117319137433773/AnsiballZ_systemd.py
Dec 06 09:28:04 np0005548788.localdomain sudo[123439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:04 np0005548788.localdomain python3.9[123441]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:28:04 np0005548788.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 06 09:28:04 np0005548788.localdomain systemd[1]: Stopped Apply Kernel Variables.
Dec 06 09:28:04 np0005548788.localdomain systemd[1]: Stopping Apply Kernel Variables...
Dec 06 09:28:04 np0005548788.localdomain systemd[1]: Starting Apply Kernel Variables...
Dec 06 09:28:04 np0005548788.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 06 09:28:04 np0005548788.localdomain systemd[1]: Finished Apply Kernel Variables.
Dec 06 09:28:04 np0005548788.localdomain sudo[123439]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59460 DF PROTO=TCP SPT=35908 DPT=9101 SEQ=2525242225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB478E0000000001030307) 
Dec 06 09:28:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27148 DF PROTO=TCP SPT=37682 DPT=9882 SEQ=3726360289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB47F00000000001030307) 
Dec 06 09:28:05 np0005548788.localdomain sshd[117307]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:28:05 np0005548788.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Dec 06 09:28:05 np0005548788.localdomain systemd[1]: session-39.scope: Consumed 2min 169ms CPU time.
Dec 06 09:28:05 np0005548788.localdomain systemd-logind[765]: Session 39 logged out. Waiting for processes to exit.
Dec 06 09:28:05 np0005548788.localdomain systemd-logind[765]: Removed session 39.
Dec 06 09:28:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59462 DF PROTO=TCP SPT=35908 DPT=9101 SEQ=2525242225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB53B10000000001030307) 
Dec 06 09:28:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22282 DF PROTO=TCP SPT=46106 DPT=9105 SEQ=466839694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB5DF00000000001030307) 
Dec 06 09:28:13 np0005548788.localdomain sshd[123461]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:13 np0005548788.localdomain sshd[123461]: Accepted publickey for zuul from 192.168.122.30 port 46380 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:28:13 np0005548788.localdomain systemd-logind[765]: New session 40 of user zuul.
Dec 06 09:28:13 np0005548788.localdomain systemd[1]: Started Session 40 of User zuul.
Dec 06 09:28:13 np0005548788.localdomain sshd[123461]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:28:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42964 DF PROTO=TCP SPT=39092 DPT=9100 SEQ=2508754225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB69B00000000001030307) 
Dec 06 09:28:14 np0005548788.localdomain python3.9[123554]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:15 np0005548788.localdomain python3.9[123648]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42966 DF PROTO=TCP SPT=39092 DPT=9100 SEQ=2508754225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB75B10000000001030307) 
Dec 06 09:28:18 np0005548788.localdomain sudo[123742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvafbsbpfxevwoitedltcrsyntnnitpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013298.46752-110-172093423344425/AnsiballZ_command.py
Dec 06 09:28:18 np0005548788.localdomain sudo[123742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:19 np0005548788.localdomain python3.9[123744]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:19 np0005548788.localdomain sudo[123742]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62394 DF PROTO=TCP SPT=35852 DPT=9882 SEQ=2321901592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB81590000000001030307) 
Dec 06 09:28:20 np0005548788.localdomain python3.9[123835]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:20 np0005548788.localdomain sudo[123929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcgetowhiowylidrlgmpwxknvoaughuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013300.5782702-170-200860640880058/AnsiballZ_setup.py
Dec 06 09:28:20 np0005548788.localdomain sudo[123929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:21 np0005548788.localdomain python3.9[123931]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:28:21 np0005548788.localdomain sudo[123929]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:22 np0005548788.localdomain sudo[123983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfiigbevshtqhdpakbdrxqeushmfusex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013300.5782702-170-200860640880058/AnsiballZ_dnf.py
Dec 06 09:28:22 np0005548788.localdomain sudo[123983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:22 np0005548788.localdomain sudo[123985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:28:22 np0005548788.localdomain sudo[123985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:28:22 np0005548788.localdomain sudo[123985]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:22 np0005548788.localdomain sudo[124001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:28:22 np0005548788.localdomain sudo[124001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:28:22 np0005548788.localdomain python3.9[123995]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:28:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35275 DF PROTO=TCP SPT=60032 DPT=9102 SEQ=3686365053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB8CB00000000001030307) 
Dec 06 09:28:22 np0005548788.localdomain sudo[124001]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:24 np0005548788.localdomain sudo[124050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:28:24 np0005548788.localdomain sudo[124050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:28:24 np0005548788.localdomain sudo[124050]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:25 np0005548788.localdomain sudo[123983]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:25 np0005548788.localdomain sudo[124154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuintxynlrsfitvylboqwgieddpkjoji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013305.5577958-206-717658892008/AnsiballZ_setup.py
Dec 06 09:28:25 np0005548788.localdomain sudo[124154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:26 np0005548788.localdomain python3.9[124156]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:28:26 np0005548788.localdomain sudo[124154]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35276 DF PROTO=TCP SPT=60032 DPT=9102 SEQ=3686365053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DB9C700000000001030307) 
Dec 06 09:28:27 np0005548788.localdomain sudo[124301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udiddfvzimmzilfyojpimuconlgyccup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013306.764804-239-250093415186997/AnsiballZ_file.py
Dec 06 09:28:27 np0005548788.localdomain sudo[124301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:27 np0005548788.localdomain python3.9[124303]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:28:27 np0005548788.localdomain sudo[124301]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:28 np0005548788.localdomain sudo[124393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucptgjaykktxmmuuwkdsbqxrlbwzsbbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013307.534461-263-40728074688871/AnsiballZ_command.py
Dec 06 09:28:28 np0005548788.localdomain sudo[124393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:28 np0005548788.localdomain python3.9[124395]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:28 np0005548788.localdomain sudo[124393]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42968 DF PROTO=TCP SPT=39092 DPT=9100 SEQ=2508754225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DBA5F00000000001030307) 
Dec 06 09:28:29 np0005548788.localdomain sudo[124496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqhnflfptmdfctouvhrxocamgisybcqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013308.7564003-287-229692389647061/AnsiballZ_stat.py
Dec 06 09:28:29 np0005548788.localdomain sudo[124496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:29 np0005548788.localdomain python3.9[124498]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:28:29 np0005548788.localdomain sudo[124496]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:29 np0005548788.localdomain sudo[124544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exrddtiinwlcgsasckodelzvxlbcghil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013308.7564003-287-229692389647061/AnsiballZ_file.py
Dec 06 09:28:29 np0005548788.localdomain sudo[124544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:29 np0005548788.localdomain python3.9[124546]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:28:29 np0005548788.localdomain sudo[124544]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:30 np0005548788.localdomain sudo[124636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uokkyvqtpfmjkiywihgiqeowoybhbhat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013310.0678248-323-189113904234758/AnsiballZ_stat.py
Dec 06 09:28:30 np0005548788.localdomain sudo[124636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:30 np0005548788.localdomain python3.9[124638]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:28:30 np0005548788.localdomain sudo[124636]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:31 np0005548788.localdomain sudo[124709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrkdhnhlbdqcbpdlufnmddeuwnnikqoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013310.0678248-323-189113904234758/AnsiballZ_copy.py
Dec 06 09:28:31 np0005548788.localdomain sudo[124709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:31 np0005548788.localdomain python3.9[124711]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013310.0678248-323-189113904234758/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:31 np0005548788.localdomain sudo[124709]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:31 np0005548788.localdomain sudo[124801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqmyaqkgaiamkrfhhqhtfehzpccvzhje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013311.522502-371-160512699654860/AnsiballZ_ini_file.py
Dec 06 09:28:31 np0005548788.localdomain sudo[124801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:33 np0005548788.localdomain python3.9[124803]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:33 np0005548788.localdomain sudo[124801]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:34 np0005548788.localdomain sudo[124893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvcauhewletfxpolwrsuzimvpzbymcer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013313.8975093-371-65680137688147/AnsiballZ_ini_file.py
Dec 06 09:28:34 np0005548788.localdomain sudo[124893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:34 np0005548788.localdomain python3.9[124895]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:34 np0005548788.localdomain sudo[124893]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35277 DF PROTO=TCP SPT=60032 DPT=9102 SEQ=3686365053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DBBBF00000000001030307) 
Dec 06 09:28:34 np0005548788.localdomain sudo[124985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeoyffwuvjrtbecbbjldbwurhkriwtjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013314.4690726-371-129599501508031/AnsiballZ_ini_file.py
Dec 06 09:28:34 np0005548788.localdomain sudo[124985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49901 DF PROTO=TCP SPT=51838 DPT=9101 SEQ=2950576039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DBBCBE0000000001030307) 
Dec 06 09:28:34 np0005548788.localdomain python3.9[124987]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:34 np0005548788.localdomain sudo[124985]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:35 np0005548788.localdomain sudo[125077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtvcfcmvndibzzlxxnhevbifvrbmjyed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013315.0251079-371-169212376920866/AnsiballZ_ini_file.py
Dec 06 09:28:35 np0005548788.localdomain sudo[125077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:35 np0005548788.localdomain python3.9[125079]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:35 np0005548788.localdomain sudo[125077]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:37 np0005548788.localdomain python3.9[125169]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:37 np0005548788.localdomain sudo[125261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdniiktpjiytlndyxbsukjwqknwuvfsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013317.3348022-491-200230885751833/AnsiballZ_dnf.py
Dec 06 09:28:37 np0005548788.localdomain sudo[125261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:37 np0005548788.localdomain python3.9[125263]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49903 DF PROTO=TCP SPT=51838 DPT=9101 SEQ=2950576039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DBC8B10000000001030307) 
Dec 06 09:28:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54018 DF PROTO=TCP SPT=59568 DPT=9105 SEQ=2684817798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DBD3F00000000001030307) 
Dec 06 09:28:40 np0005548788.localdomain sudo[125261]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:41 np0005548788.localdomain sudo[125355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkkockrhbomolmkqcluutdmyymaavykd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013321.2424088-515-143280285938395/AnsiballZ_dnf.py
Dec 06 09:28:41 np0005548788.localdomain sudo[125355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:41 np0005548788.localdomain python3.9[125357]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51223 DF PROTO=TCP SPT=46414 DPT=9100 SEQ=3566574648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DBDEDF0000000001030307) 
Dec 06 09:28:44 np0005548788.localdomain sudo[125355]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:45 np0005548788.localdomain sudo[125449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prqrdnytuqsjwhbiuozbbsgqbqqxqfbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013325.1343598-545-132800891972796/AnsiballZ_dnf.py
Dec 06 09:28:45 np0005548788.localdomain sudo[125449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:45 np0005548788.localdomain python3.9[125451]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51225 DF PROTO=TCP SPT=46414 DPT=9100 SEQ=3566574648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DBEAF00000000001030307) 
Dec 06 09:28:48 np0005548788.localdomain sudo[125449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:49 np0005548788.localdomain sudo[125549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwrfvikzvticrrpjosouuuerkcztxmaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013329.1655078-572-100607496498740/AnsiballZ_dnf.py
Dec 06 09:28:49 np0005548788.localdomain sudo[125549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57847 DF PROTO=TCP SPT=42998 DPT=9882 SEQ=1853203280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DBF6890000000001030307) 
Dec 06 09:28:49 np0005548788.localdomain python3.9[125551]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59380 DF PROTO=TCP SPT=60212 DPT=9102 SEQ=140242520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC01F00000000001030307) 
Dec 06 09:28:52 np0005548788.localdomain sudo[125549]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:53 np0005548788.localdomain sudo[125643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgrmxhkrvpjessgrknrxufigdztrfbas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013333.2061641-608-97255516394571/AnsiballZ_dnf.py
Dec 06 09:28:53 np0005548788.localdomain sudo[125643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:53 np0005548788.localdomain python3.9[125645]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59381 DF PROTO=TCP SPT=60212 DPT=9102 SEQ=140242520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC11B00000000001030307) 
Dec 06 09:28:56 np0005548788.localdomain sudo[125643]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:57 np0005548788.localdomain sudo[125737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dugewmwirjltnmiotstjvgcegsppnlmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013337.3360653-635-23658719692387/AnsiballZ_dnf.py
Dec 06 09:28:57 np0005548788.localdomain sudo[125737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:57 np0005548788.localdomain python3.9[125739]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51227 DF PROTO=TCP SPT=46414 DPT=9100 SEQ=3566574648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC1BF00000000001030307) 
Dec 06 09:29:00 np0005548788.localdomain sudo[125737]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:01 np0005548788.localdomain sudo[125831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggikyloxfskiewgqpvcgdyyqdelmezrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013341.6260421-662-73522707474077/AnsiballZ_dnf.py
Dec 06 09:29:01 np0005548788.localdomain sudo[125831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:02 np0005548788.localdomain python3.9[125833]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:29:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45654 DF PROTO=TCP SPT=59922 DPT=9101 SEQ=3615479829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC31EE0000000001030307) 
Dec 06 09:29:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59382 DF PROTO=TCP SPT=60212 DPT=9102 SEQ=140242520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC31F00000000001030307) 
Dec 06 09:29:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45656 DF PROTO=TCP SPT=59922 DPT=9101 SEQ=3615479829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC3DF00000000001030307) 
Dec 06 09:29:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39868 DF PROTO=TCP SPT=57776 DPT=9105 SEQ=3388668665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC47F10000000001030307) 
Dec 06 09:29:12 np0005548788.localdomain sudo[125831]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:13 np0005548788.localdomain sudo[125998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smtfxwrfcxseikokmvugxqiiuzxciykb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013352.7915282-698-30967829454529/AnsiballZ_file.py
Dec 06 09:29:13 np0005548788.localdomain sudo[125998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:13 np0005548788.localdomain python3.9[126000]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:29:13 np0005548788.localdomain sudo[125998]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61672 DF PROTO=TCP SPT=36168 DPT=9100 SEQ=1373369827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC540F0000000001030307) 
Dec 06 09:29:13 np0005548788.localdomain sudo[126103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-givrpvlaellbtwdjuukponjqvlgglzby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013353.434076-722-233868047829812/AnsiballZ_stat.py
Dec 06 09:29:13 np0005548788.localdomain sudo[126103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:13 np0005548788.localdomain python3.9[126105]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:29:13 np0005548788.localdomain sudo[126103]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:14 np0005548788.localdomain sudo[126176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zowvugcilaueusgfljklfembjueyawjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013353.434076-722-233868047829812/AnsiballZ_copy.py
Dec 06 09:29:14 np0005548788.localdomain sudo[126176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:14 np0005548788.localdomain python3.9[126178]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765013353.434076-722-233868047829812/.source.json _original_basename=.5czwopkc follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:29:14 np0005548788.localdomain sudo[126176]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:15 np0005548788.localdomain sudo[126268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwchmuusrtibuibwhymsubehlvxijtux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013354.9338732-776-142452014914431/AnsiballZ_podman_image.py
Dec 06 09:29:15 np0005548788.localdomain sudo[126268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:15 np0005548788.localdomain python3.9[126270]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61674 DF PROTO=TCP SPT=36168 DPT=9100 SEQ=1373369827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC60300000000001030307) 
Dec 06 09:29:16 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 77.5 (258 of 333 items), suggesting rotation.
Dec 06 09:29:16 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:29:16 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:29:16 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:29:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26046 DF PROTO=TCP SPT=60614 DPT=9882 SEQ=714661147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC6BBA0000000001030307) 
Dec 06 09:29:19 np0005548788.localdomain sshd[126310]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:21 np0005548788.localdomain podman[126283]: 2025-12-06 09:29:15.713720416 +0000 UTC m=+0.048191617 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 06 09:29:22 np0005548788.localdomain sudo[126268]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39029 DF PROTO=TCP SPT=41456 DPT=9102 SEQ=1465362649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC77300000000001030307) 
Dec 06 09:29:22 np0005548788.localdomain sshd[126310]: Connection closed by 101.47.142.76 port 41752 [preauth]
Dec 06 09:29:23 np0005548788.localdomain sshd[126472]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:23 np0005548788.localdomain sudo[126483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvajgwvlfkubblzwdgadlnudwpnemrgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013362.6330283-809-120477657257102/AnsiballZ_podman_image.py
Dec 06 09:29:23 np0005548788.localdomain sudo[126483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:23 np0005548788.localdomain python3.9[126485]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:24 np0005548788.localdomain sshd[126472]: Received disconnect from 148.227.3.232 port 50164:11: Bye Bye [preauth]
Dec 06 09:29:24 np0005548788.localdomain sshd[126472]: Disconnected from authenticating user root 148.227.3.232 port 50164 [preauth]
Dec 06 09:29:24 np0005548788.localdomain sudo[126510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:29:24 np0005548788.localdomain sudo[126510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:24 np0005548788.localdomain sudo[126510]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:24 np0005548788.localdomain sudo[126525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:29:24 np0005548788.localdomain sudo[126525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:24 np0005548788.localdomain sudo[126525]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:25 np0005548788.localdomain sudo[126573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:29:25 np0005548788.localdomain sudo[126573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:25 np0005548788.localdomain sudo[126573]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:25 np0005548788.localdomain sudo[126588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 09:29:25 np0005548788.localdomain sudo[126588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39030 DF PROTO=TCP SPT=41456 DPT=9102 SEQ=1465362649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC86F10000000001030307) 
Dec 06 09:29:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61676 DF PROTO=TCP SPT=36168 DPT=9100 SEQ=1373369827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DC8FF00000000001030307) 
Dec 06 09:29:30 np0005548788.localdomain sudo[126588]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:31 np0005548788.localdomain sudo[126656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:29:31 np0005548788.localdomain sudo[126656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:31 np0005548788.localdomain sudo[126656]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:31 np0005548788.localdomain podman[126497]: 2025-12-06 09:29:23.798053821 +0000 UTC m=+0.050425606 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:29:31 np0005548788.localdomain sudo[126483]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:32 np0005548788.localdomain sudo[126819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flqnzvnjqejkeztwlyjdgilwqvmecmkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013372.2658746-845-207630235927276/AnsiballZ_podman_image.py
Dec 06 09:29:32 np0005548788.localdomain sudo[126819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:32 np0005548788.localdomain python3.9[126821]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:34 np0005548788.localdomain podman[126835]: 2025-12-06 09:29:32.865433935 +0000 UTC m=+0.053030768 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 06 09:29:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45327 DF PROTO=TCP SPT=40086 DPT=9101 SEQ=89893756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCA71E0000000001030307) 
Dec 06 09:29:34 np0005548788.localdomain sudo[126819]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26050 DF PROTO=TCP SPT=60614 DPT=9882 SEQ=714661147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCA7F00000000001030307) 
Dec 06 09:29:35 np0005548788.localdomain sudo[126997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfbyvoyzpobqqaktgnhpfsjdntihqgye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013375.1582732-872-193373033422957/AnsiballZ_podman_image.py
Dec 06 09:29:35 np0005548788.localdomain sudo[126997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:35 np0005548788.localdomain python3.9[126999]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:36 np0005548788.localdomain podman[127011]: 2025-12-06 09:29:35.81298549 +0000 UTC m=+0.051356196 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 09:29:37 np0005548788.localdomain sudo[126997]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:37 np0005548788.localdomain sudo[127175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjvksxvyljvzgtqadnyyhxubfkoylync ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013377.441565-899-242997892868533/AnsiballZ_podman_image.py
Dec 06 09:29:37 np0005548788.localdomain sudo[127175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45329 DF PROTO=TCP SPT=40086 DPT=9101 SEQ=89893756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCB3300000000001030307) 
Dec 06 09:29:37 np0005548788.localdomain python3.9[127177]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52483 DF PROTO=TCP SPT=49454 DPT=9105 SEQ=2430289144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCBDF00000000001030307) 
Dec 06 09:29:41 np0005548788.localdomain podman[127191]: 2025-12-06 09:29:38.044748424 +0000 UTC m=+0.034615385 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 06 09:29:41 np0005548788.localdomain sudo[127175]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:42 np0005548788.localdomain sudo[127365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsgiwndyllceskjruihkciktjwdjlvdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013381.8378034-899-251992442617958/AnsiballZ_podman_image.py
Dec 06 09:29:42 np0005548788.localdomain sudo[127365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:42 np0005548788.localdomain python3.9[127367]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37641 DF PROTO=TCP SPT=49254 DPT=9100 SEQ=2647139657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCC9400000000001030307) 
Dec 06 09:29:44 np0005548788.localdomain podman[127379]: 2025-12-06 09:29:42.436907387 +0000 UTC m=+0.048738074 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 06 09:29:44 np0005548788.localdomain sudo[127365]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:45 np0005548788.localdomain sshd[123461]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:29:45 np0005548788.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Dec 06 09:29:45 np0005548788.localdomain systemd[1]: session-40.scope: Consumed 1min 31.295s CPU time.
Dec 06 09:29:45 np0005548788.localdomain systemd-logind[765]: Session 40 logged out. Waiting for processes to exit.
Dec 06 09:29:45 np0005548788.localdomain systemd-logind[765]: Removed session 40.
Dec 06 09:29:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37643 DF PROTO=TCP SPT=49254 DPT=9100 SEQ=2647139657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCD5300000000001030307) 
Dec 06 09:29:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25480 DF PROTO=TCP SPT=46538 DPT=9882 SEQ=3764767136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCE0E90000000001030307) 
Dec 06 09:29:50 np0005548788.localdomain sshd[127489]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:51 np0005548788.localdomain sshd[127489]: Accepted publickey for zuul from 192.168.122.30 port 53502 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:29:51 np0005548788.localdomain systemd-logind[765]: New session 41 of user zuul.
Dec 06 09:29:51 np0005548788.localdomain systemd[1]: Started Session 41 of User zuul.
Dec 06 09:29:51 np0005548788.localdomain sshd[127489]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:29:52 np0005548788.localdomain python3.9[127582]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:29:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64353 DF PROTO=TCP SPT=52316 DPT=9102 SEQ=934245886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCEC300000000001030307) 
Dec 06 09:29:53 np0005548788.localdomain sudo[127932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fntenksdmsfwdbqfvcrqwzybortxpvdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013392.705539-68-188295923577007/AnsiballZ_getent.py
Dec 06 09:29:53 np0005548788.localdomain sudo[127932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:54 np0005548788.localdomain python3.9[127934]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 06 09:29:54 np0005548788.localdomain sudo[127932]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:54 np0005548788.localdomain sudo[128025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xiyoljmsmuzqgytjdeatgxsamiaeoeig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013394.6469035-104-9175383792404/AnsiballZ_setup.py
Dec 06 09:29:54 np0005548788.localdomain sudo[128025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:55 np0005548788.localdomain python3.9[128027]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:29:55 np0005548788.localdomain sudo[128025]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:56 np0005548788.localdomain sudo[128079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwmzvtrinndzptiybtespikpxcthabpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013394.6469035-104-9175383792404/AnsiballZ_dnf.py
Dec 06 09:29:56 np0005548788.localdomain sudo[128079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:56 np0005548788.localdomain python3.9[128081]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:29:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64354 DF PROTO=TCP SPT=52316 DPT=9102 SEQ=934245886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DCFBF00000000001030307) 
Dec 06 09:29:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37645 DF PROTO=TCP SPT=49254 DPT=9100 SEQ=2647139657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD05F00000000001030307) 
Dec 06 09:29:59 np0005548788.localdomain sudo[128079]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:00 np0005548788.localdomain sudo[128331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezbhwszskppymqhoulafnmftdwwbihhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013400.3297827-146-81091867751964/AnsiballZ_dnf.py
Dec 06 09:30:00 np0005548788.localdomain sudo[128331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:00 np0005548788.localdomain python3.9[128333]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:30:03 np0005548788.localdomain sudo[128331]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64355 DF PROTO=TCP SPT=52316 DPT=9102 SEQ=934245886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD1BF00000000001030307) 
Dec 06 09:30:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=765 DF PROTO=TCP SPT=53574 DPT=9101 SEQ=3769703905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD1C4E0000000001030307) 
Dec 06 09:30:05 np0005548788.localdomain sudo[128568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipgqancirxoajndpwksciowaeondkeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013404.2535253-170-226110764221266/AnsiballZ_systemd.py
Dec 06 09:30:05 np0005548788.localdomain sudo[128568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:05 np0005548788.localdomain python3.9[128570]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:30:06 np0005548788.localdomain sudo[128568]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=767 DF PROTO=TCP SPT=53574 DPT=9101 SEQ=3769703905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD28700000000001030307) 
Dec 06 09:30:08 np0005548788.localdomain python3.9[128663]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:30:09 np0005548788.localdomain sudo[128753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzwlgwofwdyctzviyzjabaxzypfwlsvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013409.1707945-224-127779886365481/AnsiballZ_sefcontext.py
Dec 06 09:30:09 np0005548788.localdomain sudo[128753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:30:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 5302 writes, 23K keys, 5302 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5302 writes, 773 syncs, 6.86 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:30:09 np0005548788.localdomain python3.9[128755]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 06 09:30:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19913 DF PROTO=TCP SPT=57556 DPT=9105 SEQ=1253317147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD33F00000000001030307) 
Dec 06 09:30:11 np0005548788.localdomain kernel: SELinux:  Converting 2743 SID table entries...
Dec 06 09:30:11 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:30:11 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:30:11 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:30:11 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:30:11 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:30:11 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:30:11 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:30:11 np0005548788.localdomain sudo[128753]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:12 np0005548788.localdomain python3.9[129117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:30:13 np0005548788.localdomain sudo[129213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuopcuoeevvrkmmdtdcaeuwppupclvmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013413.24843-278-233275748334616/AnsiballZ_dnf.py
Dec 06 09:30:13 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Dec 06 09:30:13 np0005548788.localdomain sudo[129213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38467 DF PROTO=TCP SPT=51548 DPT=9100 SEQ=1719186519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD3E700000000001030307) 
Dec 06 09:30:13 np0005548788.localdomain python3.9[129215]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:30:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:30:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.2 total, 600.0 interval
                                                          Cumulative writes: 5340 writes, 23K keys, 5340 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5340 writes, 664 syncs, 8.04 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:30:15 np0005548788.localdomain sshd[129218]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38469 DF PROTO=TCP SPT=51548 DPT=9100 SEQ=1719186519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD4A710000000001030307) 
Dec 06 09:30:17 np0005548788.localdomain sudo[129213]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:18 np0005548788.localdomain sudo[129309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmaocfchiazxscxclefphecbbupllums ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013417.977581-302-101051529597639/AnsiballZ_command.py
Dec 06 09:30:18 np0005548788.localdomain sudo[129309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:18 np0005548788.localdomain python3.9[129311]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:30:19 np0005548788.localdomain sudo[129309]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51098 DF PROTO=TCP SPT=37978 DPT=9882 SEQ=4150725745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD561A0000000001030307) 
Dec 06 09:30:20 np0005548788.localdomain sudo[129554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqhjgkwateogktqsezeeetgrkbydvono ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013419.5848312-326-58664487243112/AnsiballZ_file.py
Dec 06 09:30:20 np0005548788.localdomain sudo[129554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:20 np0005548788.localdomain python3.9[129556]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:30:20 np0005548788.localdomain sudo[129554]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:20 np0005548788.localdomain python3.9[129646]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:30:21 np0005548788.localdomain sudo[129738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhabmitznqtlqpiqrnxnilutvjcfoogh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013421.1571665-380-86731562438477/AnsiballZ_dnf.py
Dec 06 09:30:21 np0005548788.localdomain sudo[129738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:21 np0005548788.localdomain python3.9[129740]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:30:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40391 DF PROTO=TCP SPT=34516 DPT=9102 SEQ=4225248233 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD61700000000001030307) 
Dec 06 09:30:25 np0005548788.localdomain sudo[129738]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:25 np0005548788.localdomain sudo[129832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiotfxtgyvsliyrziayshcagoxdirdmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013425.2449727-404-44897683977005/AnsiballZ_dnf.py
Dec 06 09:30:25 np0005548788.localdomain sudo[129832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:25 np0005548788.localdomain python3.9[129834]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:30:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40392 DF PROTO=TCP SPT=34516 DPT=9102 SEQ=4225248233 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD71300000000001030307) 
Dec 06 09:30:27 np0005548788.localdomain sshd[129218]: Received disconnect from 45.78.219.195 port 46064:11: Bye Bye [preauth]
Dec 06 09:30:27 np0005548788.localdomain sshd[129218]: Disconnected from authenticating user root 45.78.219.195 port 46064 [preauth]
Dec 06 09:30:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38471 DF PROTO=TCP SPT=51548 DPT=9100 SEQ=1719186519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD79F10000000001030307) 
Dec 06 09:30:28 np0005548788.localdomain sudo[129832]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:29 np0005548788.localdomain sudo[129926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyxhxtspiqjbktqhroxmifszginpecfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013429.1298354-428-36800674174032/AnsiballZ_systemd.py
Dec 06 09:30:29 np0005548788.localdomain sudo[129926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:29 np0005548788.localdomain python3.9[129928]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:30:30 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:30:30 np0005548788.localdomain systemd-sysv-generator[129960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:30:30 np0005548788.localdomain systemd-rc-local-generator[129955]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:30:30 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:30:31 np0005548788.localdomain sudo[129926]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:31 np0005548788.localdomain sudo[129969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:30:31 np0005548788.localdomain sudo[129969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:30:31 np0005548788.localdomain sudo[129969]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:31 np0005548788.localdomain sudo[129992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:30:31 np0005548788.localdomain sudo[129992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:30:31 np0005548788.localdomain sudo[130102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvsaxanpwjsnjtzvrxxtrvipeltdraiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013431.5949106-458-24841185599262/AnsiballZ_stat.py
Dec 06 09:30:31 np0005548788.localdomain sudo[130102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:32 np0005548788.localdomain python3.9[130107]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:30:32 np0005548788.localdomain sudo[129992]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:32 np0005548788.localdomain sudo[130102]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:32 np0005548788.localdomain sudo[130211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdlmgoubrmqqfvrgelvjhimhbxkbeuqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013432.3033497-485-84861401266887/AnsiballZ_ini_file.py
Dec 06 09:30:32 np0005548788.localdomain sudo[130211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:32 np0005548788.localdomain python3.9[130213]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:32 np0005548788.localdomain sudo[130211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:33 np0005548788.localdomain sudo[130305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxbgvugqvuppohtdmljboaoeqizuubkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013433.09429-509-208488915240205/AnsiballZ_ini_file.py
Dec 06 09:30:33 np0005548788.localdomain sudo[130305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:33 np0005548788.localdomain python3.9[130307]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:33 np0005548788.localdomain sudo[130305]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:34 np0005548788.localdomain sudo[130397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elkwbunbeqmmnvnbxzwrnttkglnmlzzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013433.777366-533-80786395825358/AnsiballZ_ini_file.py
Dec 06 09:30:34 np0005548788.localdomain sudo[130397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:34 np0005548788.localdomain python3.9[130399]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:34 np0005548788.localdomain sudo[130397]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51396 DF PROTO=TCP SPT=53352 DPT=9101 SEQ=3005384827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD917E0000000001030307) 
Dec 06 09:30:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51102 DF PROTO=TCP SPT=37978 DPT=9882 SEQ=4150725745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD91F00000000001030307) 
Dec 06 09:30:35 np0005548788.localdomain sudo[130489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufldkmvorrgdbjsskftqzlewukifffna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013434.7397647-563-257626536090314/AnsiballZ_stat.py
Dec 06 09:30:35 np0005548788.localdomain sudo[130489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:35 np0005548788.localdomain auditd[728]: Audit daemon rotating log files
Dec 06 09:30:35 np0005548788.localdomain sudo[130492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:30:35 np0005548788.localdomain sudo[130492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:30:35 np0005548788.localdomain sudo[130492]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:35 np0005548788.localdomain python3.9[130491]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:30:35 np0005548788.localdomain sudo[130489]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:35 np0005548788.localdomain sudo[130577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyzbtqtckcwinebkqwzglgzxgfcfstio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013434.7397647-563-257626536090314/AnsiballZ_copy.py
Dec 06 09:30:35 np0005548788.localdomain sudo[130577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:35 np0005548788.localdomain python3.9[130579]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013434.7397647-563-257626536090314/.source _original_basename=.a8po9m0z follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:35 np0005548788.localdomain sudo[130577]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:36 np0005548788.localdomain sudo[130669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmvttjlzmcafynsqcftomdrhvqonfhbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013436.031263-608-226123986618584/AnsiballZ_file.py
Dec 06 09:30:36 np0005548788.localdomain sudo[130669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:36 np0005548788.localdomain python3.9[130671]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:36 np0005548788.localdomain sudo[130669]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:37 np0005548788.localdomain sudo[130761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpcixezwwskuuyedjdfhrtbnhyxoffcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013436.7161732-632-250462666528835/AnsiballZ_edpm_os_net_config_mappings.py
Dec 06 09:30:37 np0005548788.localdomain sudo[130761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:37 np0005548788.localdomain python3.9[130763]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 06 09:30:37 np0005548788.localdomain sudo[130761]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:37 np0005548788.localdomain sudo[130853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njivujumilsljeiqcyejzrmhxyvhbnkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013437.5838-659-228903658310632/AnsiballZ_file.py
Dec 06 09:30:37 np0005548788.localdomain sudo[130853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51398 DF PROTO=TCP SPT=53352 DPT=9101 SEQ=3005384827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DD9D700000000001030307) 
Dec 06 09:30:38 np0005548788.localdomain python3.9[130855]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:38 np0005548788.localdomain sudo[130853]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:38 np0005548788.localdomain sudo[130945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-malgblywbbbiorzkldmhbrsemiuyvyuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013438.4503927-689-273340686907847/AnsiballZ_stat.py
Dec 06 09:30:38 np0005548788.localdomain sudo[130945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:38 np0005548788.localdomain python3.9[130947]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:30:38 np0005548788.localdomain sudo[130945]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:39 np0005548788.localdomain sudo[131018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilkxzknllixrmftzkxvnmpbpzjgavqfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013438.4503927-689-273340686907847/AnsiballZ_copy.py
Dec 06 09:30:39 np0005548788.localdomain sudo[131018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:39 np0005548788.localdomain python3.9[131020]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013438.4503927-689-273340686907847/.source.yaml _original_basename=.006fcimo follow=False checksum=0cadac3cfc033a4e07cfac59b43f6459e787700a force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:39 np0005548788.localdomain sudo[131018]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:40 np0005548788.localdomain sudo[131110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tddmkmmwyhiimxwdhjqqxgluequqtikj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013439.6781988-734-26802006590267/AnsiballZ_slurp.py
Dec 06 09:30:40 np0005548788.localdomain sudo[131110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:40 np0005548788.localdomain python3.9[131112]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 06 09:30:40 np0005548788.localdomain sudo[131110]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26474 DF PROTO=TCP SPT=37780 DPT=9105 SEQ=2826903978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DDA7F00000000001030307) 
Dec 06 09:30:41 np0005548788.localdomain sudo[131215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-befjyswejymyizgytuszrnqputhawgls ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013441.134934-761-13486224255030/async_wrapper.py j664754382095 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013441.134934-761-13486224255030/AnsiballZ_edpm_os_net_config.py _
Dec 06 09:30:41 np0005548788.localdomain sudo[131215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:42 np0005548788.localdomain ansible-async_wrapper.py[131217]: Invoked with j664754382095 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013441.134934-761-13486224255030/AnsiballZ_edpm_os_net_config.py _
Dec 06 09:30:42 np0005548788.localdomain ansible-async_wrapper.py[131220]: Starting module and watcher
Dec 06 09:30:42 np0005548788.localdomain ansible-async_wrapper.py[131220]: Start watching 131221 (300)
Dec 06 09:30:42 np0005548788.localdomain ansible-async_wrapper.py[131221]: Start module (131221)
Dec 06 09:30:42 np0005548788.localdomain ansible-async_wrapper.py[131217]: Return async_wrapper task started.
Dec 06 09:30:42 np0005548788.localdomain sudo[131215]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:42 np0005548788.localdomain python3.9[131222]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Dec 06 09:30:42 np0005548788.localdomain ansible-async_wrapper.py[131221]: Module complete (131221)
Dec 06 09:30:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45964 DF PROTO=TCP SPT=51824 DPT=9100 SEQ=2921042162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DDB39F0000000001030307) 
Dec 06 09:30:45 np0005548788.localdomain sudo[131312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzwujtwruyrusdqzwdhwdnkonhakbdbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013445.226033-761-6084576269566/AnsiballZ_async_status.py
Dec 06 09:30:45 np0005548788.localdomain sudo[131312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:45 np0005548788.localdomain python3.9[131314]: ansible-ansible.legacy.async_status Invoked with jid=j664754382095.131217 mode=status _async_dir=/root/.ansible_async
Dec 06 09:30:45 np0005548788.localdomain sudo[131312]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:46 np0005548788.localdomain sudo[131371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeoqfexwarqylsadztcslfuhhyoswthd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013445.226033-761-6084576269566/AnsiballZ_async_status.py
Dec 06 09:30:46 np0005548788.localdomain sudo[131371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:46 np0005548788.localdomain python3.9[131373]: ansible-ansible.legacy.async_status Invoked with jid=j664754382095.131217 mode=cleanup _async_dir=/root/.ansible_async
Dec 06 09:30:46 np0005548788.localdomain sudo[131371]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45966 DF PROTO=TCP SPT=51824 DPT=9100 SEQ=2921042162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DDBFB00000000001030307) 
Dec 06 09:30:46 np0005548788.localdomain sudo[131463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubajhkujmhdrchodsujthlglfcdsasie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013446.5461526-827-85089278662196/AnsiballZ_stat.py
Dec 06 09:30:46 np0005548788.localdomain sudo[131463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:47 np0005548788.localdomain python3.9[131465]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:30:47 np0005548788.localdomain sudo[131463]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:47 np0005548788.localdomain ansible-async_wrapper.py[131220]: Done in kid B.
Dec 06 09:30:47 np0005548788.localdomain sudo[131536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-homcldhzwhobiqjrexoltwidxisxczzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013446.5461526-827-85089278662196/AnsiballZ_copy.py
Dec 06 09:30:47 np0005548788.localdomain sudo[131536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:47 np0005548788.localdomain python3.9[131538]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013446.5461526-827-85089278662196/.source.returncode _original_basename=.o48uznh0 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:47 np0005548788.localdomain sudo[131536]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:48 np0005548788.localdomain sudo[131628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytgnetqplgygusoakgijtuqeidwurbjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013447.7842007-875-63657766844501/AnsiballZ_stat.py
Dec 06 09:30:48 np0005548788.localdomain sudo[131628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:48 np0005548788.localdomain python3.9[131630]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:30:48 np0005548788.localdomain sudo[131628]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:48 np0005548788.localdomain sudo[131701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbhdtjrawqmqtpiiutsnfpmgyafzhigm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013447.7842007-875-63657766844501/AnsiballZ_copy.py
Dec 06 09:30:48 np0005548788.localdomain sudo[131701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:48 np0005548788.localdomain python3.9[131703]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013447.7842007-875-63657766844501/.source.cfg _original_basename=.ptq0yyeg follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:48 np0005548788.localdomain sudo[131701]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:49 np0005548788.localdomain sudo[131793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qagnmoozcqunrkbdndzjdsqlxmcsiwhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013448.90944-920-129397357853784/AnsiballZ_systemd.py
Dec 06 09:30:49 np0005548788.localdomain sudo[131793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:49 np0005548788.localdomain python3.9[131795]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:30:49 np0005548788.localdomain systemd[1]: Reloading Network Manager...
Dec 06 09:30:49 np0005548788.localdomain NetworkManager[5968]: <info>  [1765013449.5382] audit: op="reload" arg="0" pid=131799 uid=0 result="success"
Dec 06 09:30:49 np0005548788.localdomain NetworkManager[5968]: <info>  [1765013449.5391] config: signal: SIGHUP (no changes from disk)
Dec 06 09:30:49 np0005548788.localdomain systemd[1]: Reloaded Network Manager.
Dec 06 09:30:49 np0005548788.localdomain sudo[131793]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46979 DF PROTO=TCP SPT=48710 DPT=9882 SEQ=2060256118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DDCB4A0000000001030307) 
Dec 06 09:30:49 np0005548788.localdomain sshd[127489]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:30:49 np0005548788.localdomain systemd-logind[765]: Session 41 logged out. Waiting for processes to exit.
Dec 06 09:30:49 np0005548788.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Dec 06 09:30:49 np0005548788.localdomain systemd[1]: session-41.scope: Consumed 36.201s CPU time.
Dec 06 09:30:49 np0005548788.localdomain systemd-logind[765]: Removed session 41.
Dec 06 09:30:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59019 DF PROTO=TCP SPT=36516 DPT=9102 SEQ=1096637344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DDD6B00000000001030307) 
Dec 06 09:30:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59020 DF PROTO=TCP SPT=36516 DPT=9102 SEQ=1096637344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DDE6700000000001030307) 
Dec 06 09:30:56 np0005548788.localdomain sshd[131814]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:56 np0005548788.localdomain sshd[131814]: Accepted publickey for zuul from 192.168.122.30 port 40394 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:30:56 np0005548788.localdomain systemd-logind[765]: New session 42 of user zuul.
Dec 06 09:30:56 np0005548788.localdomain systemd[1]: Started Session 42 of User zuul.
Dec 06 09:30:56 np0005548788.localdomain sshd[131814]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:30:57 np0005548788.localdomain python3.9[131907]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:30:58 np0005548788.localdomain python3.9[132001]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:30:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45968 DF PROTO=TCP SPT=51824 DPT=9100 SEQ=2921042162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DDEFF10000000001030307) 
Dec 06 09:31:01 np0005548788.localdomain python3.9[132146]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:01 np0005548788.localdomain sshd[131814]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:31:01 np0005548788.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Dec 06 09:31:01 np0005548788.localdomain systemd[1]: session-42.scope: Consumed 2.190s CPU time.
Dec 06 09:31:01 np0005548788.localdomain systemd-logind[765]: Session 42 logged out. Waiting for processes to exit.
Dec 06 09:31:01 np0005548788.localdomain systemd-logind[765]: Removed session 42.
Dec 06 09:31:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59021 DF PROTO=TCP SPT=36516 DPT=9102 SEQ=1096637344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE05F10000000001030307) 
Dec 06 09:31:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24371 DF PROTO=TCP SPT=57412 DPT=9101 SEQ=3484364532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE06AE0000000001030307) 
Dec 06 09:31:06 np0005548788.localdomain sshd[132162]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:07 np0005548788.localdomain sshd[132162]: Accepted publickey for zuul from 192.168.122.30 port 39874 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:31:07 np0005548788.localdomain systemd-logind[765]: New session 43 of user zuul.
Dec 06 09:31:07 np0005548788.localdomain systemd[1]: Started Session 43 of User zuul.
Dec 06 09:31:07 np0005548788.localdomain sshd[132162]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:31:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24373 DF PROTO=TCP SPT=57412 DPT=9101 SEQ=3484364532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE12B00000000001030307) 
Dec 06 09:31:08 np0005548788.localdomain python3.9[132255]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:31:09 np0005548788.localdomain python3.9[132349]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:31:10 np0005548788.localdomain sudo[132443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfmoookqzpzsyilahbthgicxpoxvckmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013470.1780717-80-51677205718870/AnsiballZ_setup.py
Dec 06 09:31:10 np0005548788.localdomain sudo[132443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:10 np0005548788.localdomain python3.9[132445]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:31:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43079 DF PROTO=TCP SPT=41526 DPT=9105 SEQ=380980603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE1DF00000000001030307) 
Dec 06 09:31:11 np0005548788.localdomain sudo[132443]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:11 np0005548788.localdomain sudo[132497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hocnhzlbvnvtkxwbvaikbiimgkysczrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013470.1780717-80-51677205718870/AnsiballZ_dnf.py
Dec 06 09:31:11 np0005548788.localdomain sudo[132497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:11 np0005548788.localdomain python3.9[132499]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:31:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36468 DF PROTO=TCP SPT=37810 DPT=9100 SEQ=2013405376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE28D00000000001030307) 
Dec 06 09:31:14 np0005548788.localdomain sudo[132497]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:15 np0005548788.localdomain sudo[132591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvjonnqjysqopjpwrczoxkuifnlkozmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013474.8867292-116-213563815350390/AnsiballZ_setup.py
Dec 06 09:31:15 np0005548788.localdomain sudo[132591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:15 np0005548788.localdomain python3.9[132593]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:31:15 np0005548788.localdomain sudo[132591]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36470 DF PROTO=TCP SPT=37810 DPT=9100 SEQ=2013405376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE34F00000000001030307) 
Dec 06 09:31:17 np0005548788.localdomain sudo[132738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwjoufnchmbmicxkbzddjijqqajpshdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013477.0256367-149-248707667192130/AnsiballZ_file.py
Dec 06 09:31:17 np0005548788.localdomain sudo[132738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:17 np0005548788.localdomain python3.9[132740]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:17 np0005548788.localdomain sudo[132738]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:18 np0005548788.localdomain sudo[132830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avouxzyelcfuuwfzashgcltmauvrdelr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013477.8352218-173-113851956477353/AnsiballZ_command.py
Dec 06 09:31:18 np0005548788.localdomain sudo[132830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:18 np0005548788.localdomain python3.9[132832]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:18 np0005548788.localdomain sudo[132830]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:19 np0005548788.localdomain sudo[132934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbgmrhnfhlvsfuayrnnbwgaxygsdvnsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013478.6912658-197-187668768942193/AnsiballZ_stat.py
Dec 06 09:31:19 np0005548788.localdomain sudo[132934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:19 np0005548788.localdomain python3.9[132936]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:19 np0005548788.localdomain sudo[132934]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:19 np0005548788.localdomain sudo[132982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rviltyvgmsaokppwpskqdpinprrnozmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013478.6912658-197-187668768942193/AnsiballZ_file.py
Dec 06 09:31:19 np0005548788.localdomain sudo[132982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58705 DF PROTO=TCP SPT=48946 DPT=9882 SEQ=4076028861 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE407B0000000001030307) 
Dec 06 09:31:19 np0005548788.localdomain python3.9[132984]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:19 np0005548788.localdomain sudo[132982]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:20 np0005548788.localdomain sudo[133074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqmuvhjmgulkfsgntogwgdgienmbsotk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013479.9412751-233-102984064299991/AnsiballZ_stat.py
Dec 06 09:31:20 np0005548788.localdomain sudo[133074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:20 np0005548788.localdomain python3.9[133076]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:20 np0005548788.localdomain sudo[133074]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:20 np0005548788.localdomain sudo[133122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjgavqnwexsfcavsscizwhpoxmoxfqxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013479.9412751-233-102984064299991/AnsiballZ_file.py
Dec 06 09:31:20 np0005548788.localdomain sudo[133122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:20 np0005548788.localdomain python3.9[133124]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:20 np0005548788.localdomain sudo[133122]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:21 np0005548788.localdomain sudo[133214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvnfxmdajhrgfpnrmmfyotoibpaswbok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013481.1046023-272-64451194904733/AnsiballZ_ini_file.py
Dec 06 09:31:21 np0005548788.localdomain sudo[133214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:21 np0005548788.localdomain python3.9[133216]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:21 np0005548788.localdomain sudo[133214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:22 np0005548788.localdomain sudo[133306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbtninuozcgavaunvltkbtelyffkpece ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013481.8240335-272-235109472861904/AnsiballZ_ini_file.py
Dec 06 09:31:22 np0005548788.localdomain sudo[133306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:22 np0005548788.localdomain python3.9[133308]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:22 np0005548788.localdomain sudo[133306]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60649 DF PROTO=TCP SPT=42696 DPT=9102 SEQ=2223048179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE4BF00000000001030307) 
Dec 06 09:31:22 np0005548788.localdomain sudo[133398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxojfjlgiotzuqjoyarmjpgvlfvluhhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013482.4221396-272-232956921339913/AnsiballZ_ini_file.py
Dec 06 09:31:22 np0005548788.localdomain sudo[133398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:22 np0005548788.localdomain python3.9[133400]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:22 np0005548788.localdomain sudo[133398]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:23 np0005548788.localdomain sudo[133490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiywvrqfjnzfsvxhntauxcczfiaxdhcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013483.0395546-272-8238277711347/AnsiballZ_ini_file.py
Dec 06 09:31:23 np0005548788.localdomain sudo[133490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:23 np0005548788.localdomain python3.9[133492]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:23 np0005548788.localdomain sudo[133490]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:24 np0005548788.localdomain sudo[133582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwuvfqiqmtootbaxdtwafbwqlyfujeek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013483.7574391-365-4095072462545/AnsiballZ_dnf.py
Dec 06 09:31:24 np0005548788.localdomain sudo[133582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:24 np0005548788.localdomain python3.9[133584]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:31:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60650 DF PROTO=TCP SPT=42696 DPT=9102 SEQ=2223048179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE5BB10000000001030307) 
Dec 06 09:31:27 np0005548788.localdomain sudo[133582]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:28 np0005548788.localdomain sudo[133676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuctvqxoqbewylpzerhtrrxphxlapfke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013487.8883018-398-13403080220460/AnsiballZ_setup.py
Dec 06 09:31:28 np0005548788.localdomain sudo[133676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:28 np0005548788.localdomain python3.9[133678]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:31:28 np0005548788.localdomain sudo[133676]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:28 np0005548788.localdomain sudo[133770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icfkbftnoutfyaqiofmdcpsurrzwrkmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013488.6298823-422-205877393534012/AnsiballZ_stat.py
Dec 06 09:31:28 np0005548788.localdomain sudo[133770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:29 np0005548788.localdomain python3.9[133772]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:31:29 np0005548788.localdomain sudo[133770]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36472 DF PROTO=TCP SPT=37810 DPT=9100 SEQ=2013405376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE65F10000000001030307) 
Dec 06 09:31:29 np0005548788.localdomain sudo[133862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgnyjyoupcipxosxdecwhtmhxylchvlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013489.358145-449-262621895716380/AnsiballZ_stat.py
Dec 06 09:31:29 np0005548788.localdomain sudo[133862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:29 np0005548788.localdomain python3.9[133864]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:31:29 np0005548788.localdomain sudo[133862]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:30 np0005548788.localdomain sudo[133954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylznuxviiztkvbobjhwvbpmkalefhiag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013490.1505923-479-153166361364689/AnsiballZ_command.py
Dec 06 09:31:30 np0005548788.localdomain sudo[133954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:30 np0005548788.localdomain python3.9[133956]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:30 np0005548788.localdomain sudo[133954]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:31 np0005548788.localdomain sudo[134047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qigfzwqqdpebjvrjuxkiionnsbgunged ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013490.954656-509-134483827014736/AnsiballZ_service_facts.py
Dec 06 09:31:31 np0005548788.localdomain sudo[134047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:31 np0005548788.localdomain python3.9[134049]: ansible-service_facts Invoked
Dec 06 09:31:31 np0005548788.localdomain network[134066]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:31:31 np0005548788.localdomain network[134067]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:31:31 np0005548788.localdomain network[134068]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:31:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:31:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5661 DF PROTO=TCP SPT=57074 DPT=9101 SEQ=3124196304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE7BDE0000000001030307) 
Dec 06 09:31:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60651 DF PROTO=TCP SPT=42696 DPT=9102 SEQ=2223048179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE7BF00000000001030307) 
Dec 06 09:31:35 np0005548788.localdomain sudo[134175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:31:35 np0005548788.localdomain sudo[134175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:35 np0005548788.localdomain sudo[134175]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:35 np0005548788.localdomain sudo[134190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:31:35 np0005548788.localdomain sudo[134190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:36 np0005548788.localdomain sudo[134190]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:36 np0005548788.localdomain sudo[134047]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:36 np0005548788.localdomain sudo[134240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:31:36 np0005548788.localdomain sudo[134240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:36 np0005548788.localdomain sudo[134240]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:36 np0005548788.localdomain sudo[134262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 09:31:36 np0005548788.localdomain sudo[134262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:37 np0005548788.localdomain podman[134324]: 
Dec 06 09:31:37 np0005548788.localdomain podman[134324]: 2025-12-06 09:31:37.081171841 +0000 UTC m=+0.081669616 container create 182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_raman, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, distribution-scope=public, GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, architecture=x86_64)
Dec 06 09:31:37 np0005548788.localdomain systemd[1]: Started libpod-conmon-182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183.scope.
Dec 06 09:31:37 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:31:37 np0005548788.localdomain podman[134324]: 2025-12-06 09:31:37.047100364 +0000 UTC m=+0.047598179 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 09:31:37 np0005548788.localdomain podman[134324]: 2025-12-06 09:31:37.156620323 +0000 UTC m=+0.157118098 container init 182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_raman, RELEASE=main, ceph=True, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1763362218, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:31:37 np0005548788.localdomain podman[134324]: 2025-12-06 09:31:37.168590695 +0000 UTC m=+0.169088480 container start 182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_raman, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.41.4, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 09:31:37 np0005548788.localdomain podman[134324]: 2025-12-06 09:31:37.168858493 +0000 UTC m=+0.169356268 container attach 182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_raman, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, version=7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 09:31:37 np0005548788.localdomain elegant_raman[134339]: 167 167
Dec 06 09:31:37 np0005548788.localdomain systemd[1]: libpod-182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183.scope: Deactivated successfully.
Dec 06 09:31:37 np0005548788.localdomain podman[134324]: 2025-12-06 09:31:37.171756554 +0000 UTC m=+0.172254349 container died 182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_raman, io.buildah.version=1.41.4, release=1763362218, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:31:37 np0005548788.localdomain podman[134346]: 2025-12-06 09:31:37.284523354 +0000 UTC m=+0.103206835 container remove 182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_raman, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 09:31:37 np0005548788.localdomain systemd[1]: libpod-conmon-182e64c5c3b382eaa0c194cd8ce69bdb999acb7ae5c7a04eb3f0eaed9c283183.scope: Deactivated successfully.
Dec 06 09:31:37 np0005548788.localdomain podman[134413]: 
Dec 06 09:31:37 np0005548788.localdomain podman[134413]: 2025-12-06 09:31:37.46669583 +0000 UTC m=+0.057419614 container create 0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_chandrasekhar, name=rhceph, release=1763362218, GIT_CLEAN=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, com.redhat.component=rhceph-container)
Dec 06 09:31:37 np0005548788.localdomain systemd[1]: Started libpod-conmon-0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b.scope.
Dec 06 09:31:37 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:31:37 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e5849e7ec4f55696c2175a6d31fc57c854f1763ad9fec87dd39d32d62fc707b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:31:37 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e5849e7ec4f55696c2175a6d31fc57c854f1763ad9fec87dd39d32d62fc707b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:31:37 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e5849e7ec4f55696c2175a6d31fc57c854f1763ad9fec87dd39d32d62fc707b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:31:37 np0005548788.localdomain podman[134413]: 2025-12-06 09:31:37.523463942 +0000 UTC m=+0.114187726 container init 0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_chandrasekhar, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, description=Red Hat Ceph Storage 7, release=1763362218)
Dec 06 09:31:37 np0005548788.localdomain podman[134413]: 2025-12-06 09:31:37.534900477 +0000 UTC m=+0.125624261 container start 0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_chandrasekhar, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, name=rhceph)
Dec 06 09:31:37 np0005548788.localdomain podman[134413]: 2025-12-06 09:31:37.535130584 +0000 UTC m=+0.125854368 container attach 0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_chandrasekhar, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, release=1763362218, GIT_CLEAN=True, RELEASE=main, build-date=2025-11-26T19:44:28Z)
Dec 06 09:31:37 np0005548788.localdomain podman[134413]: 2025-12-06 09:31:37.445795001 +0000 UTC m=+0.036518835 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 09:31:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5663 DF PROTO=TCP SPT=57074 DPT=9101 SEQ=3124196304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE87F00000000001030307) 
Dec 06 09:31:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-0df4c46cff8b8d305b2d05d8967ca153130a32645c6457ce1e63b790210acb00-merged.mount: Deactivated successfully.
Dec 06 09:31:38 np0005548788.localdomain sudo[134863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibuqjitecmhqmwkvkhmeskboiharipsg ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1765013497.2886472-554-111937905742616/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1765013497.2886472-554-111937905742616/args
Dec 06 09:31:38 np0005548788.localdomain sudo[134863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:38 np0005548788.localdomain sudo[134863]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]: [
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:     {
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         "available": false,
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         "ceph_device": false,
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         "lsm_data": {},
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         "lvs": [],
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         "path": "/dev/sr0",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         "rejected_reasons": [
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "Has a FileSystem",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "Insufficient space (<5GB)"
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         ],
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         "sys_api": {
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "actuators": null,
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "device_nodes": "sr0",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "human_readable_size": "482.00 KB",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "id_bus": "ata",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "model": "QEMU DVD-ROM",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "nr_requests": "2",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "partitions": {},
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "path": "/dev/sr0",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "removable": "1",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "rev": "2.5+",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "ro": "0",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "rotational": "1",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "sas_address": "",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "sas_device_handle": "",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "scheduler_mode": "mq-deadline",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "sectors": 0,
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "sectorsize": "2048",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "size": 493568.0,
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "support_discard": "0",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "type": "disk",
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:             "vendor": "QEMU"
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:         }
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]:     }
Dec 06 09:31:38 np0005548788.localdomain agitated_chandrasekhar[134455]: ]
Dec 06 09:31:38 np0005548788.localdomain systemd[1]: libpod-0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b.scope: Deactivated successfully.
Dec 06 09:31:38 np0005548788.localdomain podman[135945]: 2025-12-06 09:31:38.48877414 +0000 UTC m=+0.040363895 container died 0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_chandrasekhar, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 06 09:31:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-1e5849e7ec4f55696c2175a6d31fc57c854f1763ad9fec87dd39d32d62fc707b-merged.mount: Deactivated successfully.
Dec 06 09:31:38 np0005548788.localdomain podman[135945]: 2025-12-06 09:31:38.663759092 +0000 UTC m=+0.215348817 container remove 0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_chandrasekhar, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, version=7, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 09:31:38 np0005548788.localdomain systemd[1]: libpod-conmon-0c3dbe235a3ba56cbeafe5bd3507bc175e90710b456bfc8fb357a0c0d56c7c9b.scope: Deactivated successfully.
Dec 06 09:31:38 np0005548788.localdomain sudo[134262]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:38 np0005548788.localdomain sudo[136034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqchkrkpxxauclvefpgoxhgiacpwmitj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013498.4998136-587-148102010195926/AnsiballZ_dnf.py
Dec 06 09:31:38 np0005548788.localdomain sudo[136034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:39 np0005548788.localdomain python3.9[136036]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:31:39 np0005548788.localdomain sudo[136038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:31:39 np0005548788.localdomain sudo[136038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:39 np0005548788.localdomain sudo[136038]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34594 DF PROTO=TCP SPT=51934 DPT=9105 SEQ=3948930410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE91F10000000001030307) 
Dec 06 09:31:42 np0005548788.localdomain sudo[136034]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:43 np0005548788.localdomain sudo[136143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwufkpjzqoucwxlespveqlbovwszvvgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013502.72887-626-20242021337591/AnsiballZ_package_facts.py
Dec 06 09:31:43 np0005548788.localdomain sudo[136143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44972 DF PROTO=TCP SPT=43478 DPT=9100 SEQ=3338826032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DE9E1D0000000001030307) 
Dec 06 09:31:43 np0005548788.localdomain python3.9[136145]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 06 09:31:43 np0005548788.localdomain sudo[136143]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:44 np0005548788.localdomain sudo[136235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrlzvteqzjneulxmpwrudsenbrytakhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013504.6260576-656-266177864881621/AnsiballZ_stat.py
Dec 06 09:31:44 np0005548788.localdomain sudo[136235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:45 np0005548788.localdomain python3.9[136237]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:45 np0005548788.localdomain sudo[136235]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:45 np0005548788.localdomain sudo[136310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rckgzgxskdlvlqvefsrpypnzwguunwov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013504.6260576-656-266177864881621/AnsiballZ_copy.py
Dec 06 09:31:45 np0005548788.localdomain sudo[136310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:45 np0005548788.localdomain python3.9[136312]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013504.6260576-656-266177864881621/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:45 np0005548788.localdomain sudo[136310]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:46 np0005548788.localdomain sudo[136404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwezbupfdwmzxqtzzbrxezbuzfudrwao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013506.1024241-701-245004766461891/AnsiballZ_stat.py
Dec 06 09:31:46 np0005548788.localdomain sudo[136404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:46 np0005548788.localdomain python3.9[136406]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:46 np0005548788.localdomain sudo[136404]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44974 DF PROTO=TCP SPT=43478 DPT=9100 SEQ=3338826032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DEAA300000000001030307) 
Dec 06 09:31:46 np0005548788.localdomain sudo[136479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvcdkohvqnfnwpzqxgwbxicqiwndjxjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013506.1024241-701-245004766461891/AnsiballZ_copy.py
Dec 06 09:31:46 np0005548788.localdomain sudo[136479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:47 np0005548788.localdomain python3.9[136481]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013506.1024241-701-245004766461891/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:47 np0005548788.localdomain sudo[136479]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:47 np0005548788.localdomain sshd[136498]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:47 np0005548788.localdomain sshd[136498]: Received disconnect from 148.227.3.232 port 47124:11: Bye Bye [preauth]
Dec 06 09:31:47 np0005548788.localdomain sshd[136498]: Disconnected from authenticating user root 148.227.3.232 port 47124 [preauth]
Dec 06 09:31:48 np0005548788.localdomain sudo[136575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfxgioglnvmdbcndrysbeadizggpfvqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013508.1108696-764-63576108568229/AnsiballZ_lineinfile.py
Dec 06 09:31:48 np0005548788.localdomain sudo[136575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:48 np0005548788.localdomain python3.9[136577]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:48 np0005548788.localdomain sudo[136575]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30170 DF PROTO=TCP SPT=57324 DPT=9882 SEQ=2423094356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DEB5AA0000000001030307) 
Dec 06 09:31:50 np0005548788.localdomain sudo[136669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eutodrpxcdltotpdmiykyozcllsddjik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013509.8248024-809-23322899763385/AnsiballZ_setup.py
Dec 06 09:31:50 np0005548788.localdomain sudo[136669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:50 np0005548788.localdomain python3.9[136671]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:31:50 np0005548788.localdomain sudo[136669]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:51 np0005548788.localdomain sudo[136723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbobhsihufobljneawrqxcdxxtlaoybj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013509.8248024-809-23322899763385/AnsiballZ_systemd.py
Dec 06 09:31:51 np0005548788.localdomain sudo[136723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:52 np0005548788.localdomain python3.9[136725]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:31:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18779 DF PROTO=TCP SPT=44574 DPT=9102 SEQ=2740015423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DEC0F00000000001030307) 
Dec 06 09:31:53 np0005548788.localdomain sudo[136723]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:54 np0005548788.localdomain sudo[136817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agqfenqacsvpncheipqhwbwhvabgadbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013513.9773865-857-205701353198681/AnsiballZ_setup.py
Dec 06 09:31:54 np0005548788.localdomain sudo[136817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:54 np0005548788.localdomain python3.9[136819]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:31:54 np0005548788.localdomain sudo[136817]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:55 np0005548788.localdomain sudo[136872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxkovoyfcpqrlhpqelwubzyctkcvihpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013513.9773865-857-205701353198681/AnsiballZ_systemd.py
Dec 06 09:31:55 np0005548788.localdomain sudo[136872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:55 np0005548788.localdomain python3.9[136874]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:31:55 np0005548788.localdomain chronyd[25948]: chronyd exiting
Dec 06 09:31:55 np0005548788.localdomain systemd[1]: Stopping NTP client/server...
Dec 06 09:31:55 np0005548788.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 06 09:31:55 np0005548788.localdomain systemd[1]: Stopped NTP client/server.
Dec 06 09:31:55 np0005548788.localdomain systemd[1]: Starting NTP client/server...
Dec 06 09:31:55 np0005548788.localdomain chronyd[136882]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 09:31:55 np0005548788.localdomain chronyd[136882]: Frequency -30.604 +/- 0.373 ppm read from /var/lib/chrony/drift
Dec 06 09:31:55 np0005548788.localdomain chronyd[136882]: Loaded seccomp filter (level 2)
Dec 06 09:31:55 np0005548788.localdomain systemd[1]: Started NTP client/server.
Dec 06 09:31:55 np0005548788.localdomain sudo[136872]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:56 np0005548788.localdomain sshd[132162]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:31:56 np0005548788.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Dec 06 09:31:56 np0005548788.localdomain systemd[1]: session-43.scope: Consumed 28.473s CPU time.
Dec 06 09:31:56 np0005548788.localdomain systemd-logind[765]: Session 43 logged out. Waiting for processes to exit.
Dec 06 09:31:56 np0005548788.localdomain systemd-logind[765]: Removed session 43.
Dec 06 09:31:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18780 DF PROTO=TCP SPT=44574 DPT=9102 SEQ=2740015423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DED0B10000000001030307) 
Dec 06 09:31:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44976 DF PROTO=TCP SPT=43478 DPT=9100 SEQ=3338826032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DED9F10000000001030307) 
Dec 06 09:32:01 np0005548788.localdomain sshd[136898]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:01 np0005548788.localdomain sshd[136898]: Accepted publickey for zuul from 192.168.122.30 port 59958 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:32:01 np0005548788.localdomain systemd-logind[765]: New session 44 of user zuul.
Dec 06 09:32:02 np0005548788.localdomain systemd[1]: Started Session 44 of User zuul.
Dec 06 09:32:02 np0005548788.localdomain sshd[136898]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:32:03 np0005548788.localdomain python3.9[136991]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:32:04 np0005548788.localdomain sudo[137085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrjdymmxqhlamqavpgflprnbiqnasmbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013523.8803384-59-112416852376213/AnsiballZ_file.py
Dec 06 09:32:04 np0005548788.localdomain sudo[137085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:04 np0005548788.localdomain python3.9[137087]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:04 np0005548788.localdomain sudo[137085]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31598 DF PROTO=TCP SPT=47562 DPT=9101 SEQ=2089812155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DEF10E0000000001030307) 
Dec 06 09:32:05 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18781 DF PROTO=TCP SPT=44574 DPT=9102 SEQ=2740015423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DEF1F00000000001030307) 
Dec 06 09:32:05 np0005548788.localdomain sudo[137190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arymsewxkvsqiizrncumlasukkncvsem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013524.669825-83-147661594519819/AnsiballZ_stat.py
Dec 06 09:32:05 np0005548788.localdomain sudo[137190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:05 np0005548788.localdomain python3.9[137192]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:05 np0005548788.localdomain sudo[137190]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:05 np0005548788.localdomain sudo[137238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmdokefmeksrisnwmgqtxoerrkodqsni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013524.669825-83-147661594519819/AnsiballZ_file.py
Dec 06 09:32:05 np0005548788.localdomain sudo[137238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:05 np0005548788.localdomain python3.9[137240]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.3l9eeo3r recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:05 np0005548788.localdomain sudo[137238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:06 np0005548788.localdomain sudo[137330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcjqcurwhlyjgwmlndqwaqzxdgvqedup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013526.2340424-143-235831309850428/AnsiballZ_stat.py
Dec 06 09:32:06 np0005548788.localdomain sudo[137330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:06 np0005548788.localdomain python3.9[137332]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:06 np0005548788.localdomain sudo[137330]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:07 np0005548788.localdomain sudo[137405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krigekmkqypelwwbtivymvuevaprjhii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013526.2340424-143-235831309850428/AnsiballZ_copy.py
Dec 06 09:32:07 np0005548788.localdomain sudo[137405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:07 np0005548788.localdomain python3.9[137407]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013526.2340424-143-235831309850428/.source _original_basename=.shcbcg7b follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:07 np0005548788.localdomain sudo[137405]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:07 np0005548788.localdomain sudo[137497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-labkqweqbkezbzqlxkprnsahhmgvdslw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013527.61295-191-238587589682145/AnsiballZ_file.py
Dec 06 09:32:07 np0005548788.localdomain sudo[137497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31600 DF PROTO=TCP SPT=47562 DPT=9101 SEQ=2089812155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DEFD300000000001030307) 
Dec 06 09:32:08 np0005548788.localdomain python3.9[137499]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:32:08 np0005548788.localdomain sudo[137497]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:08 np0005548788.localdomain sudo[137589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghlguwpmruufajcdxbrkupcrobskommt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013528.231363-215-245002338677770/AnsiballZ_stat.py
Dec 06 09:32:08 np0005548788.localdomain sudo[137589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:08 np0005548788.localdomain python3.9[137591]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:08 np0005548788.localdomain sudo[137589]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:08 np0005548788.localdomain sudo[137662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugivevezpkwzidescltodeldnkfpntwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013528.231363-215-245002338677770/AnsiballZ_copy.py
Dec 06 09:32:08 np0005548788.localdomain sudo[137662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:09 np0005548788.localdomain python3.9[137664]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013528.231363-215-245002338677770/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:32:09 np0005548788.localdomain sudo[137662]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:09 np0005548788.localdomain sudo[137754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsuplssrwzxskrbcpcynpxdvnatdxugp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013529.291118-215-203168346098564/AnsiballZ_stat.py
Dec 06 09:32:09 np0005548788.localdomain sudo[137754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:09 np0005548788.localdomain python3.9[137756]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:09 np0005548788.localdomain sudo[137754]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:10 np0005548788.localdomain sudo[137827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paipjppejpzpusdqcjmiuczgrxxqvyeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013529.291118-215-203168346098564/AnsiballZ_copy.py
Dec 06 09:32:10 np0005548788.localdomain sudo[137827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:10 np0005548788.localdomain python3.9[137829]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013529.291118-215-203168346098564/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:32:10 np0005548788.localdomain sudo[137827]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59163 DF PROTO=TCP SPT=32980 DPT=9105 SEQ=1700496767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF07F00000000001030307) 
Dec 06 09:32:10 np0005548788.localdomain sudo[137919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksszvqirnlslhfoonlbxfclrbufvnlpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013530.4553595-302-28346015522965/AnsiballZ_file.py
Dec 06 09:32:10 np0005548788.localdomain sudo[137919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:10 np0005548788.localdomain python3.9[137921]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:10 np0005548788.localdomain sudo[137919]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:11 np0005548788.localdomain sudo[138011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yshocpjpbrvcdhzwpdgwjhpsdqhbtkfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013531.1004581-326-235268617746420/AnsiballZ_stat.py
Dec 06 09:32:11 np0005548788.localdomain sudo[138011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:12 np0005548788.localdomain python3.9[138013]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:12 np0005548788.localdomain sudo[138011]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:12 np0005548788.localdomain sudo[138084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlaxpzenelomnaenzlngvpuyihdzfscr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013531.1004581-326-235268617746420/AnsiballZ_copy.py
Dec 06 09:32:12 np0005548788.localdomain sudo[138084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:12 np0005548788.localdomain python3.9[138086]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013531.1004581-326-235268617746420/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:12 np0005548788.localdomain sudo[138084]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:13 np0005548788.localdomain sudo[138176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jarmiqimozuwjbethxfqxmnnwlvbnrzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013532.8000886-371-172981305211146/AnsiballZ_stat.py
Dec 06 09:32:13 np0005548788.localdomain sudo[138176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:13 np0005548788.localdomain python3.9[138178]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:13 np0005548788.localdomain sudo[138176]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26442 DF PROTO=TCP SPT=51942 DPT=9100 SEQ=1800593309 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF132F0000000001030307) 
Dec 06 09:32:14 np0005548788.localdomain sudo[138249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwucvjybyrxvaxhqvfszsiponmqzrjxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013532.8000886-371-172981305211146/AnsiballZ_copy.py
Dec 06 09:32:14 np0005548788.localdomain sudo[138249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:14 np0005548788.localdomain python3.9[138251]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013532.8000886-371-172981305211146/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:14 np0005548788.localdomain sudo[138249]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:15 np0005548788.localdomain sudo[138341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szeiaygqdsmchjfophgmrkxzimpphywj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013534.5479367-416-90467694175605/AnsiballZ_systemd.py
Dec 06 09:32:15 np0005548788.localdomain sudo[138341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:15 np0005548788.localdomain python3.9[138343]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:32:15 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:32:15 np0005548788.localdomain systemd-rc-local-generator[138372]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:15 np0005548788.localdomain systemd-sysv-generator[138375]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:32:15 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:32:15 np0005548788.localdomain systemd-rc-local-generator[138410]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:15 np0005548788.localdomain systemd-sysv-generator[138414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:32:16 np0005548788.localdomain systemd[1]: Starting EDPM Container Shutdown...
Dec 06 09:32:16 np0005548788.localdomain systemd[1]: Finished EDPM Container Shutdown.
Dec 06 09:32:16 np0005548788.localdomain sudo[138341]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:16 np0005548788.localdomain sudo[138511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmbuihmimipinxejwgfypwpoihtxokyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013536.2932725-440-229460703278862/AnsiballZ_stat.py
Dec 06 09:32:16 np0005548788.localdomain sudo[138511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26444 DF PROTO=TCP SPT=51942 DPT=9100 SEQ=1800593309 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF1F300000000001030307) 
Dec 06 09:32:16 np0005548788.localdomain python3.9[138513]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:16 np0005548788.localdomain sudo[138511]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:17 np0005548788.localdomain sudo[138584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqfkfjexagoafqkeofzeidtwcwgoaqqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013536.2932725-440-229460703278862/AnsiballZ_copy.py
Dec 06 09:32:17 np0005548788.localdomain sudo[138584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:17 np0005548788.localdomain python3.9[138586]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013536.2932725-440-229460703278862/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:17 np0005548788.localdomain sudo[138584]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:17 np0005548788.localdomain sudo[138676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hylwfhaquspkxvaodjlifabrkgxxybeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013537.5266414-485-201911281807322/AnsiballZ_stat.py
Dec 06 09:32:17 np0005548788.localdomain sudo[138676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:17 np0005548788.localdomain python3.9[138678]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:17 np0005548788.localdomain sudo[138676]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:18 np0005548788.localdomain sudo[138749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maimyncverldmptamnhaounwstqotuyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013537.5266414-485-201911281807322/AnsiballZ_copy.py
Dec 06 09:32:18 np0005548788.localdomain sudo[138749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:18 np0005548788.localdomain python3.9[138751]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013537.5266414-485-201911281807322/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:18 np0005548788.localdomain sudo[138749]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:18 np0005548788.localdomain sudo[138841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afbfbchnvfqylatvhvcuqusqbbmuyapg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013538.6935544-530-50638472274683/AnsiballZ_systemd.py
Dec 06 09:32:18 np0005548788.localdomain sudo[138841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:19 np0005548788.localdomain python3.9[138843]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:32:19 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:32:19 np0005548788.localdomain systemd-rc-local-generator[138869]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:19 np0005548788.localdomain systemd-sysv-generator[138873]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:19 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:32:19 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:32:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22994 DF PROTO=TCP SPT=38310 DPT=9882 SEQ=3046874587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF2ADA0000000001030307) 
Dec 06 09:32:19 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:32:19 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:32:19 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:32:19 np0005548788.localdomain sudo[138841]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:20 np0005548788.localdomain python3.9[138976]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:32:20 np0005548788.localdomain network[138993]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:32:20 np0005548788.localdomain network[138994]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:32:20 np0005548788.localdomain network[138995]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:32:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:32:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3024 DF PROTO=TCP SPT=36376 DPT=9102 SEQ=319518756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF36310000000001030307) 
Dec 06 09:32:24 np0005548788.localdomain sudo[139195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mntzuwtlilxgkvrxleahkjqoreopxyqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013544.2906542-608-176920061878326/AnsiballZ_stat.py
Dec 06 09:32:24 np0005548788.localdomain sudo[139195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:24 np0005548788.localdomain python3.9[139197]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:24 np0005548788.localdomain sudo[139195]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:25 np0005548788.localdomain sudo[139270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rargixpvdehuhhfebjhpwexdiijtywsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013544.2906542-608-176920061878326/AnsiballZ_copy.py
Dec 06 09:32:25 np0005548788.localdomain sudo[139270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:26 np0005548788.localdomain python3.9[139272]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013544.2906542-608-176920061878326/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:26 np0005548788.localdomain sudo[139270]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:26 np0005548788.localdomain sudo[139363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqfsnzvvybegtiufzqtebuhwufbyrodp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013546.2916176-653-62517299241863/AnsiballZ_systemd.py
Dec 06 09:32:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3025 DF PROTO=TCP SPT=36376 DPT=9102 SEQ=319518756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF45F00000000001030307) 
Dec 06 09:32:26 np0005548788.localdomain sudo[139363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:26 np0005548788.localdomain python3.9[139365]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:32:26 np0005548788.localdomain systemd[1]: Reloading OpenSSH server daemon...
Dec 06 09:32:26 np0005548788.localdomain sshd[119013]: Received SIGHUP; restarting.
Dec 06 09:32:26 np0005548788.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Dec 06 09:32:26 np0005548788.localdomain sshd[119013]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:26 np0005548788.localdomain sshd[119013]: Server listening on 0.0.0.0 port 22.
Dec 06 09:32:26 np0005548788.localdomain sshd[119013]: Server listening on :: port 22.
Dec 06 09:32:26 np0005548788.localdomain sudo[139363]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:28 np0005548788.localdomain sudo[139459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omgklbsfkvukukxkrbadtflueqlxfbml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013548.0928173-677-187159693871353/AnsiballZ_file.py
Dec 06 09:32:28 np0005548788.localdomain sudo[139459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:28 np0005548788.localdomain python3.9[139461]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:28 np0005548788.localdomain sudo[139459]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:29 np0005548788.localdomain sudo[139551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmqnrexxkgrxilpqpdqlknytyemsjmwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013548.8080008-701-74696358013253/AnsiballZ_stat.py
Dec 06 09:32:29 np0005548788.localdomain sudo[139551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26446 DF PROTO=TCP SPT=51942 DPT=9100 SEQ=1800593309 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF4FF00000000001030307) 
Dec 06 09:32:29 np0005548788.localdomain python3.9[139553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:29 np0005548788.localdomain sudo[139551]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:29 np0005548788.localdomain sudo[139624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlzqrzbtpwgbelbbjkacsotpxinedwxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013548.8080008-701-74696358013253/AnsiballZ_copy.py
Dec 06 09:32:29 np0005548788.localdomain sudo[139624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:29 np0005548788.localdomain python3.9[139626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013548.8080008-701-74696358013253/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:29 np0005548788.localdomain sudo[139624]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:30 np0005548788.localdomain sudo[139716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsubhmfnmqdhiwxydbhqjzfwyegifcld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013550.3022106-755-7828505390333/AnsiballZ_timezone.py
Dec 06 09:32:30 np0005548788.localdomain sudo[139716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:31 np0005548788.localdomain python3.9[139718]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 06 09:32:31 np0005548788.localdomain systemd[1]: Starting Time & Date Service...
Dec 06 09:32:31 np0005548788.localdomain systemd[1]: Started Time & Date Service.
Dec 06 09:32:31 np0005548788.localdomain sudo[139716]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:31 np0005548788.localdomain sudo[139812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yovqqkhjfcygjwwehruivhaynmjtfhnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013551.6252608-782-106594050232174/AnsiballZ_file.py
Dec 06 09:32:31 np0005548788.localdomain sudo[139812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:32 np0005548788.localdomain python3.9[139814]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:32 np0005548788.localdomain sudo[139812]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:32 np0005548788.localdomain sudo[139904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opotdlawxcyovlqncrakvfgrvaulkcll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013552.3489-806-241016064688797/AnsiballZ_stat.py
Dec 06 09:32:32 np0005548788.localdomain sudo[139904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:32 np0005548788.localdomain python3.9[139906]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:32 np0005548788.localdomain sudo[139904]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:33 np0005548788.localdomain sudo[139977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owgnyqpnnccfjuwdydblifoiinvuwceq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013552.3489-806-241016064688797/AnsiballZ_copy.py
Dec 06 09:32:33 np0005548788.localdomain sudo[139977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:33 np0005548788.localdomain python3.9[139979]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013552.3489-806-241016064688797/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:33 np0005548788.localdomain sudo[139977]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:33 np0005548788.localdomain sudo[140069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrvzwtxezutvrlaghkhjvnrsjvgkzcfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013553.5314429-851-105458603874952/AnsiballZ_stat.py
Dec 06 09:32:33 np0005548788.localdomain sudo[140069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:34 np0005548788.localdomain python3.9[140071]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:34 np0005548788.localdomain sudo[140069]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3026 DF PROTO=TCP SPT=36376 DPT=9102 SEQ=319518756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF65F00000000001030307) 
Dec 06 09:32:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58504 DF PROTO=TCP SPT=51202 DPT=9101 SEQ=139769497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF663E0000000001030307) 
Dec 06 09:32:34 np0005548788.localdomain sudo[140142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caaexsrdqkfymwzjmsxlgibikiepoqys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013553.5314429-851-105458603874952/AnsiballZ_copy.py
Dec 06 09:32:34 np0005548788.localdomain sudo[140142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:35 np0005548788.localdomain python3.9[140144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013553.5314429-851-105458603874952/.source.yaml _original_basename=.c4m2x7w8 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:35 np0005548788.localdomain sudo[140142]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:35 np0005548788.localdomain sudo[140234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkzrglxcnvdhgbywdqtehsceihmejrwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013555.2049994-896-281311703632242/AnsiballZ_stat.py
Dec 06 09:32:35 np0005548788.localdomain sudo[140234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:35 np0005548788.localdomain python3.9[140236]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:35 np0005548788.localdomain sudo[140234]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:36 np0005548788.localdomain sudo[140309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zctvdgxygbcgokrkqoyhztaucfqomohz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013555.2049994-896-281311703632242/AnsiballZ_copy.py
Dec 06 09:32:36 np0005548788.localdomain sudo[140309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:36 np0005548788.localdomain python3.9[140311]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013555.2049994-896-281311703632242/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:36 np0005548788.localdomain sudo[140309]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:37 np0005548788.localdomain sudo[140401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qognrwedhpiuvzltzhalqsyfnwmnlldr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013556.8896706-941-175232506025214/AnsiballZ_command.py
Dec 06 09:32:37 np0005548788.localdomain sudo[140401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:37 np0005548788.localdomain python3.9[140403]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:32:37 np0005548788.localdomain sudo[140401]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58506 DF PROTO=TCP SPT=51202 DPT=9101 SEQ=139769497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF72300000000001030307) 
Dec 06 09:32:38 np0005548788.localdomain sudo[140494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjdfgyrxbzjyrziinkbrxftxmlydoqsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013557.7706082-965-53991639408403/AnsiballZ_command.py
Dec 06 09:32:38 np0005548788.localdomain sudo[140494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:38 np0005548788.localdomain python3.9[140496]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:32:38 np0005548788.localdomain sudo[140494]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:38 np0005548788.localdomain sudo[140587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cemoyveelfcrfhgkyctlzgcyklopoynk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013558.493195-989-241510446714491/AnsiballZ_edpm_nftables_from_files.py
Dec 06 09:32:38 np0005548788.localdomain sudo[140587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:39 np0005548788.localdomain python3[140589]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:32:39 np0005548788.localdomain sudo[140587]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:39 np0005548788.localdomain sudo[140617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:32:39 np0005548788.localdomain sudo[140617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:39 np0005548788.localdomain sudo[140617]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:39 np0005548788.localdomain sudo[140651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:32:39 np0005548788.localdomain sudo[140651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:39 np0005548788.localdomain sudo[140709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvxatwawujzphasdtfisdroczvhupsuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013559.3295834-1013-25706940079721/AnsiballZ_stat.py
Dec 06 09:32:39 np0005548788.localdomain sudo[140709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:39 np0005548788.localdomain python3.9[140711]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:39 np0005548788.localdomain sudo[140709]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:39 np0005548788.localdomain sudo[140651]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:39 np0005548788.localdomain sudo[140747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:32:39 np0005548788.localdomain sudo[140747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:39 np0005548788.localdomain sudo[140747]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:40 np0005548788.localdomain sudo[140783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:32:40 np0005548788.localdomain sudo[140783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:40 np0005548788.localdomain sudo[140834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nctemohjbmlxwysorzskfggeenlhxcln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013559.3295834-1013-25706940079721/AnsiballZ_copy.py
Dec 06 09:32:40 np0005548788.localdomain sudo[140834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:40 np0005548788.localdomain python3.9[140836]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013559.3295834-1013-25706940079721/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:40 np0005548788.localdomain sudo[140834]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:40 np0005548788.localdomain sudo[140783]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18224 DF PROTO=TCP SPT=58034 DPT=9105 SEQ=2634803433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF7DF10000000001030307) 
Dec 06 09:32:40 np0005548788.localdomain sudo[140957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hycfaryhyqoezuogghjoyluujydjxtha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013560.6171317-1058-16806809480959/AnsiballZ_stat.py
Dec 06 09:32:40 np0005548788.localdomain sudo[140957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:41 np0005548788.localdomain python3.9[140959]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:41 np0005548788.localdomain sudo[140960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:32:41 np0005548788.localdomain sudo[140960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:41 np0005548788.localdomain sudo[140960]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:41 np0005548788.localdomain sudo[140957]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:41 np0005548788.localdomain sudo[141045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vungchbreyvpcwitcjvtoykxlltplpfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013560.6171317-1058-16806809480959/AnsiballZ_copy.py
Dec 06 09:32:41 np0005548788.localdomain sudo[141045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:41 np0005548788.localdomain python3.9[141047]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013560.6171317-1058-16806809480959/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:41 np0005548788.localdomain sudo[141045]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:42 np0005548788.localdomain sudo[141137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibxtcfznlmtzserqnyuvwhwpyjhfekdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013561.9300306-1103-82643791809498/AnsiballZ_stat.py
Dec 06 09:32:42 np0005548788.localdomain sudo[141137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:42 np0005548788.localdomain python3.9[141139]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:42 np0005548788.localdomain sudo[141137]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:42 np0005548788.localdomain sshd[141167]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:42 np0005548788.localdomain sudo[141212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzlspfqvucdgrmomkgqnjlaolukesvyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013561.9300306-1103-82643791809498/AnsiballZ_copy.py
Dec 06 09:32:42 np0005548788.localdomain sudo[141212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:42 np0005548788.localdomain python3.9[141214]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013561.9300306-1103-82643791809498/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:42 np0005548788.localdomain sudo[141212]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:43 np0005548788.localdomain sudo[141304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evrnpznteffgxfifwrdiqdlojaxrdhly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013563.136154-1148-109643487509425/AnsiballZ_stat.py
Dec 06 09:32:43 np0005548788.localdomain sudo[141304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60394 DF PROTO=TCP SPT=36034 DPT=9100 SEQ=1156818223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF885F0000000001030307) 
Dec 06 09:32:43 np0005548788.localdomain python3.9[141306]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:43 np0005548788.localdomain sudo[141304]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:43 np0005548788.localdomain sudo[141377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rormfgnwktzfbwhkqxbvnywexarjvcno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013563.136154-1148-109643487509425/AnsiballZ_copy.py
Dec 06 09:32:43 np0005548788.localdomain sudo[141377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:44 np0005548788.localdomain python3.9[141379]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013563.136154-1148-109643487509425/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:44 np0005548788.localdomain sudo[141377]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:44 np0005548788.localdomain sudo[141469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmsbajuvyrtgizxlleyajbwnzdvwgltl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013564.6331167-1193-235778713596223/AnsiballZ_stat.py
Dec 06 09:32:44 np0005548788.localdomain sudo[141469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:45 np0005548788.localdomain python3.9[141471]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:45 np0005548788.localdomain sudo[141469]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:45 np0005548788.localdomain sudo[141542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snzeclvzylhnujkjjubafhljtsvvpcod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013564.6331167-1193-235778713596223/AnsiballZ_copy.py
Dec 06 09:32:45 np0005548788.localdomain sudo[141542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:45 np0005548788.localdomain python3.9[141544]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013564.6331167-1193-235778713596223/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:45 np0005548788.localdomain sudo[141542]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:46 np0005548788.localdomain sudo[141634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bosdpkvexjfuwlsfhkzetucsnesiaxhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013565.9890714-1238-22855374055787/AnsiballZ_file.py
Dec 06 09:32:46 np0005548788.localdomain sudo[141634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:46 np0005548788.localdomain python3.9[141636]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:46 np0005548788.localdomain sudo[141634]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:47 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44978 DF PROTO=TCP SPT=43478 DPT=9100 SEQ=3338826032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DF97F00000000001030307) 
Dec 06 09:32:47 np0005548788.localdomain sudo[141726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnhfgjyqseidaadxtsjnzzayqmemliig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013567.3127747-1262-43566039078174/AnsiballZ_command.py
Dec 06 09:32:47 np0005548788.localdomain sudo[141726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:47 np0005548788.localdomain python3.9[141728]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:32:47 np0005548788.localdomain sudo[141726]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:48 np0005548788.localdomain sudo[141821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slxhnpmzkljuinxfxtewbvbgibhmjqgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013568.0067186-1286-248231498944976/AnsiballZ_blockinfile.py
Dec 06 09:32:48 np0005548788.localdomain sudo[141821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:48 np0005548788.localdomain python3.9[141823]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:48 np0005548788.localdomain sudo[141821]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:49 np0005548788.localdomain sudo[141914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awgykhimabwcpewrcvhmhqljrrnnttyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013568.9021606-1313-87874775643749/AnsiballZ_file.py
Dec 06 09:32:49 np0005548788.localdomain sudo[141914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:49 np0005548788.localdomain python3.9[141916]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:49 np0005548788.localdomain sudo[141914]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6902 DF PROTO=TCP SPT=40600 DPT=9882 SEQ=2104569248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DFA00A0000000001030307) 
Dec 06 09:32:49 np0005548788.localdomain sudo[142006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjlwwkdflamdhocrciayczmhdvvieybj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013569.4917102-1313-195523933308924/AnsiballZ_file.py
Dec 06 09:32:49 np0005548788.localdomain sudo[142006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:49 np0005548788.localdomain python3.9[142008]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:49 np0005548788.localdomain sudo[142006]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:50 np0005548788.localdomain sudo[142098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxrxdwmsmkkaunrebnlxdbkojhqcbhaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013570.2358289-1358-50591130399044/AnsiballZ_mount.py
Dec 06 09:32:50 np0005548788.localdomain sudo[142098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:50 np0005548788.localdomain python3.9[142100]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 06 09:32:50 np0005548788.localdomain sudo[142098]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:51 np0005548788.localdomain sudo[142191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yynatqlyyqxgwmqfdlddexarvggplgeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013571.1392844-1358-77793707772493/AnsiballZ_mount.py
Dec 06 09:32:51 np0005548788.localdomain sudo[142191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:51 np0005548788.localdomain python3.9[142193]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 06 09:32:51 np0005548788.localdomain sudo[142191]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:52 np0005548788.localdomain sshd[136898]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:32:52 np0005548788.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Dec 06 09:32:52 np0005548788.localdomain systemd[1]: session-44.scope: Consumed 29.224s CPU time.
Dec 06 09:32:52 np0005548788.localdomain systemd-logind[765]: Session 44 logged out. Waiting for processes to exit.
Dec 06 09:32:52 np0005548788.localdomain systemd-logind[765]: Removed session 44.
Dec 06 09:32:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30176 DF PROTO=TCP SPT=57324 DPT=9882 SEQ=2423094356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DFAFF00000000001030307) 
Dec 06 09:32:55 np0005548788.localdomain sshd[141167]: Received disconnect from 45.78.219.195 port 48946:11: Bye Bye [preauth]
Dec 06 09:32:55 np0005548788.localdomain sshd[141167]: Disconnected from authenticating user root 45.78.219.195 port 48946 [preauth]
Dec 06 09:32:58 np0005548788.localdomain sshd[142209]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:58 np0005548788.localdomain sshd[142209]: Accepted publickey for zuul from 192.168.122.30 port 43516 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:32:58 np0005548788.localdomain systemd-logind[765]: New session 45 of user zuul.
Dec 06 09:32:58 np0005548788.localdomain systemd[1]: Started Session 45 of User zuul.
Dec 06 09:32:58 np0005548788.localdomain sshd[142209]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:32:59 np0005548788.localdomain sudo[142302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwxpbalqnwgzvwsobqebgmjyskuogaab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013578.6632414-21-236789130779288/AnsiballZ_tempfile.py
Dec 06 09:32:59 np0005548788.localdomain sudo[142302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:59 np0005548788.localdomain python3.9[142304]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 06 09:32:59 np0005548788.localdomain sudo[142302]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:00 np0005548788.localdomain sudo[142394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvvtyqalycpjcdfewqczmdwohksvmvla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013580.3089442-93-185131041128688/AnsiballZ_stat.py
Dec 06 09:33:00 np0005548788.localdomain sudo[142394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:00 np0005548788.localdomain python3.9[142396]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:00 np0005548788.localdomain sudo[142394]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:01 np0005548788.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 09:33:02 np0005548788.localdomain sudo[142490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkjdirwajjgkczvzlrhudctfgfrzqvqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013581.6289396-141-188927744116602/AnsiballZ_slurp.py
Dec 06 09:33:02 np0005548788.localdomain sudo[142490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:02 np0005548788.localdomain python3.9[142492]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec 06 09:33:02 np0005548788.localdomain sudo[142490]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:03 np0005548788.localdomain sudo[142582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysibwxobeudhvitthelybcvwlzeqcknc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013583.0937347-189-5017512408997/AnsiballZ_stat.py
Dec 06 09:33:03 np0005548788.localdomain sudo[142582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:03 np0005548788.localdomain python3.9[142584]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.7xswvkkw follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:03 np0005548788.localdomain sudo[142582]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:04 np0005548788.localdomain sudo[142657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epxlgybzqagngtclistxluxxzxebgqjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013583.0937347-189-5017512408997/AnsiballZ_copy.py
Dec 06 09:33:04 np0005548788.localdomain sudo[142657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:04 np0005548788.localdomain python3.9[142659]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.7xswvkkw mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013583.0937347-189-5017512408997/.source.7xswvkkw _original_basename=.ndhl7x5p follow=False checksum=3e842c629948eb11ff005810a7264dbaf8a6d16e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:04 np0005548788.localdomain sudo[142657]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1528 DF PROTO=TCP SPT=41824 DPT=9101 SEQ=1640587077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DFDB6E0000000001030307) 
Dec 06 09:33:06 np0005548788.localdomain sudo[142749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thbwhbdscljazajcqrvgjlxqddubxfle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013586.0877204-279-206911912119309/AnsiballZ_setup.py
Dec 06 09:33:06 np0005548788.localdomain sudo[142749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:06 np0005548788.localdomain python3.9[142751]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:06 np0005548788.localdomain sudo[142749]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:08 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56397 DF PROTO=TCP SPT=54260 DPT=9105 SEQ=3467501764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DFEB300000000001030307) 
Dec 06 09:33:09 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31604 DF PROTO=TCP SPT=47562 DPT=9101 SEQ=2089812155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DFEBF00000000001030307) 
Dec 06 09:33:09 np0005548788.localdomain sudo[142841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vewyfoygwjgipqpyqxhmztjpqasogwmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013589.3690286-328-124461778300445/AnsiballZ_blockinfile.py
Dec 06 09:33:09 np0005548788.localdomain sudo[142841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:09 np0005548788.localdomain python3.9[142843]: ansible-ansible.builtin.blockinfile Invoked with block=np0005548785.localdomain,192.168.122.103,np0005548785* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC89JzJHuRLDUgmU66VPdPVwYLrvslBwa5i2QfiUzrnpt1lKz8ayq6QMRy5y5GgfjQQhX/YZiAjUSoogVsYDkoDaImXdtfQHFlFMLTlJPiYcA/cGAwMAE/vifpWoztBHUXkJ5YWUojkXzGoR8d7ESx/tTLG/9QrQDsW6JcV18mcFCQZdeWYWGWdLn6ynmQOZ0N4U6mYK1FqE+GKgP6L9PEjkC1ePo81AnYcdQ5Z1IETdcCcJytdvvxH/Zie1PiAaMAgMYhsqu7+DZRRTvg+cEMw3mRVuodIyQEbpZs8MjR3itViRfZ+UqYi6uKDnz1viLL0aACaYhOLzrE7bQ6Sl4j1MnMrWncUOv3Sq2fus+Y6oYmed84E6HUNljte7vVP9jwPclbCAmj5WuC/Av9dSqqHEpPRbKJ4tAuBrO2LBKS7J62FjRYiY807V1viyxUgjK5FmsQyfVr3/YOirluSx54e4XwxxDrAjtrd0x68H7/Mt6HP/79cWKaVbC7XUckYRmE=
                                                            np0005548785.localdomain,192.168.122.103,np0005548785* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPnHRGHw2U3XDUZBfS69ZpwocvZ2haE6Sebzf3BV40dJ
                                                            np0005548785.localdomain,192.168.122.103,np0005548785* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMgorOAtIXk7BOknkR82ERwiBlDoAcpTTo8DwXwOeKFxueIG2AzGwqy/M3AlognMpbS9bigTSmXKYzfS5SNcGD8=
                                                            np0005548786.localdomain,192.168.122.104,np0005548786* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDURzBA/aIGrwPgaIApy0UCTi4wdQhfDEx0QfkSAIn0ZptZcOkaR8BWtl9GijRPEp++Ep4qU04JcwHO1ZULd2UnCdDeg1Imwnf7x9HQBjAr0mH+tE0t4MBLtBbrk8Ep5ggyKATK1CvEl3NuGIS4gSSUWxzkR74Iju/GtrEMuVnMSsOw+auBofiv1ne4zyXqQWZORiK32DSolw1KyXGLyqG+JOpl3Kza5o79S1KUghfRzskZMm/AxFYciPmg4EQK/jL9Izj7qq3v8MaL8baeyqNlPaaRKCh+pkZlYtoPzDhe+vn/jwnDmQgqC1Bh+dkNiKEVlWz3mxoiMoeLY3jP/tMF2M4M8puGakPc2sqJxk1++Tv/lFRO3zBS+V2kECKI5DtQI6XThfLYXxIQl5SHr4yGEoxhMNt6YNQPLp6lg30kHO24YyNNA7LPFYYoOGUCaq5ZVUCF9lagMxcgkN0Bs+ZZqeni+53RqxoutiRZ0m9pIiqxGjrJjbNFXmofgfDBcUE=
                                                            np0005548786.localdomain,192.168.122.104,np0005548786* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILIgwHZ/0Q8K6t9dlBCQwEO6OABCR0J0IF6hfmA44GBM
                                                            np0005548786.localdomain,192.168.122.104,np0005548786* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBItDJKfsljV78XBJL8EuwSxDvfxuZ9Jz6PgjXVap/GJqsza+9ApDVkNpmAVhdxO9qX1PPD9KOxQjcrD2A8MXQ10=
                                                            np0005548787.localdomain,192.168.122.105,np0005548787* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXe0UZ2kJKcvYaHSnjIOf3QqkGhArLo32nvDm8Pl8ZVNWfdRV8R+e17etAicDq//fxWC+U9jiHp4qI6/0Jm64rPocmJKaA+r79sNpv+598NlGtVUfTYQ34Ze9bgaPkjAwKfPNrzjSDChyfkys4Hm0J7ttog5rvMcuRelxkFmoonOcuzBC+9ufI6qld7br5w4WDookwamkefbMCiwAZxrw2bSjoTu7/TEFbt7SM0lUIdqP5WvxpWK52OkjnakQ0BL4QHdRYz1kBx/vS0TFxXb2pMO291dfkxDl3H2oXXZZYK/LWy3nZyJEX+mD5J6WOEs5HC5GQQ+CNEV0wa2e/gJA7KBsyL5T6RBtH8id22sBHZkzcaDhUz1ZABGAiOx4rdrr4YFFFy/u00nX3ZCuRBPXYh37Pafl7GXcSKyhTmkCZI0591RdNmb1duh9ZIObRmPVp2+WIheAFvS7EU4B0+ZjAEbDJgiSa9VlUrlRFX0ajcFHR8FnwNRcoERO3A3h4/Tc=
                                                            np0005548787.localdomain,192.168.122.105,np0005548787* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGClV/UHC6wrHLH6ofPCeG9Z3WpaSbH42qD4AsTbywke
                                                            np0005548787.localdomain,192.168.122.105,np0005548787* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD+VBma5zUGbc6C8yvVJH1yH01D2HwvgMwJZ3Ew/fQ9uangWsK7hoczIcWgUhEN67mue6bMYPNkv+zbE5QDlLqA=
                                                            np0005548789.localdomain,192.168.122.107,np0005548789* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwH3rhRTvOINLmLdbeRXeXOiMzz+IXEuW2cXYAe50Wcc3ikH2RVGirWQrwLc8hAoA7UFCXADqEMxPg6/fLsQkbP7kLOpUtam8nuXvgt8VHM4RFl5wh9EOgZ7DWgjA7s3r2eQMcBhv82CjVMLY/YjnLuRNXCsJAqeG32qcKedKH/huEFvkb49U/UnNlxi5BfNrMlY9n5UQXE2rd6EKwP58aP/qQ1ie3p8nwHc36/MJcfEIABlLaoHK/LxnadOFTh93OkqVi7A0VQsKSmKD64nABiN7ML0NReoyRIQI5r3Dawe8v2K9jCBh5jY88TVsYUJqgwoZSSU73sYGHX4uF+PY8wL7qwn6mCzA17GGYeB8Dy0N8qwDqah6kUjpcLwGp7YaKf0FIZPBKcLVMrX6Tnwxer1j3kOIt3tgLZoz3mMfstWfCyvt9t+GEW5MCE+MBkY4Eree3uK7pI+wJ3vFQS9XVP00hjNiLWYmoaaW6rl8xtw7QtGhzmjcWbOxaZvHWE5E=
                                                            np0005548789.localdomain,192.168.122.107,np0005548789* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIM7zsgz8o1LOsRIDgDJ0j4aB+gvG7QE4PuIS5gi3px2U
                                                            np0005548789.localdomain,192.168.122.107,np0005548789* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNB22R613xD5iIn21fw712bqcytUxBHAFZPMSjpWL8XVTi6taleS2y8rpYqGoN21DgQgwO1SxmcqZLfwlh7T5/4=
                                                            np0005548790.localdomain,192.168.122.108,np0005548790* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDmdMCy44p73Ui+o09YQitqR9FILqoJ6AGYYutFVH6wn5m1j6oEoI4XgVFPR3UpG3SXdoiG7m0DRxC/WZZMpZbaQ3ZHbJJioRh1hV5uQtK5k2gtmS8uePng5UprbLncMXf+HIxNRvirU3r6zdgNGAroK0rN0nWESi/FNb2flu9Aw9JAsgIAAouW4IUoeyMGZ1AflhRhsWsQMstM9UEeGU+iTqV7al1URVCSq1finY99m+QC+Pftpd2C/+agboOIiVa63+D/RqqfYqh4C/PYfDbssYjcZzk3P90+HQ6uMKexX3HRnFbyje4eLSBHC0pjr/4pNfk/eSpdHeyMAPsP+QlBztdcPj9OnjcmT9ymeJRKF7GwNIWg3Pn9L2yY50d8l9Zu6rNIDW786XNcbm88yHdCHA5FE1A8XTWQRQ3eUSUsmsvf03pExAouRM4Fj8dvCu6wzG2SuyWqmdT5yCNrUG0e1CeE6PcfTLBeS5CJAwn5HM8aUndQQldWmaUbMPL5Jis=
                                                            np0005548790.localdomain,192.168.122.108,np0005548790* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPoyxTI8+8n9PWFBkZatum98GfJRQMd2qn9CijEFzfEz
                                                            np0005548790.localdomain,192.168.122.108,np0005548790* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNOlnHgYu82mRZ1QroLe1BG6rymOGDqDJGz5MpHZnXnhJ6iIwC87em0cGHiSKgU+UZ4DpWQTIlxwKsn9Jp9Hl1Y=
                                                            np0005548788.localdomain,192.168.122.106,np0005548788* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCxIoAQH9YZnGrAxYR5prFQwo6HY5mwdDjndb+bp2pwvtVLM4ABIdCi+K1wpbhOpoO7BsYOf/tdBqemvSDleNo/ZLh3v3MmoVtoTtQZqLWsAQWFgJCjcGUGB+H3CHhtbp706coVQMlGD+UQqpCBy8WamMB/Ldy+hSHbLHwzuMzj8tO90vUbEyuKgOuu/X3ZFa+Yjo/asQ+PTrVfirh1QvRQ9aK22xH89KbThA/1an4OjnNGLCP752auSQ894B21QLKfqaMGPlpbjU8Wr6MP4zKV9lUzpQiFr6IU6cd4CeIsJDj7FnAZuBSmi8ewgm/r4ZWkmCSlqw8OpMC5soJnm8Q4PJTIFvT9eyyFCh9xmQkMhzE8P332LtYjZ+vXhYFU14e04mOQx5UrtHN8uWJVbOAwtLNAcenHyRtCQGkAZ6f9q0OvSuYr+o3FhHhN5ABu32AKAD8YpkjLypi+PbaiKNQW8XzPAHHbV8CGZ4B09ZWeQY49VA0bPxIYBXd1mEBlXSE=
                                                            np0005548788.localdomain,192.168.122.106,np0005548788* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJBkIOjRpLl815RvOqIZSSNUu/CGLqucfCRUist+ERWP
                                                            np0005548788.localdomain,192.168.122.106,np0005548788* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNyEL9+sMn9BF0LnCanz9jbKQTm6FNV71J4qGFTonom0KXHpLL1p0eyrgFY0iwGH2UtwJ6VWm5bm2RaQJmObwZI=
                                                             create=True mode=0644 path=/tmp/ansible.7xswvkkw state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:10 np0005548788.localdomain sudo[142841]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:11 np0005548788.localdomain sudo[142933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvmdisryleiibiopqmcpkmxvffgxdrhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013590.7720163-376-178113350620578/AnsiballZ_command.py
Dec 06 09:33:11 np0005548788.localdomain sudo[142933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:11 np0005548788.localdomain python3.9[142935]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.7xswvkkw' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:11 np0005548788.localdomain sudo[142933]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:12 np0005548788.localdomain sudo[143027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyhfrqvyugkdxkfnmnfamxdyrfwoikkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013592.125987-424-265832056991228/AnsiballZ_file.py
Dec 06 09:33:12 np0005548788.localdomain sudo[143027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:12 np0005548788.localdomain python3.9[143029]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.7xswvkkw state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:12 np0005548788.localdomain sudo[143027]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65253 DF PROTO=TCP SPT=53418 DPT=9100 SEQ=490072745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5DFFD900000000001030307) 
Dec 06 09:33:13 np0005548788.localdomain sshd[142209]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:33:13 np0005548788.localdomain systemd-logind[765]: Session 45 logged out. Waiting for processes to exit.
Dec 06 09:33:13 np0005548788.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Dec 06 09:33:13 np0005548788.localdomain systemd[1]: session-45.scope: Consumed 4.535s CPU time.
Dec 06 09:33:13 np0005548788.localdomain systemd-logind[765]: Removed session 45.
Dec 06 09:33:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25097 DF PROTO=TCP SPT=60656 DPT=9102 SEQ=278220360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E014940000000001030307) 
Dec 06 09:33:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18300 DF PROTO=TCP SPT=51130 DPT=9882 SEQ=1090965556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0153A0000000001030307) 
Dec 06 09:33:19 np0005548788.localdomain sshd[143044]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:20 np0005548788.localdomain sshd[143044]: Accepted publickey for zuul from 192.168.122.30 port 43786 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:33:20 np0005548788.localdomain systemd-logind[765]: New session 46 of user zuul.
Dec 06 09:33:20 np0005548788.localdomain systemd[1]: Started Session 46 of User zuul.
Dec 06 09:33:20 np0005548788.localdomain sshd[143044]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:33:21 np0005548788.localdomain python3.9[143137]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:22 np0005548788.localdomain sudo[143231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxnwtxvpbnczgjsyndcdakkpxfutxacr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013601.7301235-56-270634804619966/AnsiballZ_systemd.py
Dec 06 09:33:22 np0005548788.localdomain sudo[143231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:22 np0005548788.localdomain python3.9[143233]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:33:22 np0005548788.localdomain sudo[143231]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:24 np0005548788.localdomain sudo[143325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daeykujlwrlqijwcjrjjqlfhjwfyzfyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013603.7532022-80-128017281409243/AnsiballZ_systemd.py
Dec 06 09:33:24 np0005548788.localdomain sudo[143325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:24 np0005548788.localdomain python3.9[143327]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:33:24 np0005548788.localdomain sudo[143325]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:25 np0005548788.localdomain sudo[143418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhmxnhlqffwsriyxqilqyjodwtfmhbdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013604.6146288-107-69241436078915/AnsiballZ_command.py
Dec 06 09:33:25 np0005548788.localdomain sudo[143418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:25 np0005548788.localdomain python3.9[143420]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:25 np0005548788.localdomain sudo[143418]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:25 np0005548788.localdomain sudo[143511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsokvbjbjibgtjepdgyzdhwsmmfvnpji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013605.4160187-131-122834915691584/AnsiballZ_stat.py
Dec 06 09:33:25 np0005548788.localdomain sudo[143511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:26 np0005548788.localdomain python3.9[143513]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:26 np0005548788.localdomain sudo[143511]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:26 np0005548788.localdomain sudo[143605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqfwoukowbsnplpcsawoqfksytvyeckw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013606.3034616-155-98686317734269/AnsiballZ_command.py
Dec 06 09:33:26 np0005548788.localdomain sudo[143605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:26 np0005548788.localdomain python3.9[143607]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:26 np0005548788.localdomain sudo[143605]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:27 np0005548788.localdomain sudo[143700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftvjspsylvkguwpbjfaniurjgzykvzze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013606.9785979-179-5939504608506/AnsiballZ_file.py
Dec 06 09:33:27 np0005548788.localdomain sudo[143700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:27 np0005548788.localdomain python3.9[143702]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:27 np0005548788.localdomain sudo[143700]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:28 np0005548788.localdomain sshd[143044]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:33:28 np0005548788.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Dec 06 09:33:28 np0005548788.localdomain systemd[1]: session-46.scope: Consumed 3.838s CPU time.
Dec 06 09:33:28 np0005548788.localdomain systemd-logind[765]: Session 46 logged out. Waiting for processes to exit.
Dec 06 09:33:28 np0005548788.localdomain systemd-logind[765]: Removed session 46.
Dec 06 09:33:33 np0005548788.localdomain sshd[143717]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:33 np0005548788.localdomain sshd[143717]: Accepted publickey for zuul from 192.168.122.30 port 51338 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:33:33 np0005548788.localdomain systemd-logind[765]: New session 47 of user zuul.
Dec 06 09:33:33 np0005548788.localdomain systemd[1]: Started Session 47 of User zuul.
Dec 06 09:33:33 np0005548788.localdomain sshd[143717]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:33:34 np0005548788.localdomain python3.9[143810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6019 DF PROTO=TCP SPT=51284 DPT=9101 SEQ=3129283675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0509E0000000001030307) 
Dec 06 09:33:35 np0005548788.localdomain sudo[143904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouukonolkejubowffxbrlueyjzyndjat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013615.1396246-62-19914469454884/AnsiballZ_setup.py
Dec 06 09:33:35 np0005548788.localdomain sudo[143904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:35 np0005548788.localdomain python3.9[143906]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:33:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6020 DF PROTO=TCP SPT=51284 DPT=9101 SEQ=3129283675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E054B10000000001030307) 
Dec 06 09:33:36 np0005548788.localdomain sudo[143904]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:36 np0005548788.localdomain sudo[143958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqqxwovcyjqszngbjlhzhmiloylfdoib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013615.1396246-62-19914469454884/AnsiballZ_dnf.py
Dec 06 09:33:36 np0005548788.localdomain sudo[143958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:36 np0005548788.localdomain python3.9[143960]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:33:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6021 DF PROTO=TCP SPT=51284 DPT=9101 SEQ=3129283675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E05CB10000000001030307) 
Dec 06 09:33:38 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63030 DF PROTO=TCP SPT=41206 DPT=9105 SEQ=597836557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E060600000000001030307) 
Dec 06 09:33:39 np0005548788.localdomain sudo[143958]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:39 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63031 DF PROTO=TCP SPT=41206 DPT=9105 SEQ=597836557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E064700000000001030307) 
Dec 06 09:33:41 np0005548788.localdomain python3.9[144052]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:41 np0005548788.localdomain sudo[144054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:33:41 np0005548788.localdomain sudo[144054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:33:41 np0005548788.localdomain sudo[144054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:41 np0005548788.localdomain sudo[144069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:33:41 np0005548788.localdomain sudo[144069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:33:41 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63032 DF PROTO=TCP SPT=41206 DPT=9105 SEQ=597836557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E06C710000000001030307) 
Dec 06 09:33:41 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6022 DF PROTO=TCP SPT=51284 DPT=9101 SEQ=3129283675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E06C710000000001030307) 
Dec 06 09:33:42 np0005548788.localdomain sudo[144069]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:42 np0005548788.localdomain sudo[144211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rimbpljywiilserigggwnanhwjlxmkso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013622.2332816-125-49362802658738/AnsiballZ_file.py
Dec 06 09:33:42 np0005548788.localdomain sudo[144211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:42 np0005548788.localdomain sudo[144197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:33:42 np0005548788.localdomain sudo[144197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:33:42 np0005548788.localdomain sudo[144197]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:42 np0005548788.localdomain python3.9[144220]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:42 np0005548788.localdomain sudo[144211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:43 np0005548788.localdomain sudo[144311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulukbdiwighhganjnqxaulvzzuptnijb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013623.0113328-149-227697056573432/AnsiballZ_file.py
Dec 06 09:33:43 np0005548788.localdomain sudo[144311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16311 DF PROTO=TCP SPT=42810 DPT=9100 SEQ=1906261179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E072BF0000000001030307) 
Dec 06 09:33:43 np0005548788.localdomain python3.9[144313]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:43 np0005548788.localdomain sudo[144311]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:44 np0005548788.localdomain sudo[144403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqkmshcnclhxylwdunwyrdlmtkylvaai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013623.8130214-173-127539568809654/AnsiballZ_lineinfile.py
Dec 06 09:33:44 np0005548788.localdomain sudo[144403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:44 np0005548788.localdomain python3.9[144405]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:44 np0005548788.localdomain sudo[144403]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:44 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16312 DF PROTO=TCP SPT=42810 DPT=9100 SEQ=1906261179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E076B10000000001030307) 
Dec 06 09:33:45 np0005548788.localdomain python3.9[144495]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:33:45 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63033 DF PROTO=TCP SPT=41206 DPT=9105 SEQ=597836557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E07C310000000001030307) 
Dec 06 09:33:46 np0005548788.localdomain python3.9[144585]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16313 DF PROTO=TCP SPT=42810 DPT=9100 SEQ=1906261179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E07EB00000000001030307) 
Dec 06 09:33:46 np0005548788.localdomain python3.9[144677]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:47 np0005548788.localdomain sshd[143717]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:33:47 np0005548788.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Dec 06 09:33:47 np0005548788.localdomain systemd[1]: session-47.scope: Consumed 8.787s CPU time.
Dec 06 09:33:47 np0005548788.localdomain systemd-logind[765]: Session 47 logged out. Waiting for processes to exit.
Dec 06 09:33:47 np0005548788.localdomain systemd-logind[765]: Removed session 47.
Dec 06 09:33:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33277 DF PROTO=TCP SPT=55606 DPT=9102 SEQ=3874071164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E089C60000000001030307) 
Dec 06 09:33:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55242 DF PROTO=TCP SPT=50316 DPT=9882 SEQ=651789380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E08A6A0000000001030307) 
Dec 06 09:33:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33279 DF PROTO=TCP SPT=55606 DPT=9102 SEQ=3874071164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E095B10000000001030307) 
Dec 06 09:33:54 np0005548788.localdomain sshd[144692]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:54 np0005548788.localdomain sshd[144692]: Accepted publickey for zuul from 192.168.122.30 port 59414 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:33:54 np0005548788.localdomain systemd-logind[765]: New session 48 of user zuul.
Dec 06 09:33:54 np0005548788.localdomain systemd[1]: Started Session 48 of User zuul.
Dec 06 09:33:54 np0005548788.localdomain sshd[144692]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:33:55 np0005548788.localdomain python3.9[144785]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33280 DF PROTO=TCP SPT=55606 DPT=9102 SEQ=3874071164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0A5700000000001030307) 
Dec 06 09:33:57 np0005548788.localdomain sudo[144879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxtwymsejsawvikhrmyqnljtquvyxnwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013636.802295-157-215528337320526/AnsiballZ_file.py
Dec 06 09:33:57 np0005548788.localdomain sudo[144879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:57 np0005548788.localdomain python3.9[144881]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:33:57 np0005548788.localdomain sudo[144879]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:58 np0005548788.localdomain sudo[144971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxyfwhbilmwzdpunbxcsbnbnsbyunpwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013637.5732675-183-97701722547066/AnsiballZ_stat.py
Dec 06 09:33:58 np0005548788.localdomain sudo[144971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:58 np0005548788.localdomain python3.9[144973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:58 np0005548788.localdomain sudo[144971]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16315 DF PROTO=TCP SPT=42810 DPT=9100 SEQ=1906261179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0ADF10000000001030307) 
Dec 06 09:33:58 np0005548788.localdomain sudo[145044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yygicngwsruebqvuraoxrhtjqbwmjhic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013637.5732675-183-97701722547066/AnsiballZ_copy.py
Dec 06 09:33:58 np0005548788.localdomain sudo[145044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:59 np0005548788.localdomain python3.9[145046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013637.5732675-183-97701722547066/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:59 np0005548788.localdomain sudo[145044]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:59 np0005548788.localdomain sudo[145136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olzuzghrpddzsdbznzolwzbuexrmzuac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013639.301924-230-67203699631872/AnsiballZ_file.py
Dec 06 09:33:59 np0005548788.localdomain sudo[145136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:00 np0005548788.localdomain python3.9[145138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:00 np0005548788.localdomain sudo[145136]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:00 np0005548788.localdomain sudo[145228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjiiqsuhzihfpijwyoxogmuvghccjzpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013640.1707225-260-76584288186225/AnsiballZ_stat.py
Dec 06 09:34:00 np0005548788.localdomain sudo[145228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:00 np0005548788.localdomain python3.9[145230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:00 np0005548788.localdomain sudo[145228]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:01 np0005548788.localdomain sudo[145301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyirtcihsyiktjcawjhtsafrfpyywchn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013640.1707225-260-76584288186225/AnsiballZ_copy.py
Dec 06 09:34:01 np0005548788.localdomain sudo[145301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:02 np0005548788.localdomain python3.9[145303]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013640.1707225-260-76584288186225/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:02 np0005548788.localdomain sudo[145301]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:02 np0005548788.localdomain sudo[145393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nulzwkmaaiqdlnzcfuiyoufaacsvdijx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013642.2892504-303-1814027611685/AnsiballZ_file.py
Dec 06 09:34:02 np0005548788.localdomain sudo[145393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:02 np0005548788.localdomain python3.9[145395]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:02 np0005548788.localdomain sudo[145393]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:03 np0005548788.localdomain sudo[145485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfhhrhrlhmrnfdkmirkykqrvlwhpybyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013642.9015045-328-148964347720223/AnsiballZ_stat.py
Dec 06 09:34:03 np0005548788.localdomain sudo[145485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:03 np0005548788.localdomain python3.9[145487]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:03 np0005548788.localdomain sudo[145485]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:04 np0005548788.localdomain sudo[145558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmszmmcjrxentkroghdfiliyeefrvdvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013642.9015045-328-148964347720223/AnsiballZ_copy.py
Dec 06 09:34:04 np0005548788.localdomain sudo[145558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:04 np0005548788.localdomain python3.9[145560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013642.9015045-328-148964347720223/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:04 np0005548788.localdomain sudo[145558]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13960 DF PROTO=TCP SPT=60724 DPT=9101 SEQ=1378448212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0C5CF0000000001030307) 
Dec 06 09:34:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55246 DF PROTO=TCP SPT=50316 DPT=9882 SEQ=651789380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0C5F00000000001030307) 
Dec 06 09:34:04 np0005548788.localdomain sudo[145650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujeskqsvtsoarhvjkaeliumxrlofdwvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013644.685909-379-120617592708477/AnsiballZ_file.py
Dec 06 09:34:04 np0005548788.localdomain sudo[145650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:05 np0005548788.localdomain chronyd[136882]: Selected source 167.160.187.179 (pool.ntp.org)
Dec 06 09:34:05 np0005548788.localdomain python3.9[145652]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:05 np0005548788.localdomain sudo[145650]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:05 np0005548788.localdomain sudo[145742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czggqexztdsaqalqwcxevfsypixjvkzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013645.6975-406-197390406770176/AnsiballZ_stat.py
Dec 06 09:34:05 np0005548788.localdomain sudo[145742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:06 np0005548788.localdomain python3.9[145744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:06 np0005548788.localdomain sudo[145742]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:06 np0005548788.localdomain sudo[145815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzhjlabhjzhjilpzlwbbhydbdhaswptm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013645.6975-406-197390406770176/AnsiballZ_copy.py
Dec 06 09:34:06 np0005548788.localdomain sudo[145815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:06 np0005548788.localdomain python3.9[145817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013645.6975-406-197390406770176/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:06 np0005548788.localdomain sudo[145815]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:07 np0005548788.localdomain sudo[145907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxwhljkgczhpflvdxoesxihdvccssrjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013646.960538-456-180794079681985/AnsiballZ_file.py
Dec 06 09:34:07 np0005548788.localdomain sudo[145907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:07 np0005548788.localdomain python3.9[145909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:07 np0005548788.localdomain sudo[145907]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13962 DF PROTO=TCP SPT=60724 DPT=9101 SEQ=1378448212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0D1F10000000001030307) 
Dec 06 09:34:07 np0005548788.localdomain sudo[145999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huqlelicqyuwngmgyuscggrvguhyirzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013647.6314068-484-108497766501821/AnsiballZ_stat.py
Dec 06 09:34:07 np0005548788.localdomain sudo[145999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:08 np0005548788.localdomain python3.9[146001]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:08 np0005548788.localdomain sudo[145999]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:08 np0005548788.localdomain sudo[146072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwdgdvwebypjwhkmlupaifbcfldzjhdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013647.6314068-484-108497766501821/AnsiballZ_copy.py
Dec 06 09:34:08 np0005548788.localdomain sudo[146072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:08 np0005548788.localdomain python3.9[146074]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013647.6314068-484-108497766501821/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:08 np0005548788.localdomain sudo[146072]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:09 np0005548788.localdomain sudo[146164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yohpxryvsuqlmmlpmbevxfravrqmqoof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013648.9455135-536-249210522339987/AnsiballZ_file.py
Dec 06 09:34:09 np0005548788.localdomain sudo[146164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:09 np0005548788.localdomain python3.9[146166]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:09 np0005548788.localdomain sudo[146164]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:09 np0005548788.localdomain sudo[146256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phkykritdgwnftwdekjscqfbuhwpbhib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013649.586063-561-59719237106768/AnsiballZ_stat.py
Dec 06 09:34:09 np0005548788.localdomain sudo[146256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:10 np0005548788.localdomain python3.9[146258]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:10 np0005548788.localdomain sudo[146256]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:10 np0005548788.localdomain sudo[146329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdvkdlkfodqqpjwbqvwdqxkzwhrmrvod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013649.586063-561-59719237106768/AnsiballZ_copy.py
Dec 06 09:34:10 np0005548788.localdomain sudo[146329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63035 DF PROTO=TCP SPT=41206 DPT=9105 SEQ=597836557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0DBF00000000001030307) 
Dec 06 09:34:10 np0005548788.localdomain python3.9[146331]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013649.586063-561-59719237106768/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:10 np0005548788.localdomain sudo[146329]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:11 np0005548788.localdomain sudo[146421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvjnbxlpmkcodjjmbpvyhfybwddnfxsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013650.8534298-611-144830746249380/AnsiballZ_file.py
Dec 06 09:34:11 np0005548788.localdomain sudo[146421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:11 np0005548788.localdomain python3.9[146423]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:11 np0005548788.localdomain sudo[146421]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:12 np0005548788.localdomain sudo[146513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xeafxlqlpvmcvxxfdipeklpwwtymvghc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013652.0511541-637-173249375514193/AnsiballZ_stat.py
Dec 06 09:34:12 np0005548788.localdomain sudo[146513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:12 np0005548788.localdomain python3.9[146515]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:12 np0005548788.localdomain sudo[146513]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:12 np0005548788.localdomain sudo[146586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbmjbajzdzbxmlntbivhldoetntwcpfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013652.0511541-637-173249375514193/AnsiballZ_copy.py
Dec 06 09:34:12 np0005548788.localdomain sudo[146586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:13 np0005548788.localdomain python3.9[146588]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013652.0511541-637-173249375514193/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:13 np0005548788.localdomain sudo[146586]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22964 DF PROTO=TCP SPT=39988 DPT=9100 SEQ=3020971046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0E7EF0000000001030307) 
Dec 06 09:34:14 np0005548788.localdomain sudo[146678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulkiiexfmfedwkijanycsmzsfuzikfbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013654.0320446-688-35647605074149/AnsiballZ_file.py
Dec 06 09:34:14 np0005548788.localdomain sudo[146678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:14 np0005548788.localdomain python3.9[146680]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:14 np0005548788.localdomain sudo[146678]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:14 np0005548788.localdomain sudo[146770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ladbeskbskeczxvvlvwsaqxjfdnscwze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013654.7103508-713-98959804633524/AnsiballZ_stat.py
Dec 06 09:34:14 np0005548788.localdomain sudo[146770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:15 np0005548788.localdomain python3.9[146772]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:15 np0005548788.localdomain sudo[146770]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:15 np0005548788.localdomain sudo[146843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weidygklnioxzhncuwegkbfqfwcyadiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013654.7103508-713-98959804633524/AnsiballZ_copy.py
Dec 06 09:34:15 np0005548788.localdomain sudo[146843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:15 np0005548788.localdomain python3.9[146845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013654.7103508-713-98959804633524/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:15 np0005548788.localdomain sudo[146843]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:16 np0005548788.localdomain sshd[144692]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:34:16 np0005548788.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Dec 06 09:34:16 np0005548788.localdomain systemd[1]: session-48.scope: Consumed 12.419s CPU time.
Dec 06 09:34:16 np0005548788.localdomain systemd-logind[765]: Session 48 logged out. Waiting for processes to exit.
Dec 06 09:34:16 np0005548788.localdomain systemd-logind[765]: Removed session 48.
Dec 06 09:34:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22966 DF PROTO=TCP SPT=39988 DPT=9100 SEQ=3020971046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0F3F10000000001030307) 
Dec 06 09:34:17 np0005548788.localdomain sshd[146860]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:17 np0005548788.localdomain sshd[146860]: Received disconnect from 148.227.3.232 port 54498:11: Bye Bye [preauth]
Dec 06 09:34:17 np0005548788.localdomain sshd[146860]: Disconnected from authenticating user root 148.227.3.232 port 54498 [preauth]
Dec 06 09:34:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16489 DF PROTO=TCP SPT=51936 DPT=9102 SEQ=1117185392 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E0FEF40000000001030307) 
Dec 06 09:34:19 np0005548788.localdomain sshd[146862]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:20 np0005548788.localdomain sshd[146862]: Received disconnect from 45.78.194.186 port 42184:11: Bye Bye [preauth]
Dec 06 09:34:20 np0005548788.localdomain sshd[146862]: Disconnected from authenticating user root 45.78.194.186 port 42184 [preauth]
Dec 06 09:34:21 np0005548788.localdomain sshd[146864]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:21 np0005548788.localdomain sshd[146864]: Accepted publickey for zuul from 192.168.122.30 port 53540 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:34:21 np0005548788.localdomain systemd-logind[765]: New session 49 of user zuul.
Dec 06 09:34:21 np0005548788.localdomain systemd[1]: Started Session 49 of User zuul.
Dec 06 09:34:21 np0005548788.localdomain sshd[146864]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:34:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16491 DF PROTO=TCP SPT=51936 DPT=9102 SEQ=1117185392 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E10AF10000000001030307) 
Dec 06 09:34:22 np0005548788.localdomain sudo[146957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykhbkaiwowklshbrocuzlbbkrdegjbqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013661.8497763-26-29397544943140/AnsiballZ_file.py
Dec 06 09:34:22 np0005548788.localdomain sudo[146957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:22 np0005548788.localdomain python3.9[146959]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:22 np0005548788.localdomain sudo[146957]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:23 np0005548788.localdomain sudo[147049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pavuszpfkjjrptalrwizsovcmwnywvky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013662.9893684-62-67821519568211/AnsiballZ_stat.py
Dec 06 09:34:23 np0005548788.localdomain sudo[147049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:23 np0005548788.localdomain python3.9[147051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:23 np0005548788.localdomain sudo[147049]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:24 np0005548788.localdomain sudo[147122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-golgfcvyigidlpbovrsrcfeldzudssqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013662.9893684-62-67821519568211/AnsiballZ_copy.py
Dec 06 09:34:24 np0005548788.localdomain sudo[147122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:24 np0005548788.localdomain python3.9[147124]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013662.9893684-62-67821519568211/.source.conf _original_basename=ceph.conf follow=False checksum=74b6793c28400fa0a16ce9abdc4efa82feeb961d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:24 np0005548788.localdomain sudo[147122]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:24 np0005548788.localdomain sudo[147214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buletmxsdrscnyqvpuglvnpkmnqwoadk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013664.4029026-62-48939010699486/AnsiballZ_stat.py
Dec 06 09:34:24 np0005548788.localdomain sudo[147214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:24 np0005548788.localdomain python3.9[147216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:24 np0005548788.localdomain sudo[147214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:25 np0005548788.localdomain sudo[147287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-draxphahbhlnbpslzbsbmxvlrbzadila ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013664.4029026-62-48939010699486/AnsiballZ_copy.py
Dec 06 09:34:25 np0005548788.localdomain sudo[147287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:25 np0005548788.localdomain python3.9[147289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013664.4029026-62-48939010699486/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=9d631b6552ddeaa0e75a39b18f2bdb583e0e85e3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:25 np0005548788.localdomain sudo[147287]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:25 np0005548788.localdomain sshd[146864]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:34:25 np0005548788.localdomain systemd-logind[765]: Session 49 logged out. Waiting for processes to exit.
Dec 06 09:34:25 np0005548788.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Dec 06 09:34:25 np0005548788.localdomain systemd[1]: session-49.scope: Consumed 2.384s CPU time.
Dec 06 09:34:25 np0005548788.localdomain systemd-logind[765]: Removed session 49.
Dec 06 09:34:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16492 DF PROTO=TCP SPT=51936 DPT=9102 SEQ=1117185392 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E11AB00000000001030307) 
Dec 06 09:34:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22968 DF PROTO=TCP SPT=39988 DPT=9100 SEQ=3020971046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E123F10000000001030307) 
Dec 06 09:34:31 np0005548788.localdomain sshd[147304]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:31 np0005548788.localdomain sshd[147304]: Accepted publickey for zuul from 192.168.122.30 port 48620 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:34:31 np0005548788.localdomain systemd-logind[765]: New session 50 of user zuul.
Dec 06 09:34:31 np0005548788.localdomain systemd[1]: Started Session 50 of User zuul.
Dec 06 09:34:31 np0005548788.localdomain sshd[147304]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:34:32 np0005548788.localdomain python3.9[147397]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:34:33 np0005548788.localdomain sudo[147491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgppjcfjiupfotbgbyffuulyqddyyywy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013672.980454-62-47070805944854/AnsiballZ_file.py
Dec 06 09:34:33 np0005548788.localdomain sudo[147491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:33 np0005548788.localdomain python3.9[147493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:33 np0005548788.localdomain sudo[147491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:34 np0005548788.localdomain sudo[147583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlhiqlehydtdvkupbwlfnodqpkioddzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013673.8448591-62-195479294085949/AnsiballZ_file.py
Dec 06 09:34:34 np0005548788.localdomain sudo[147583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:34 np0005548788.localdomain python3.9[147585]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:34 np0005548788.localdomain sudo[147583]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33025 DF PROTO=TCP SPT=49082 DPT=9101 SEQ=3950661405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E13AFE0000000001030307) 
Dec 06 09:34:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62895 DF PROTO=TCP SPT=59592 DPT=9882 SEQ=2705363970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E13BF00000000001030307) 
Dec 06 09:34:35 np0005548788.localdomain python3.9[147675]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:34:36 np0005548788.localdomain sudo[147765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbbzxviywwsgwdrirfowvqrfwzxcnonr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013675.235023-131-274439416479967/AnsiballZ_seboolean.py
Dec 06 09:34:36 np0005548788.localdomain sudo[147765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:36 np0005548788.localdomain python3.9[147767]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 06 09:34:36 np0005548788.localdomain sudo[147765]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:37 np0005548788.localdomain sudo[147857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snnybyerymaedlgenokxxyktzuljpqln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013676.9861574-161-249265713172529/AnsiballZ_setup.py
Dec 06 09:34:37 np0005548788.localdomain sudo[147857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:37 np0005548788.localdomain python3.9[147859]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:34:37 np0005548788.localdomain sudo[147857]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33027 DF PROTO=TCP SPT=49082 DPT=9101 SEQ=3950661405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E146F00000000001030307) 
Dec 06 09:34:38 np0005548788.localdomain sudo[147911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahlsxqshwwqdwsdzkmqlkvhuxwplaftz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013676.9861574-161-249265713172529/AnsiballZ_dnf.py
Dec 06 09:34:38 np0005548788.localdomain sudo[147911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:38 np0005548788.localdomain python3.9[147913]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:34:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33733 DF PROTO=TCP SPT=49922 DPT=9105 SEQ=1354783308 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E151F00000000001030307) 
Dec 06 09:34:41 np0005548788.localdomain sudo[147911]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:42 np0005548788.localdomain sudo[148005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atigdnwcjglareihxnpcdryzmxgtszvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013682.0942345-197-88650928139493/AnsiballZ_systemd.py
Dec 06 09:34:42 np0005548788.localdomain sudo[148005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:42 np0005548788.localdomain sudo[148008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:34:42 np0005548788.localdomain sudo[148008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:42 np0005548788.localdomain sudo[148008]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:42 np0005548788.localdomain sudo[148023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:34:42 np0005548788.localdomain sudo[148023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:42 np0005548788.localdomain python3.9[148007]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:34:43 np0005548788.localdomain sudo[148005]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2139 DF PROTO=TCP SPT=41718 DPT=9100 SEQ=2548334703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E15D1F0000000001030307) 
Dec 06 09:34:43 np0005548788.localdomain podman[148126]: 2025-12-06 09:34:43.756312547 +0000 UTC m=+0.111229494 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.41.4)
Dec 06 09:34:43 np0005548788.localdomain podman[148126]: 2025-12-06 09:34:43.86944787 +0000 UTC m=+0.224364737 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, version=7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, distribution-scope=public, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Dec 06 09:34:44 np0005548788.localdomain sudo[148023]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:44 np0005548788.localdomain sudo[148194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:34:44 np0005548788.localdomain sudo[148194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:44 np0005548788.localdomain sudo[148194]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:44 np0005548788.localdomain sudo[148222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:34:44 np0005548788.localdomain sudo[148222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:44 np0005548788.localdomain sudo[148313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhkeqdryezeuuksjeprmytzadoytvgqq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013684.3143466-221-102937184095942/AnsiballZ_edpm_nftables_snippet.py
Dec 06 09:34:44 np0005548788.localdomain sudo[148313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:44 np0005548788.localdomain python3[148315]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 06 09:34:44 np0005548788.localdomain sudo[148313]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:44 np0005548788.localdomain sudo[148222]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:45 np0005548788.localdomain sudo[148379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:34:45 np0005548788.localdomain sudo[148379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:45 np0005548788.localdomain sudo[148379]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:45 np0005548788.localdomain sudo[148437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urvxgmucfldmzgfupbjbqmcakogyenuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013685.3408296-248-106492872562806/AnsiballZ_file.py
Dec 06 09:34:45 np0005548788.localdomain sudo[148437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:45 np0005548788.localdomain python3.9[148439]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:45 np0005548788.localdomain sudo[148437]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:46 np0005548788.localdomain sudo[148529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kftjeozkhbuwkuinqezoqlzkjpiwbqhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013686.0171392-272-53532114026332/AnsiballZ_stat.py
Dec 06 09:34:46 np0005548788.localdomain sudo[148529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2141 DF PROTO=TCP SPT=41718 DPT=9100 SEQ=2548334703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E169310000000001030307) 
Dec 06 09:34:46 np0005548788.localdomain python3.9[148531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:46 np0005548788.localdomain sudo[148529]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:46 np0005548788.localdomain sudo[148577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkcgvquxmpmwzugqblwtqmkukestcdyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013686.0171392-272-53532114026332/AnsiballZ_file.py
Dec 06 09:34:46 np0005548788.localdomain sudo[148577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:47 np0005548788.localdomain python3.9[148579]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:47 np0005548788.localdomain sudo[148577]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:47 np0005548788.localdomain sshd[148580]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:48 np0005548788.localdomain sudo[148671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ashnkohwiqovmockrcyjgajgcwlruwyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013688.0462017-308-18101445908481/AnsiballZ_stat.py
Dec 06 09:34:48 np0005548788.localdomain sudo[148671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:48 np0005548788.localdomain python3.9[148673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:48 np0005548788.localdomain sudo[148671]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:48 np0005548788.localdomain sudo[148719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plfohwlyhfuaeomeijvxjivfbntvgogz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013688.0462017-308-18101445908481/AnsiballZ_file.py
Dec 06 09:34:48 np0005548788.localdomain sudo[148719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:49 np0005548788.localdomain python3.9[148721]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.744x3cl5 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:49 np0005548788.localdomain sudo[148719]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23582 DF PROTO=TCP SPT=51674 DPT=9102 SEQ=482486079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E174240000000001030307) 
Dec 06 09:34:49 np0005548788.localdomain sudo[148811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgqzwydeojvillemuqtdlymcqbuusulq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013689.1992939-344-253347141823751/AnsiballZ_stat.py
Dec 06 09:34:49 np0005548788.localdomain sudo[148811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:50 np0005548788.localdomain python3.9[148813]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:50 np0005548788.localdomain sudo[148811]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:50 np0005548788.localdomain sudo[148859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kysxqwuakocmscdbzlatcrsazasmdatv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013689.1992939-344-253347141823751/AnsiballZ_file.py
Dec 06 09:34:50 np0005548788.localdomain sudo[148859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:50 np0005548788.localdomain python3.9[148861]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:50 np0005548788.localdomain sudo[148859]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:51 np0005548788.localdomain sudo[148951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqytokzciwkcwcoybwpdexyhxfbsqayc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013690.8050117-383-46484521084309/AnsiballZ_command.py
Dec 06 09:34:51 np0005548788.localdomain sudo[148951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:51 np0005548788.localdomain python3.9[148953]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:34:51 np0005548788.localdomain sudo[148951]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:52 np0005548788.localdomain sudo[149044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kakotklzdkgwnqfrwzeilzssamtcywfz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013691.6148763-407-49397466065464/AnsiballZ_edpm_nftables_from_files.py
Dec 06 09:34:52 np0005548788.localdomain sudo[149044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:52 np0005548788.localdomain python3[149046]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:34:52 np0005548788.localdomain sudo[149044]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23584 DF PROTO=TCP SPT=51674 DPT=9102 SEQ=482486079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E180300000000001030307) 
Dec 06 09:34:52 np0005548788.localdomain sudo[149136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psojomavyosbshhdlxcanhnpidkenkfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013692.442362-431-6655194652846/AnsiballZ_stat.py
Dec 06 09:34:52 np0005548788.localdomain sudo[149136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:52 np0005548788.localdomain sshd[148580]: Received disconnect from 101.47.142.76 port 34866:11: Bye Bye [preauth]
Dec 06 09:34:52 np0005548788.localdomain sshd[148580]: Disconnected from authenticating user root 101.47.142.76 port 34866 [preauth]
Dec 06 09:34:52 np0005548788.localdomain python3.9[149138]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:52 np0005548788.localdomain sudo[149136]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:53 np0005548788.localdomain sudo[149211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbevvdhfnbphgrovwmboxywfvndumfii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013692.442362-431-6655194652846/AnsiballZ_copy.py
Dec 06 09:34:53 np0005548788.localdomain sudo[149211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:53 np0005548788.localdomain python3.9[149213]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013692.442362-431-6655194652846/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:53 np0005548788.localdomain sudo[149211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:54 np0005548788.localdomain sudo[149303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggszpdfanrvghshjngipyswxbfhmrxza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013693.8882523-476-39389716067674/AnsiballZ_stat.py
Dec 06 09:34:54 np0005548788.localdomain sudo[149303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:54 np0005548788.localdomain python3.9[149305]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:54 np0005548788.localdomain sudo[149303]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:54 np0005548788.localdomain sudo[149378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojyehmjbpwetmjfpoovokxbdndqrjbkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013693.8882523-476-39389716067674/AnsiballZ_copy.py
Dec 06 09:34:54 np0005548788.localdomain sudo[149378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:54 np0005548788.localdomain python3.9[149380]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013693.8882523-476-39389716067674/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:54 np0005548788.localdomain sudo[149378]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:55 np0005548788.localdomain sudo[149470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shcjmomnbpadxntohxheuzzpcwimjsrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013695.2041607-521-249105724285831/AnsiballZ_stat.py
Dec 06 09:34:55 np0005548788.localdomain sudo[149470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:55 np0005548788.localdomain python3.9[149472]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:55 np0005548788.localdomain sudo[149470]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:56 np0005548788.localdomain sudo[149545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajkrvligykmstkoeyzfxxpzzaiuoxvfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013695.2041607-521-249105724285831/AnsiballZ_copy.py
Dec 06 09:34:56 np0005548788.localdomain sudo[149545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:56 np0005548788.localdomain python3.9[149547]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013695.2041607-521-249105724285831/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:56 np0005548788.localdomain sudo[149545]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23585 DF PROTO=TCP SPT=51674 DPT=9102 SEQ=482486079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E18FF10000000001030307) 
Dec 06 09:34:56 np0005548788.localdomain sudo[149637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzwhnrmtyjiimtozdmatixevgqpqufzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013696.398774-566-203826436483119/AnsiballZ_stat.py
Dec 06 09:34:56 np0005548788.localdomain sudo[149637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:56 np0005548788.localdomain python3.9[149639]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:56 np0005548788.localdomain sudo[149637]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:57 np0005548788.localdomain sudo[149712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chkbxtafbstzivrmyrnvfblwvpmjtyai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013696.398774-566-203826436483119/AnsiballZ_copy.py
Dec 06 09:34:57 np0005548788.localdomain sudo[149712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:57 np0005548788.localdomain python3.9[149714]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013696.398774-566-203826436483119/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:57 np0005548788.localdomain sudo[149712]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:57 np0005548788.localdomain sudo[149804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qslfgaikkqwqpfklpvqmudaenkapeuls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013697.5783722-611-104940797920569/AnsiballZ_stat.py
Dec 06 09:34:57 np0005548788.localdomain sudo[149804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:58 np0005548788.localdomain python3.9[149806]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:58 np0005548788.localdomain sudo[149804]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:58 np0005548788.localdomain sudo[149879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbglmrimuurdbwtudkulexitfkztiwil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013697.5783722-611-104940797920569/AnsiballZ_copy.py
Dec 06 09:34:58 np0005548788.localdomain sudo[149879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:58 np0005548788.localdomain python3.9[149881]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013697.5783722-611-104940797920569/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:58 np0005548788.localdomain sudo[149879]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:59 np0005548788.localdomain sudo[149971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxjizmcydzzjimwxpdrckfirzexrtycm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013698.8418913-656-20952760970425/AnsiballZ_file.py
Dec 06 09:34:59 np0005548788.localdomain sudo[149971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2143 DF PROTO=TCP SPT=41718 DPT=9100 SEQ=2548334703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E199F10000000001030307) 
Dec 06 09:34:59 np0005548788.localdomain python3.9[149973]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:59 np0005548788.localdomain sudo[149971]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:00 np0005548788.localdomain sudo[150063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkkymqbgnmetocnzzfypohcxrcosevow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013700.3184297-680-241061820508538/AnsiballZ_command.py
Dec 06 09:35:00 np0005548788.localdomain sudo[150063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:00 np0005548788.localdomain python3.9[150065]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:00 np0005548788.localdomain sudo[150063]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:02 np0005548788.localdomain sudo[150158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiitajqmrmfawpdhlynyxhkgdlxiezlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013701.0163004-704-59962388297135/AnsiballZ_blockinfile.py
Dec 06 09:35:02 np0005548788.localdomain sudo[150158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:02 np0005548788.localdomain python3.9[150160]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:02 np0005548788.localdomain sudo[150158]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:03 np0005548788.localdomain sudo[150250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isefgaphcyfkbjskimyfqazjjluaupxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013703.1882272-731-183747868853263/AnsiballZ_command.py
Dec 06 09:35:03 np0005548788.localdomain sudo[150250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:03 np0005548788.localdomain python3.9[150252]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:03 np0005548788.localdomain sudo[150250]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:04 np0005548788.localdomain sudo[150343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rznkncvayplsvawkmqsetbareftoomsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013703.8314288-755-248084120013342/AnsiballZ_stat.py
Dec 06 09:35:04 np0005548788.localdomain sudo[150343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:04 np0005548788.localdomain python3.9[150345]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:04 np0005548788.localdomain sudo[150343]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23586 DF PROTO=TCP SPT=51674 DPT=9102 SEQ=482486079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E1AFF00000000001030307) 
Dec 06 09:35:04 np0005548788.localdomain sudo[150437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaccmbkhlmhkpifulqgnczrbcjiamqmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013704.5144928-779-108010130685506/AnsiballZ_command.py
Dec 06 09:35:04 np0005548788.localdomain sudo[150437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28742 DF PROTO=TCP SPT=60650 DPT=9101 SEQ=264509407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E1B02E0000000001030307) 
Dec 06 09:35:05 np0005548788.localdomain python3.9[150439]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:05 np0005548788.localdomain sudo[150437]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:05 np0005548788.localdomain sudo[150532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqrsvkrusdzeylvfepqlkjjnxrhwmlos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013705.2286034-803-74698211454593/AnsiballZ_file.py
Dec 06 09:35:05 np0005548788.localdomain sudo[150532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:05 np0005548788.localdomain python3.9[150534]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:05 np0005548788.localdomain sudo[150532]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:07 np0005548788.localdomain python3.9[150624]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:35:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28744 DF PROTO=TCP SPT=60650 DPT=9101 SEQ=264509407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E1BC310000000001030307) 
Dec 06 09:35:08 np0005548788.localdomain sudo[150715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-insxkqxtssensufvvcqovijuqkqzdlyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013707.9953573-923-103106469521205/AnsiballZ_command.py
Dec 06 09:35:08 np0005548788.localdomain sudo[150715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:08 np0005548788.localdomain python3.9[150717]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005548788.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:a2:0d:dc:1c" external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:08 np0005548788.localdomain ovs-vsctl[150718]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005548788.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:a2:0d:dc:1c external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 06 09:35:08 np0005548788.localdomain sudo[150715]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:08 np0005548788.localdomain sudo[150808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuxwzlegdgruzuonaqomvaeentrhvadf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013708.6974154-950-131275314141390/AnsiballZ_command.py
Dec 06 09:35:08 np0005548788.localdomain sudo[150808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:09 np0005548788.localdomain python3.9[150810]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:09 np0005548788.localdomain sudo[150808]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:09 np0005548788.localdomain python3.9[150903]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:10 np0005548788.localdomain sudo[150995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlkizsanaibjpcymbtmxygpnettrrqzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013710.1520963-1004-214965618083710/AnsiballZ_file.py
Dec 06 09:35:10 np0005548788.localdomain sudo[150995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:10 np0005548788.localdomain python3.9[150997]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:10 np0005548788.localdomain sudo[150995]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:11 np0005548788.localdomain sudo[151087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akbdvsborwpdbcrimjyqudyrlazpnify ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013710.8337243-1028-34229979504479/AnsiballZ_stat.py
Dec 06 09:35:11 np0005548788.localdomain sudo[151087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:11 np0005548788.localdomain python3.9[151089]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:11 np0005548788.localdomain sudo[151087]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:11 np0005548788.localdomain sudo[151135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pykyzdjtifxrnepljnqzouclxablpiiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013710.8337243-1028-34229979504479/AnsiballZ_file.py
Dec 06 09:35:11 np0005548788.localdomain sudo[151135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:11 np0005548788.localdomain sshd[151138]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:11 np0005548788.localdomain python3.9[151137]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:11 np0005548788.localdomain sudo[151135]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:11 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28745 DF PROTO=TCP SPT=60650 DPT=9101 SEQ=264509407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E1CBF00000000001030307) 
Dec 06 09:35:12 np0005548788.localdomain sudo[151229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebcetzywqyovrxnksgpdbpuukisdlpoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013711.8832698-1028-168186657131586/AnsiballZ_stat.py
Dec 06 09:35:12 np0005548788.localdomain sudo[151229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:12 np0005548788.localdomain python3.9[151231]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:12 np0005548788.localdomain sudo[151229]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:12 np0005548788.localdomain sudo[151277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpzmzzfnsvrfqbhvruraztxkeijajpuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013711.8832698-1028-168186657131586/AnsiballZ_file.py
Dec 06 09:35:12 np0005548788.localdomain sudo[151277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:12 np0005548788.localdomain python3.9[151279]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:12 np0005548788.localdomain sudo[151277]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:13 np0005548788.localdomain sudo[151369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syalitekqwokpdheufhpvrhcnkoglvsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013713.0823338-1097-253411434778952/AnsiballZ_file.py
Dec 06 09:35:13 np0005548788.localdomain sudo[151369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28200 DF PROTO=TCP SPT=51722 DPT=9100 SEQ=3148621393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E1D24F0000000001030307) 
Dec 06 09:35:13 np0005548788.localdomain python3.9[151371]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:13 np0005548788.localdomain sudo[151369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:14 np0005548788.localdomain sudo[151461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kshzxurvtlnobfxjzdlvezpeaxmxileq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013714.3260884-1121-50489268125646/AnsiballZ_stat.py
Dec 06 09:35:14 np0005548788.localdomain sudo[151461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:14 np0005548788.localdomain sshd[151138]: Received disconnect from 45.78.219.195 port 39948:11: Bye Bye [preauth]
Dec 06 09:35:14 np0005548788.localdomain sshd[151138]: Disconnected from authenticating user root 45.78.219.195 port 39948 [preauth]
Dec 06 09:35:14 np0005548788.localdomain python3.9[151463]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:14 np0005548788.localdomain sudo[151461]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:15 np0005548788.localdomain sudo[151509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxoacvlfaqcqdvckaandyspsgwrorbmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013714.3260884-1121-50489268125646/AnsiballZ_file.py
Dec 06 09:35:15 np0005548788.localdomain sudo[151509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:15 np0005548788.localdomain python3.9[151511]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:15 np0005548788.localdomain sudo[151509]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:15 np0005548788.localdomain sudo[151601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnxflvfxpzodigtnzgvcxouzbcvopyih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013715.4188476-1157-277668274392479/AnsiballZ_stat.py
Dec 06 09:35:15 np0005548788.localdomain sudo[151601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:15 np0005548788.localdomain python3.9[151603]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:15 np0005548788.localdomain sudo[151601]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28202 DF PROTO=TCP SPT=51722 DPT=9100 SEQ=3148621393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E1DE710000000001030307) 
Dec 06 09:35:16 np0005548788.localdomain sudo[151649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzbveyiqusosgzyqkuqapmqibkavalwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013715.4188476-1157-277668274392479/AnsiballZ_file.py
Dec 06 09:35:16 np0005548788.localdomain sudo[151649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:16 np0005548788.localdomain python3.9[151651]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:16 np0005548788.localdomain sudo[151649]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:17 np0005548788.localdomain sudo[151741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-creuevdghrbajuifgvtjmkrqckvejxog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013717.1408918-1193-145588456612117/AnsiballZ_systemd.py
Dec 06 09:35:17 np0005548788.localdomain sudo[151741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:17 np0005548788.localdomain python3.9[151743]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:17 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:35:17 np0005548788.localdomain systemd-sysv-generator[151775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:17 np0005548788.localdomain systemd-rc-local-generator[151772]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:17 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:18 np0005548788.localdomain sudo[151741]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:18 np0005548788.localdomain sudo[151871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvezivicajxalfbpnziskkieoxyvpxjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013718.4698193-1217-197830862515121/AnsiballZ_stat.py
Dec 06 09:35:18 np0005548788.localdomain sudo[151871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:18 np0005548788.localdomain python3.9[151873]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:19 np0005548788.localdomain sudo[151871]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:19 np0005548788.localdomain sudo[151919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiuvnfvmkkjdseehesmyyzexzzkdmsvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013718.4698193-1217-197830862515121/AnsiballZ_file.py
Dec 06 09:35:19 np0005548788.localdomain sudo[151919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:19 np0005548788.localdomain python3.9[151921]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:19 np0005548788.localdomain sudo[151919]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1257 DF PROTO=TCP SPT=59604 DPT=9102 SEQ=3836330229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E1E9540000000001030307) 
Dec 06 09:35:19 np0005548788.localdomain sudo[152011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weggarhnarxnevkfeqqsaaftlnfomhkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013719.58198-1253-114188756450528/AnsiballZ_stat.py
Dec 06 09:35:19 np0005548788.localdomain sudo[152011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:20 np0005548788.localdomain python3.9[152013]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:20 np0005548788.localdomain sudo[152011]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:20 np0005548788.localdomain sudo[152059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttlaaayvlmtxxvuleynsvylqmaselgtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013719.58198-1253-114188756450528/AnsiballZ_file.py
Dec 06 09:35:20 np0005548788.localdomain sudo[152059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:20 np0005548788.localdomain python3.9[152061]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:20 np0005548788.localdomain sudo[152059]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:20 np0005548788.localdomain sudo[152151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loygxwcksubouelpcsrzfkcwnymugwag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013720.6797469-1289-246674710106956/AnsiballZ_systemd.py
Dec 06 09:35:20 np0005548788.localdomain sudo[152151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:21 np0005548788.localdomain python3.9[152153]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:21 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:35:21 np0005548788.localdomain systemd-sysv-generator[152182]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:21 np0005548788.localdomain systemd-rc-local-generator[152177]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1259 DF PROTO=TCP SPT=59604 DPT=9102 SEQ=3836330229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E1F5700000000001030307) 
Dec 06 09:35:22 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:35:22 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:35:22 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:35:22 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:35:22 np0005548788.localdomain sudo[152151]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:23 np0005548788.localdomain sudo[152286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stzqdlqbzzuooygosmrrqouvmilshcjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.066013-1319-280322329662962/AnsiballZ_file.py
Dec 06 09:35:23 np0005548788.localdomain sudo[152286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:23 np0005548788.localdomain python3.9[152288]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:23 np0005548788.localdomain sudo[152286]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:24 np0005548788.localdomain sudo[152378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydeeywotendpwcbftublokvqhtgewpzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.7609887-1343-110414278562240/AnsiballZ_stat.py
Dec 06 09:35:24 np0005548788.localdomain sudo[152378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:24 np0005548788.localdomain python3.9[152380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:24 np0005548788.localdomain sudo[152378]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:24 np0005548788.localdomain sudo[152451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivxzaadnefguvnaatwqoralawuqyszfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.7609887-1343-110414278562240/AnsiballZ_copy.py
Dec 06 09:35:24 np0005548788.localdomain sudo[152451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:24 np0005548788.localdomain python3.9[152453]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013723.7609887-1343-110414278562240/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:24 np0005548788.localdomain sudo[152451]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:26 np0005548788.localdomain sudo[152543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deqkzfwirnctahoeggbctwedbfqepygh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013725.7568474-1394-130087631844798/AnsiballZ_file.py
Dec 06 09:35:26 np0005548788.localdomain sudo[152543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:26 np0005548788.localdomain python3.9[152545]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:26 np0005548788.localdomain sudo[152543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1260 DF PROTO=TCP SPT=59604 DPT=9102 SEQ=3836330229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E205300000000001030307) 
Dec 06 09:35:26 np0005548788.localdomain sudo[152635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpazumnirnfiketnfabyapvfbgqvdbsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013726.4715486-1418-268511455826927/AnsiballZ_stat.py
Dec 06 09:35:26 np0005548788.localdomain sudo[152635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:26 np0005548788.localdomain python3.9[152637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:26 np0005548788.localdomain sudo[152635]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:27 np0005548788.localdomain sudo[152710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-resjqdemeappfowllnblyftngwibjosk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013726.4715486-1418-268511455826927/AnsiballZ_copy.py
Dec 06 09:35:27 np0005548788.localdomain sudo[152710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:28 np0005548788.localdomain python3.9[152712]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013726.4715486-1418-268511455826927/.source.json _original_basename=.je9fr35b follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:28 np0005548788.localdomain sudo[152710]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:28 np0005548788.localdomain sudo[152802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsyizbnmclypodqaetydwqcseujflkjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013728.3419716-1463-226956984852117/AnsiballZ_file.py
Dec 06 09:35:28 np0005548788.localdomain sudo[152802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28204 DF PROTO=TCP SPT=51722 DPT=9100 SEQ=3148621393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E20DF00000000001030307) 
Dec 06 09:35:28 np0005548788.localdomain python3.9[152804]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:28 np0005548788.localdomain sudo[152802]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:29 np0005548788.localdomain sudo[152894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybrflqnxkllxwphvtpgdidhrlporsqgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013729.044404-1487-103922675848115/AnsiballZ_stat.py
Dec 06 09:35:29 np0005548788.localdomain sudo[152894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:29 np0005548788.localdomain sudo[152894]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:29 np0005548788.localdomain sudo[152967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brorgjzxzehrcpydulovhffbqhapxajs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013729.044404-1487-103922675848115/AnsiballZ_copy.py
Dec 06 09:35:29 np0005548788.localdomain sudo[152967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:30 np0005548788.localdomain sudo[152967]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:30 np0005548788.localdomain sudo[153059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vifinrskjmvwznefazpgshtzqiahchno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013730.393311-1538-190623381310584/AnsiballZ_container_config_data.py
Dec 06 09:35:30 np0005548788.localdomain sudo[153059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:31 np0005548788.localdomain python3.9[153061]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 06 09:35:31 np0005548788.localdomain sudo[153059]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:31 np0005548788.localdomain sudo[153151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dobwcekajvnptgwxxkeycyrejrhsamdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013731.270659-1565-279202807151561/AnsiballZ_container_config_hash.py
Dec 06 09:35:31 np0005548788.localdomain sudo[153151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:31 np0005548788.localdomain python3.9[153153]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:35:31 np0005548788.localdomain sudo[153151]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:32 np0005548788.localdomain sudo[153243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjbnbhuqbxapzafuqpeupsxaqpbheaue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013732.1451404-1592-140779544271949/AnsiballZ_podman_container_info.py
Dec 06 09:35:32 np0005548788.localdomain sudo[153243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:32 np0005548788.localdomain python3.9[153245]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:35:33 np0005548788.localdomain sudo[153243]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2823 DF PROTO=TCP SPT=40906 DPT=9101 SEQ=261819077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E225600000000001030307) 
Dec 06 09:35:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1261 DF PROTO=TCP SPT=59604 DPT=9102 SEQ=3836330229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E225F00000000001030307) 
Dec 06 09:35:37 np0005548788.localdomain sudo[153362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usqkhduitxayhjkvjyistsapsouauvmo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013736.122511-1631-12345471260311/AnsiballZ_edpm_container_manage.py
Dec 06 09:35:37 np0005548788.localdomain sudo[153362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:37 np0005548788.localdomain python3[153364]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:35:37 np0005548788.localdomain python3[153364]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c",
                                                                    "Digest": "sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:38:47.246477714Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 345722821,
                                                                    "VirtualSize": 345722821,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:ba9362d2aeb297e34b0679b2fc8168350c70a5b0ec414daf293bf2bc013e9088",
                                                                              "sha256:aae3b8a85314314b9db80a043fdf3f3b1d0b69927faca0303c73969a23dddd0f"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:22.759131427Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:13:25.258260855Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:13:28.025145079Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:13.535675197Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:47.244104142Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:48.759416475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 06 09:35:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2825 DF PROTO=TCP SPT=40906 DPT=9101 SEQ=261819077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E231710000000001030307) 
Dec 06 09:35:38 np0005548788.localdomain podman[153413]: 2025-12-06 09:35:38.004111949 +0000 UTC m=+0.102694506 container remove 6e134ca188993758db86d978b13436b9df0277996c7b97545d2d98941b9e24ef (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true)
Dec 06 09:35:38 np0005548788.localdomain python3[153364]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Dec 06 09:35:38 np0005548788.localdomain podman[153426]: 
Dec 06 09:35:38 np0005548788.localdomain podman[153426]: 2025-12-06 09:35:38.113524913 +0000 UTC m=+0.087047249 container create 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:35:38 np0005548788.localdomain podman[153426]: 2025-12-06 09:35:38.073348903 +0000 UTC m=+0.046871309 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 06 09:35:38 np0005548788.localdomain python3[153364]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 06 09:35:38 np0005548788.localdomain sudo[153362]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:38 np0005548788.localdomain sudo[153554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmidpktdtqloaeikakozrhrgpkswaage ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013738.4602542-1655-112400319457447/AnsiballZ_stat.py
Dec 06 09:35:38 np0005548788.localdomain sudo[153554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:38 np0005548788.localdomain python3.9[153556]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:38 np0005548788.localdomain sudo[153554]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:39 np0005548788.localdomain sudo[153648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpzqysaudfhazilmxjrhbysgemwlsags ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013739.2092478-1682-170996382064519/AnsiballZ_file.py
Dec 06 09:35:39 np0005548788.localdomain sudo[153648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:39 np0005548788.localdomain python3.9[153650]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:39 np0005548788.localdomain sudo[153648]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:39 np0005548788.localdomain sudo[153694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sppqiezwhkmzlzfkatyucpjdgfcqkseb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013739.2092478-1682-170996382064519/AnsiballZ_stat.py
Dec 06 09:35:39 np0005548788.localdomain sudo[153694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:40 np0005548788.localdomain python3.9[153696]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:40 np0005548788.localdomain sudo[153694]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32683 DF PROTO=TCP SPT=55812 DPT=9105 SEQ=1225522228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E23BF00000000001030307) 
Dec 06 09:35:40 np0005548788.localdomain sudo[153785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uahnegjztqmsunhuuoneqodtpnnhxnxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013740.2477188-1682-176560592487035/AnsiballZ_copy.py
Dec 06 09:35:40 np0005548788.localdomain sudo[153785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:40 np0005548788.localdomain python3.9[153787]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013740.2477188-1682-176560592487035/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:40 np0005548788.localdomain sudo[153785]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:41 np0005548788.localdomain sudo[153831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxktsyddcciqnmcmgkuaraptqwmxkpnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013740.2477188-1682-176560592487035/AnsiballZ_systemd.py
Dec 06 09:35:41 np0005548788.localdomain sudo[153831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:41 np0005548788.localdomain python3.9[153833]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:35:41 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:35:41 np0005548788.localdomain systemd-sysv-generator[153864]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:41 np0005548788.localdomain systemd-rc-local-generator[153861]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:42 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:42 np0005548788.localdomain sudo[153831]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:42 np0005548788.localdomain sudo[153913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqsvxzoppqdsoqxywyafrmpvurleiuet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013740.2477188-1682-176560592487035/AnsiballZ_systemd.py
Dec 06 09:35:42 np0005548788.localdomain sudo[153913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:42 np0005548788.localdomain python3.9[153915]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:42 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:35:42 np0005548788.localdomain systemd-rc-local-generator[153939]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:42 np0005548788.localdomain systemd-sysv-generator[153942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Starting ovn_controller container...
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:35:43 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4e3997e06d3baa6964322ac7cda04a2ee78374850fbc0920c8950d528820722/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:35:43 np0005548788.localdomain podman[153956]: 2025-12-06 09:35:43.373880312 +0000 UTC m=+0.165072877 container init 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:35:43 np0005548788.localdomain ovn_controller[153970]: + sudo -E kolla_set_configs
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:35:43 np0005548788.localdomain podman[153956]: 2025-12-06 09:35:43.414274418 +0000 UTC m=+0.205466993 container start 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Dec 06 09:35:43 np0005548788.localdomain edpm-start-podman-container[153956]: ovn_controller
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:35:43 np0005548788.localdomain podman[153977]: 2025-12-06 09:35:43.517630994 +0000 UTC m=+0.096113931 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller)
Dec 06 09:35:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26074 DF PROTO=TCP SPT=41250 DPT=9100 SEQ=2694684717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E247800000000001030307) 
Dec 06 09:35:43 np0005548788.localdomain podman[153977]: 2025-12-06 09:35:43.607869261 +0000 UTC m=+0.186352238 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:35:43 np0005548788.localdomain podman[153977]: unhealthy
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Queued start job for default target Main User Target.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Created slice User Application Slice.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Reached target Paths.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Reached target Timers.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Starting D-Bus User Message Bus Socket...
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Starting Create User's Volatile Files and Directories...
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Failed with result 'exit-code'.
Dec 06 09:35:43 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Dec 06 09:35:43 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:35:43 np0005548788.localdomain edpm-start-podman-container[153955]: Creating additional drop-in dependency for "ovn_controller" (948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7)
Dec 06 09:35:43 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:35:43 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Finished Create User's Volatile Files and Directories.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Listening on D-Bus User Message Bus Socket.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Reached target Sockets.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Reached target Basic System.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Reached target Main User Target.
Dec 06 09:35:43 np0005548788.localdomain systemd[153998]: Startup finished in 142ms.
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Started Session c11 of User root.
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:35:43 np0005548788.localdomain systemd-rc-local-generator[154058]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:43 np0005548788.localdomain systemd-sysv-generator[154064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:43 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:44 np0005548788.localdomain systemd[1]: Started ovn_controller container.
Dec 06 09:35:44 np0005548788.localdomain sudo[153913]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: INFO:__main__:Validating config file
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: INFO:__main__:Writing out command to execute
Dec 06 09:35:44 np0005548788.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: ++ cat /run_command
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: + ARGS=
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: + sudo kolla_copy_cacerts
Dec 06 09:35:44 np0005548788.localdomain systemd[1]: Started Session c12 of User root.
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: + [[ ! -n '' ]]
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: + . kolla_extend_start
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: + umask 0022
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Dec 06 09:35:44 np0005548788.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00013|main|INFO|OVS feature set changed, force recompute.
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00021|main|INFO|OVS feature set changed, force recompute.
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00022|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:35:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:35:44Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:35:44 np0005548788.localdomain sudo[154168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usqcnneecyatyalygefiowetzgrqrohy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013744.2397335-1766-106577752317544/AnsiballZ_command.py
Dec 06 09:35:44 np0005548788.localdomain sudo[154168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:44 np0005548788.localdomain python3.9[154170]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:44 np0005548788.localdomain ovs-vsctl[154171]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 06 09:35:44 np0005548788.localdomain sudo[154168]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:45 np0005548788.localdomain sudo[154261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okxgzyuvmkjxaqimrkwqjdubgvyvwfae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013744.9068887-1790-155402470488683/AnsiballZ_command.py
Dec 06 09:35:45 np0005548788.localdomain sudo[154261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:45 np0005548788.localdomain python3.9[154263]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:45 np0005548788.localdomain ovs-vsctl[154265]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 06 09:35:45 np0005548788.localdomain sudo[154261]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:45 np0005548788.localdomain sudo[154281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:35:45 np0005548788.localdomain sudo[154281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:35:45 np0005548788.localdomain sudo[154281]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:45 np0005548788.localdomain sudo[154296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:35:45 np0005548788.localdomain sudo[154296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:35:46 np0005548788.localdomain sudo[154404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwtyqblklswkzcqzjjxvyznmvxpqkodo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013746.0513945-1832-117530929039294/AnsiballZ_command.py
Dec 06 09:35:46 np0005548788.localdomain sudo[154404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:46 np0005548788.localdomain sudo[154296]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:46 np0005548788.localdomain python3.9[154408]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:46 np0005548788.localdomain ovs-vsctl[154421]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 06 09:35:46 np0005548788.localdomain sudo[154404]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26076 DF PROTO=TCP SPT=41250 DPT=9100 SEQ=2694684717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E253700000000001030307) 
Dec 06 09:35:46 np0005548788.localdomain sudo[154436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:35:47 np0005548788.localdomain sudo[154436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:35:47 np0005548788.localdomain sudo[154436]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:47 np0005548788.localdomain sshd[147304]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:35:47 np0005548788.localdomain systemd[1]: session-50.scope: Deactivated successfully.
Dec 06 09:35:47 np0005548788.localdomain systemd[1]: session-50.scope: Consumed 42.651s CPU time.
Dec 06 09:35:47 np0005548788.localdomain systemd-logind[765]: Session 50 logged out. Waiting for processes to exit.
Dec 06 09:35:47 np0005548788.localdomain systemd-logind[765]: Removed session 50.
Dec 06 09:35:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37956 DF PROTO=TCP SPT=44854 DPT=9102 SEQ=1462555426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E25E840000000001030307) 
Dec 06 09:35:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37958 DF PROTO=TCP SPT=44854 DPT=9102 SEQ=1462555426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E26A700000000001030307) 
Dec 06 09:35:52 np0005548788.localdomain sshd[154451]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:52 np0005548788.localdomain sshd[154451]: Accepted publickey for zuul from 192.168.122.30 port 33334 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:35:52 np0005548788.localdomain systemd-logind[765]: New session 52 of user zuul.
Dec 06 09:35:52 np0005548788.localdomain systemd[1]: Started Session 52 of User zuul.
Dec 06 09:35:52 np0005548788.localdomain sshd[154451]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:35:53 np0005548788.localdomain python3.9[154544]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:35:54 np0005548788.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Activating special unit Exit the Session...
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Stopped target Main User Target.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Stopped target Basic System.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Stopped target Paths.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Stopped target Sockets.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Stopped target Timers.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Closed D-Bus User Message Bus Socket.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Stopped Create User's Volatile Files and Directories.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Removed slice User Application Slice.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Reached target Shutdown.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Finished Exit the Session.
Dec 06 09:35:54 np0005548788.localdomain systemd[153998]: Reached target Exit the Session.
Dec 06 09:35:54 np0005548788.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 09:35:54 np0005548788.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 09:35:54 np0005548788.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 09:35:54 np0005548788.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 09:35:54 np0005548788.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 09:35:54 np0005548788.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 09:35:54 np0005548788.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 09:35:54 np0005548788.localdomain sudo[154642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icgqjkfhriubzqcfoobvmbgqdplnfuvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013754.4901226-62-224055856382043/AnsiballZ_file.py
Dec 06 09:35:54 np0005548788.localdomain sudo[154642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:55 np0005548788.localdomain python3.9[154644]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:55 np0005548788.localdomain sudo[154642]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:55 np0005548788.localdomain sudo[154734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usbddtdvifetseebduaittgusamabfbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013755.2520823-62-271280517131592/AnsiballZ_file.py
Dec 06 09:35:55 np0005548788.localdomain sudo[154734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:55 np0005548788.localdomain python3.9[154736]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:55 np0005548788.localdomain sudo[154734]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:56 np0005548788.localdomain sudo[154826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukjwksjfirrsdkogecllmtnvonrkavev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013755.9082196-62-30630875830803/AnsiballZ_file.py
Dec 06 09:35:56 np0005548788.localdomain sudo[154826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:56 np0005548788.localdomain python3.9[154828]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:56 np0005548788.localdomain sudo[154826]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37959 DF PROTO=TCP SPT=44854 DPT=9102 SEQ=1462555426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E27A300000000001030307) 
Dec 06 09:35:56 np0005548788.localdomain sudo[154918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmxiekjsimpxjjdzzvnjqgsqzrvjtfoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013756.5120823-62-31881767772504/AnsiballZ_file.py
Dec 06 09:35:56 np0005548788.localdomain sudo[154918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:57 np0005548788.localdomain python3.9[154920]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:57 np0005548788.localdomain sudo[154918]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:57 np0005548788.localdomain sudo[155010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loxgebuzxswgvwwfsmehwoaydujdrgir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013757.1418998-62-233104590716372/AnsiballZ_file.py
Dec 06 09:35:57 np0005548788.localdomain sudo[155010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:57 np0005548788.localdomain python3.9[155012]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:57 np0005548788.localdomain sudo[155010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:58 np0005548788.localdomain python3.9[155102]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:35:58 np0005548788.localdomain sudo[155192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fimrhfkyvmoimgcwdhftvgupmschcszv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013758.4814413-194-6771163041146/AnsiballZ_seboolean.py
Dec 06 09:35:58 np0005548788.localdomain sudo[155192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26078 DF PROTO=TCP SPT=41250 DPT=9100 SEQ=2694684717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E283F10000000001030307) 
Dec 06 09:35:59 np0005548788.localdomain python3.9[155194]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 06 09:35:59 np0005548788.localdomain sudo[155192]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:00 np0005548788.localdomain python3.9[155284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:00 np0005548788.localdomain python3.9[155357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013759.432657-218-106827662770555/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:01 np0005548788.localdomain python3.9[155447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:02 np0005548788.localdomain python3.9[155521]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013760.8913229-263-60361452969068/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:03 np0005548788.localdomain sudo[155611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqnhsdeamksgzrbjnwdjdzgkktutdtjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013762.871309-314-869227046855/AnsiballZ_setup.py
Dec 06 09:36:03 np0005548788.localdomain sudo[155611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:03 np0005548788.localdomain python3.9[155613]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:36:03 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:36:03Z|00023|memory|INFO|15040 kB peak resident set size after 19.4 seconds
Dec 06 09:36:03 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T09:36:03Z|00024|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:813 ofctrl_desired_flow_usage-KB:9 ofctrl_installed_flow_usage-KB:7 ofctrl_sb_flow_ref_usage-KB:3
Dec 06 09:36:03 np0005548788.localdomain sudo[155611]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:04 np0005548788.localdomain sudo[155665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nogiiwxmydlryscwxuwnyotozmteikbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013762.871309-314-869227046855/AnsiballZ_dnf.py
Dec 06 09:36:04 np0005548788.localdomain sudo[155665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:04 np0005548788.localdomain python3.9[155667]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:36:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37960 DF PROTO=TCP SPT=44854 DPT=9102 SEQ=1462555426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E299F00000000001030307) 
Dec 06 09:36:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32149 DF PROTO=TCP SPT=46892 DPT=9101 SEQ=666606063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E29A8E0000000001030307) 
Dec 06 09:36:07 np0005548788.localdomain sudo[155665]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32151 DF PROTO=TCP SPT=46892 DPT=9101 SEQ=666606063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E2A6B00000000001030307) 
Dec 06 09:36:08 np0005548788.localdomain sudo[155759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szwswyfditkjuqhrhhhbqudaovjmyntg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013767.7628975-350-56311494192027/AnsiballZ_systemd.py
Dec 06 09:36:08 np0005548788.localdomain sudo[155759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:08 np0005548788.localdomain python3.9[155761]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:36:08 np0005548788.localdomain sudo[155759]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:09 np0005548788.localdomain python3.9[155854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:09 np0005548788.localdomain python3.9[155925]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013768.8226762-374-23498830827308/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:10 np0005548788.localdomain python3.9[156015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27399 DF PROTO=TCP SPT=52540 DPT=9105 SEQ=2711600204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E2B1F10000000001030307) 
Dec 06 09:36:10 np0005548788.localdomain python3.9[156086]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013769.850952-374-6286050368786/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:12 np0005548788.localdomain python3.9[156176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:12 np0005548788.localdomain python3.9[156247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013771.594838-506-205511370524003/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36428 DF PROTO=TCP SPT=34660 DPT=9100 SEQ=1244586180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E2BCB00000000001030307) 
Dec 06 09:36:13 np0005548788.localdomain python3.9[156337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:14 np0005548788.localdomain python3.9[156408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013773.2413538-506-190043616587930/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:36:14 np0005548788.localdomain podman[156423]: 2025-12-06 09:36:14.261412804 +0000 UTC m=+0.078229295 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:36:14 np0005548788.localdomain podman[156423]: 2025-12-06 09:36:14.33491112 +0000 UTC m=+0.151727671 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:36:14 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:36:14 np0005548788.localdomain python3.9[156523]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:15 np0005548788.localdomain sudo[156615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzkqcamdlzyiubxjmtygoojgvdgcksmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013775.7355428-620-9257417500971/AnsiballZ_file.py
Dec 06 09:36:15 np0005548788.localdomain sudo[156615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:16 np0005548788.localdomain python3.9[156617]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:16 np0005548788.localdomain sudo[156615]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:16 np0005548788.localdomain sudo[156707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyfbrvpofhsktrbsjzfmgpgdfprbfydc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013776.334445-644-39837140190289/AnsiballZ_stat.py
Dec 06 09:36:16 np0005548788.localdomain sudo[156707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36430 DF PROTO=TCP SPT=34660 DPT=9100 SEQ=1244586180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E2C8B00000000001030307) 
Dec 06 09:36:16 np0005548788.localdomain python3.9[156709]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:16 np0005548788.localdomain sudo[156707]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:17 np0005548788.localdomain sudo[156755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvstyqmcdyqiqagbgslyitcvbiswumpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013776.334445-644-39837140190289/AnsiballZ_file.py
Dec 06 09:36:17 np0005548788.localdomain sudo[156755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:17 np0005548788.localdomain python3.9[156757]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:17 np0005548788.localdomain sudo[156755]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:17 np0005548788.localdomain sudo[156847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etjragzrnaxttgpqsqqcbzxqfqafacct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013777.411753-644-141711494136337/AnsiballZ_stat.py
Dec 06 09:36:17 np0005548788.localdomain sudo[156847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:17 np0005548788.localdomain python3.9[156849]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:17 np0005548788.localdomain sudo[156847]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:18 np0005548788.localdomain sudo[156895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brofacwcyfnwfnifsymzrfhceqmeakco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013777.411753-644-141711494136337/AnsiballZ_file.py
Dec 06 09:36:18 np0005548788.localdomain sudo[156895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:18 np0005548788.localdomain python3.9[156897]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:18 np0005548788.localdomain sudo[156895]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:18 np0005548788.localdomain sudo[156987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tstdcamzhaknqnefnrbjeahnomxluvbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013778.5023768-713-57057694979319/AnsiballZ_file.py
Dec 06 09:36:18 np0005548788.localdomain sudo[156987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:18 np0005548788.localdomain python3.9[156989]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:18 np0005548788.localdomain sudo[156987]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:19 np0005548788.localdomain sudo[157079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkcfyyjxhphfazgeqrcksnhtvymsfkty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013779.1006005-737-183949124860892/AnsiballZ_stat.py
Dec 06 09:36:19 np0005548788.localdomain sudo[157079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38811 DF PROTO=TCP SPT=44622 DPT=9102 SEQ=3537628924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E2D3B40000000001030307) 
Dec 06 09:36:19 np0005548788.localdomain python3.9[157081]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:19 np0005548788.localdomain sudo[157079]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:19 np0005548788.localdomain sudo[157127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsfgodiiokqbhaapnsnikvxzkvmrctdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013779.1006005-737-183949124860892/AnsiballZ_file.py
Dec 06 09:36:19 np0005548788.localdomain sudo[157127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:20 np0005548788.localdomain python3.9[157129]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:20 np0005548788.localdomain sudo[157127]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:20 np0005548788.localdomain sudo[157219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kypahhpzymtyvzhypyenfwzwpxevxhnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013780.2502508-773-195486139181677/AnsiballZ_stat.py
Dec 06 09:36:20 np0005548788.localdomain sudo[157219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:20 np0005548788.localdomain python3.9[157221]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:20 np0005548788.localdomain sudo[157219]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:20 np0005548788.localdomain sudo[157267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvnwvszobtdddzgcfxxodekddvlzjeng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013780.2502508-773-195486139181677/AnsiballZ_file.py
Dec 06 09:36:20 np0005548788.localdomain sudo[157267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:21 np0005548788.localdomain python3.9[157269]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:21 np0005548788.localdomain sudo[157267]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:21 np0005548788.localdomain sudo[157359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcypudlmjykalniljgfksmvxhtuzyyme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013781.365069-809-239538031032825/AnsiballZ_systemd.py
Dec 06 09:36:21 np0005548788.localdomain sudo[157359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:21 np0005548788.localdomain python3.9[157361]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:36:21 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:36:22 np0005548788.localdomain systemd-rc-local-generator[157384]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:22 np0005548788.localdomain systemd-sysv-generator[157389]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:22 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:22 np0005548788.localdomain sudo[157359]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38813 DF PROTO=TCP SPT=44622 DPT=9102 SEQ=3537628924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E2DFB10000000001030307) 
Dec 06 09:36:22 np0005548788.localdomain sudo[157489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmflyxyjvdviwtprjbrvkrbudbfmckwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013782.5195448-833-279894831601188/AnsiballZ_stat.py
Dec 06 09:36:22 np0005548788.localdomain sudo[157489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:22 np0005548788.localdomain python3.9[157491]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:23 np0005548788.localdomain sudo[157489]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:23 np0005548788.localdomain sudo[157537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujeeuvyfvqzufvznromobexnkunobivr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013782.5195448-833-279894831601188/AnsiballZ_file.py
Dec 06 09:36:23 np0005548788.localdomain sudo[157537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:23 np0005548788.localdomain python3.9[157539]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:23 np0005548788.localdomain sudo[157537]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:23 np0005548788.localdomain sudo[157629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-foknltfgbmneelbkuzgupvcofpnglcsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013783.686914-869-279798656000113/AnsiballZ_stat.py
Dec 06 09:36:23 np0005548788.localdomain sudo[157629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:24 np0005548788.localdomain python3.9[157631]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:24 np0005548788.localdomain sudo[157629]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:24 np0005548788.localdomain sudo[157677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsyvwhbkituvxaxrsgnxqrgaqpmqwalg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013783.686914-869-279798656000113/AnsiballZ_file.py
Dec 06 09:36:24 np0005548788.localdomain sudo[157677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:24 np0005548788.localdomain python3.9[157679]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:24 np0005548788.localdomain sudo[157677]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:25 np0005548788.localdomain sudo[157769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxohuvdvznjnnwehnzdkwovmunneilah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013784.7563365-905-62555987705947/AnsiballZ_systemd.py
Dec 06 09:36:25 np0005548788.localdomain sudo[157769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:25 np0005548788.localdomain python3.9[157771]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:36:25 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:36:25 np0005548788.localdomain systemd-rc-local-generator[157796]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:25 np0005548788.localdomain systemd-sysv-generator[157801]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:25 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:36:25 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:36:25 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:36:25 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:36:25 np0005548788.localdomain sudo[157769]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38814 DF PROTO=TCP SPT=44622 DPT=9102 SEQ=3537628924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E2EF700000000001030307) 
Dec 06 09:36:26 np0005548788.localdomain sudo[157903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlbgvyzkobvoivyklugxcfngtlbdwloa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013786.6790626-935-235885108017009/AnsiballZ_file.py
Dec 06 09:36:26 np0005548788.localdomain sudo[157903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:27 np0005548788.localdomain python3.9[157905]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:27 np0005548788.localdomain sudo[157903]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:27 np0005548788.localdomain sudo[157995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyklifchrmoiwxswqucuvmrsfjdfuzss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013787.3520281-959-56994233297158/AnsiballZ_stat.py
Dec 06 09:36:27 np0005548788.localdomain sudo[157995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:27 np0005548788.localdomain python3.9[157997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:27 np0005548788.localdomain sudo[157995]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36432 DF PROTO=TCP SPT=34660 DPT=9100 SEQ=1244586180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E2F7F10000000001030307) 
Dec 06 09:36:28 np0005548788.localdomain sudo[158068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiqxjhratotztrdwxzdgndmkhtjjbpdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013787.3520281-959-56994233297158/AnsiballZ_copy.py
Dec 06 09:36:28 np0005548788.localdomain sudo[158068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:29 np0005548788.localdomain python3.9[158070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013787.3520281-959-56994233297158/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:29 np0005548788.localdomain sudo[158068]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:29 np0005548788.localdomain sudo[158160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nclpwfkgjyezuussipzdvsemhjcfhtfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013789.5471117-1010-273235223263055/AnsiballZ_file.py
Dec 06 09:36:29 np0005548788.localdomain sudo[158160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:30 np0005548788.localdomain python3.9[158162]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:30 np0005548788.localdomain sudo[158160]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:30 np0005548788.localdomain sudo[158252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwjlvegwepnojfyywkcegfookipjbsxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013790.2443426-1034-119045408839326/AnsiballZ_stat.py
Dec 06 09:36:30 np0005548788.localdomain sudo[158252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:30 np0005548788.localdomain python3.9[158254]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:30 np0005548788.localdomain sudo[158252]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:31 np0005548788.localdomain sudo[158327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asspzuopeqhbuywjhuswphaeqejldofy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013790.2443426-1034-119045408839326/AnsiballZ_copy.py
Dec 06 09:36:31 np0005548788.localdomain sudo[158327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:31 np0005548788.localdomain python3.9[158329]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013790.2443426-1034-119045408839326/.source.json _original_basename=.o2kb8m97 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:31 np0005548788.localdomain sudo[158327]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:31 np0005548788.localdomain sudo[158419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evucygnvilbkbkryxemuejjhduwbudhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013791.4706895-1079-116726812122891/AnsiballZ_file.py
Dec 06 09:36:31 np0005548788.localdomain sudo[158419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:31 np0005548788.localdomain python3.9[158421]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:31 np0005548788.localdomain sudo[158419]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:32 np0005548788.localdomain sudo[158511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obeuxlgqfyjrxjpkxtlznvtqxjnyibth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013792.1814628-1103-163459335202422/AnsiballZ_stat.py
Dec 06 09:36:32 np0005548788.localdomain sudo[158511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:32 np0005548788.localdomain sudo[158511]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:32 np0005548788.localdomain sudo[158584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fclndetjagxsvkujyvijecfhjmzhdcfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013792.1814628-1103-163459335202422/AnsiballZ_copy.py
Dec 06 09:36:32 np0005548788.localdomain sudo[158584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:33 np0005548788.localdomain sudo[158584]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:34 np0005548788.localdomain sudo[158676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvjkoxkpfcmwmnmjzvkufkjzrlasgbzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013793.6181521-1154-276784056898597/AnsiballZ_container_config_data.py
Dec 06 09:36:34 np0005548788.localdomain sudo[158676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:34 np0005548788.localdomain python3.9[158678]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 06 09:36:34 np0005548788.localdomain sudo[158676]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22498 DF PROTO=TCP SPT=49882 DPT=9101 SEQ=2012729025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E30FBE0000000001030307) 
Dec 06 09:36:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38815 DF PROTO=TCP SPT=44622 DPT=9102 SEQ=3537628924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E30FF00000000001030307) 
Dec 06 09:36:34 np0005548788.localdomain sudo[158768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqzshwwkzbexwgtthedkuwbqxpvezrya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013794.470129-1181-153583651880554/AnsiballZ_container_config_hash.py
Dec 06 09:36:34 np0005548788.localdomain sudo[158768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:35 np0005548788.localdomain python3.9[158770]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:36:35 np0005548788.localdomain sudo[158768]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:35 np0005548788.localdomain sudo[158860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrhsvtqoiqzvczmozsjhuztgyjyovnmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013795.3909206-1208-214278305632512/AnsiballZ_podman_container_info.py
Dec 06 09:36:35 np0005548788.localdomain sudo[158860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:36 np0005548788.localdomain python3.9[158862]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:36:36 np0005548788.localdomain sudo[158860]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22500 DF PROTO=TCP SPT=49882 DPT=9101 SEQ=2012729025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E31BB10000000001030307) 
Dec 06 09:36:39 np0005548788.localdomain sudo[158979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyxskgwexyftcxbjbetzcocggzhyujgu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013799.4018185-1247-264432225688029/AnsiballZ_edpm_container_manage.py
Dec 06 09:36:39 np0005548788.localdomain sudo[158979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:40 np0005548788.localdomain python3[158981]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:36:40 np0005548788.localdomain python3[158981]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9",
                                                                    "Digest": "sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:29:20.327314945Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 784141054,
                                                                    "VirtualSize": 784141054,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53/diff:/var/lib/containers/storage/overlay/2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:75abaaa40a93c0e2bba524b6f8d4eb5f1c4c9a33db70c892c7582ec5b0827e5e",
                                                                              "sha256:01f43f620d1ea2a9e584abe0cc14c336bedcf55765127c000d743f536dd36f25",
                                                                              "sha256:0bf5bd378602f28be423f5e84abddff3b103396fae3c167031b6e3fcfcf6f120"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:50.18897737Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:50.762138914Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:13.720608935Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:27.636630318Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:40.546186661Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:52.875291445Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:27:22.608862134Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:35.764559413Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:40.983506098Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:44.803537768Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:20.324920691Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:20.324983383Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:24.215761584Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:36:40 np0005548788.localdomain sshd[159038]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:36:40 np0005548788.localdomain podman[159032]: 2025-12-06 09:36:40.485730826 +0000 UTC m=+0.101084094 container remove 215d31f199aaa515d9eaf75ffa1725433495ff447f4495825bf3bd0a105c6b86 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dc659970751309b021f4b1201ffad0ee'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 06 09:36:40 np0005548788.localdomain python3[158981]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Dec 06 09:36:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34670 DF PROTO=TCP SPT=39520 DPT=9105 SEQ=2482924199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E325F00000000001030307) 
Dec 06 09:36:40 np0005548788.localdomain podman[159049]: 
Dec 06 09:36:40 np0005548788.localdomain podman[159049]: 2025-12-06 09:36:40.594348449 +0000 UTC m=+0.088981621 container create 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 06 09:36:40 np0005548788.localdomain podman[159049]: 2025-12-06 09:36:40.551975012 +0000 UTC m=+0.046608184 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:36:40 np0005548788.localdomain python3[158981]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume 
/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:36:40 np0005548788.localdomain sudo[158979]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:40 np0005548788.localdomain sshd[159038]: Received disconnect from 148.227.3.232 port 36308:11: Bye Bye [preauth]
Dec 06 09:36:40 np0005548788.localdomain sshd[159038]: Disconnected from authenticating user root 148.227.3.232 port 36308 [preauth]
Dec 06 09:36:41 np0005548788.localdomain sudo[159174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anstoisddhvrnccowbkhgdxhlpzpuuzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013801.2544518-1271-214786651133189/AnsiballZ_stat.py
Dec 06 09:36:41 np0005548788.localdomain sudo[159174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:41 np0005548788.localdomain python3.9[159176]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:41 np0005548788.localdomain sudo[159174]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:42 np0005548788.localdomain sudo[159268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdswcbthazvydmvgtiemhxlbjhggawpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013802.0205593-1298-21302889805826/AnsiballZ_file.py
Dec 06 09:36:42 np0005548788.localdomain sudo[159268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:42 np0005548788.localdomain python3.9[159270]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:42 np0005548788.localdomain sudo[159268]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:42 np0005548788.localdomain sudo[159314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkliawtxagbxzatxwivgdynafgqakgyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013802.0205593-1298-21302889805826/AnsiballZ_stat.py
Dec 06 09:36:42 np0005548788.localdomain sudo[159314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:42 np0005548788.localdomain python3.9[159316]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:42 np0005548788.localdomain sudo[159314]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:43 np0005548788.localdomain sudo[159405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vncpkwsarddthincszjxiyclpqnqhubg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013803.0094388-1298-38620159265910/AnsiballZ_copy.py
Dec 06 09:36:43 np0005548788.localdomain sudo[159405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26910 DF PROTO=TCP SPT=34320 DPT=9100 SEQ=3540750611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E331DF0000000001030307) 
Dec 06 09:36:43 np0005548788.localdomain python3.9[159407]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013803.0094388-1298-38620159265910/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:43 np0005548788.localdomain sudo[159405]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:43 np0005548788.localdomain sudo[159451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfbjrbrlwwtxksocwifmumhhqldahita ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013803.0094388-1298-38620159265910/AnsiballZ_systemd.py
Dec 06 09:36:43 np0005548788.localdomain sudo[159451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:44 np0005548788.localdomain python3.9[159453]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:36:44 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:36:44 np0005548788.localdomain systemd-rc-local-generator[159476]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:44 np0005548788.localdomain systemd-sysv-generator[159482]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:44 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:36:44 np0005548788.localdomain sudo[159451]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:44 np0005548788.localdomain podman[159489]: 2025-12-06 09:36:44.523735875 +0000 UTC m=+0.054918327 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 06 09:36:44 np0005548788.localdomain podman[159489]: 2025-12-06 09:36:44.59987911 +0000 UTC m=+0.131061692 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:36:44 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:36:44 np0005548788.localdomain sudo[159557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iadsqdxojsqnjtbmawcpujqhbfydojug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013803.0094388-1298-38620159265910/AnsiballZ_systemd.py
Dec 06 09:36:44 np0005548788.localdomain sudo[159557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:45 np0005548788.localdomain python3.9[159559]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:36:45 np0005548788.localdomain systemd-sysv-generator[159587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:45 np0005548788.localdomain systemd-rc-local-generator[159584]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: Starting ovn_metadata_agent container...
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: tmp-crun.eQjzq3.mount: Deactivated successfully.
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:36:45 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7ccbc464910f8d3dcd30a378525fbe51120ca6180cd422318361c2a8cf1588c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:36:45 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7ccbc464910f8d3dcd30a378525fbe51120ca6180cd422318361c2a8cf1588c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:36:45 np0005548788.localdomain podman[159601]: 2025-12-06 09:36:45.633958311 +0000 UTC m=+0.162282241 container init 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + sudo -E kolla_set_configs
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:36:45 np0005548788.localdomain podman[159601]: 2025-12-06 09:36:45.680479782 +0000 UTC m=+0.208803692 container start 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:36:45 np0005548788.localdomain edpm-start-podman-container[159601]: ovn_metadata_agent
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Validating config file
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Copying service configuration files
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Writing out command to execute
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: ++ cat /run_command
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + CMD=neutron-ovn-metadata-agent
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + ARGS=
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + sudo kolla_copy_cacerts
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + [[ ! -n '' ]]
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + . kolla_extend_start
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: Running command: 'neutron-ovn-metadata-agent'
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + umask 0022
Dec 06 09:36:45 np0005548788.localdomain ovn_metadata_agent[159615]: + exec neutron-ovn-metadata-agent
Dec 06 09:36:45 np0005548788.localdomain podman[159623]: 2025-12-06 09:36:45.780090492 +0000 UTC m=+0.095962023 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:36:45 np0005548788.localdomain podman[159623]: 2025-12-06 09:36:45.866595163 +0000 UTC m=+0.182466684 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:36:45 np0005548788.localdomain edpm-start-podman-container[159600]: Creating additional drop-in dependency for "ovn_metadata_agent" (66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476)
Dec 06 09:36:45 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:36:45 np0005548788.localdomain systemd-rc-local-generator[159688]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:46 np0005548788.localdomain systemd-sysv-generator[159692]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:46 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:46 np0005548788.localdomain systemd[1]: Started ovn_metadata_agent container.
Dec 06 09:36:46 np0005548788.localdomain sudo[159557]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26912 DF PROTO=TCP SPT=34320 DPT=9100 SEQ=3540750611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E33DF00000000001030307) 
Dec 06 09:36:47 np0005548788.localdomain sudo[159719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:36:47 np0005548788.localdomain sudo[159719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:36:47 np0005548788.localdomain sudo[159719]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.358 159620 INFO neutron.common.config [-] Logging enabled!
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.359 159620 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.359 159620 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.359 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.359 159620 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.359 159620 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.359 159620 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.359 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.360 159620 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.361 159620 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.362 159620 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.363 159620 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.364 159620 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.365 159620 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.366 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.367 159620 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.368 159620 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.369 159620 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.370 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.371 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.372 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.372 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.372 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.372 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.372 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.372 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.372 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.372 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.373 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.373 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.373 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.373 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.373 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.373 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.373 159620 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.373 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain sudo[159734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.374 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.375 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.376 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.377 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain sudo[159734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.378 159620 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.379 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.380 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.381 159620 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.382 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.383 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.384 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.385 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.386 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.387 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.388 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.389 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.390 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.390 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.390 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.390 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.390 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.390 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.390 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.390 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.391 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.391 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.391 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.391 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.391 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.391 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.391 159620 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.391 159620 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.400 159620 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.400 159620 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.400 159620 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.401 159620 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.401 159620 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.415 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 61ffd9e7-81c6-44c4-94c0-846d9931f97c (UUID: 61ffd9e7-81c6-44c4-94c0-846d9931f97c) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.433 159620 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.433 159620 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.433 159620 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.433 159620 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.435 159620 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.437 159620 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.448 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '61ffd9e7-81c6-44c4-94c0-846d9931f97c'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], external_ids={'neutron:ovn-metadata-id': 'f7c7dee4-8aa9-5802-aa1e-0e9982b26e8c', 'neutron:ovn-metadata-sb-cfg': '1'}, name=61ffd9e7-81c6-44c4-94c0-846d9931f97c, nb_cfg_timestamp=1765013752690, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.449 159620 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f3cb0db9b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.449 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.449 159620 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.449 159620 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.450 159620 INFO oslo_service.service [-] Starting 1 workers
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.453 159620 DEBUG oslo_service.service [-] Started child 159749 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.455 159620 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpkqy_7mkd/privsep.sock']
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.457 159749 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1940381'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.481 159749 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.482 159749 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.482 159749 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.484 159749 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.486 159749 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:36:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.494 159749 INFO eventlet.wsgi.server [-] (159749) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 06 09:36:47 np0005548788.localdomain sshd[154451]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:36:47 np0005548788.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Dec 06 09:36:47 np0005548788.localdomain systemd[1]: session-52.scope: Consumed 32.347s CPU time.
Dec 06 09:36:47 np0005548788.localdomain systemd-logind[765]: Session 52 logged out. Waiting for processes to exit.
Dec 06 09:36:47 np0005548788.localdomain systemd-logind[765]: Removed session 52.
Dec 06 09:36:47 np0005548788.localdomain sudo[159734]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.098 159620 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.099 159620 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkqy_7mkd/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.977 159785 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.984 159785 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.987 159785 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:47.987 159785 INFO oslo.privsep.daemon [-] privsep daemon running as pid 159785
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.103 159785 DEBUG oslo.privsep.daemon [-] privsep: reply[72801ec1-b1c3-423e-915a-341ae33a1986]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:48 np0005548788.localdomain sudo[159790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:36:48 np0005548788.localdomain sudo[159790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.531 159785 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.532 159785 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.532 159785 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:36:48 np0005548788.localdomain sudo[159790]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.986 159785 DEBUG oslo.privsep.daemon [-] privsep: reply[424bc4b9-cbd4-4210-890a-8e852fe7a582]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.990 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, column=external_ids, values=({'neutron:ovn-metadata-id': 'f7c7dee4-8aa9-5802-aa1e-0e9982b26e8c'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.991 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:36:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:48.992 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.005 159620 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.005 159620 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.005 159620 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.005 159620 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.005 159620 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.005 159620 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.005 159620 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.005 159620 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.006 159620 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.007 159620 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.008 159620 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.008 159620 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.008 159620 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.008 159620 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.008 159620 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.008 159620 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.008 159620 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.008 159620 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.009 159620 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.010 159620 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.011 159620 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.012 159620 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.013 159620 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.014 159620 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.015 159620 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.016 159620 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.017 159620 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.018 159620 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.019 159620 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.020 159620 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.021 159620 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.022 159620 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.023 159620 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.024 159620 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.025 159620 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.026 159620 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.027 159620 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.028 159620 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.029 159620 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.030 159620 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.031 159620 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.032 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.033 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.034 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.035 159620 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.036 159620 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.036 159620 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:36:49.036 159620 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:36:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45678 DF PROTO=TCP SPT=56614 DPT=9102 SEQ=754960749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E348E60000000001030307) 
Dec 06 09:36:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45680 DF PROTO=TCP SPT=56614 DPT=9102 SEQ=754960749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E354F00000000001030307) 
Dec 06 09:36:53 np0005548788.localdomain sshd[159805]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:36:54 np0005548788.localdomain sshd[159805]: Accepted publickey for zuul from 192.168.122.30 port 57600 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:36:54 np0005548788.localdomain systemd-logind[765]: New session 53 of user zuul.
Dec 06 09:36:54 np0005548788.localdomain systemd[1]: Started Session 53 of User zuul.
Dec 06 09:36:54 np0005548788.localdomain sshd[159805]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:36:55 np0005548788.localdomain python3.9[159898]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:36:56 np0005548788.localdomain sudo[159992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsyfzoisevxuywtbimhfgjckhpjhlbws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013815.6581945-62-58374621376502/AnsiballZ_command.py
Dec 06 09:36:56 np0005548788.localdomain sudo[159992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:56 np0005548788.localdomain python3.9[159994]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:56 np0005548788.localdomain sudo[159992]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45681 DF PROTO=TCP SPT=56614 DPT=9102 SEQ=754960749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E364B00000000001030307) 
Dec 06 09:36:56 np0005548788.localdomain sudo[160097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivmkucnladqiuujpvcdzrlvvcqjnekwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013816.5115592-86-203287111224926/AnsiballZ_command.py
Dec 06 09:36:56 np0005548788.localdomain sudo[160097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:56 np0005548788.localdomain python3.9[160099]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:57 np0005548788.localdomain systemd[1]: tmp-crun.aSeMtf.mount: Deactivated successfully.
Dec 06 09:36:57 np0005548788.localdomain systemd[1]: libpod-33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09.scope: Deactivated successfully.
Dec 06 09:36:57 np0005548788.localdomain podman[160100]: 2025-12-06 09:36:57.059567397 +0000 UTC m=+0.083124124 container died 33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:36:57 np0005548788.localdomain podman[160100]: 2025-12-06 09:36:57.09230439 +0000 UTC m=+0.115861067 container cleanup 33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-type=git, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public)
Dec 06 09:36:57 np0005548788.localdomain sudo[160097]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:57 np0005548788.localdomain podman[160116]: 2025-12-06 09:36:57.157349294 +0000 UTC m=+0.080473153 container remove 33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:36:57 np0005548788.localdomain systemd[1]: libpod-conmon-33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09.scope: Deactivated successfully.
Dec 06 09:36:58 np0005548788.localdomain sudo[160218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drdbijqhckbhuootfkvaqkwfmteyygzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013817.4427822-116-263827667247999/AnsiballZ_systemd_service.py
Dec 06 09:36:58 np0005548788.localdomain sudo[160218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f4ccaf6b9e00129f62b1a5ca75d97cdf0ef2bf4949fad3aa47a39da7b3075511-merged.mount: Deactivated successfully.
Dec 06 09:36:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33fa47627c835c5b2933fd0ca4f78f86b55645974056e90d3ff9616ef92eca09-userdata-shm.mount: Deactivated successfully.
Dec 06 09:36:58 np0005548788.localdomain python3.9[160220]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:36:58 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:36:58 np0005548788.localdomain systemd-rc-local-generator[160246]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:58 np0005548788.localdomain systemd-sysv-generator[160251]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:58 np0005548788.localdomain sudo[160218]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26914 DF PROTO=TCP SPT=34320 DPT=9100 SEQ=3540750611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E36DF00000000001030307) 
Dec 06 09:36:59 np0005548788.localdomain python3.9[160346]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:36:59 np0005548788.localdomain network[160363]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:36:59 np0005548788.localdomain network[160364]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:36:59 np0005548788.localdomain network[160365]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:37:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:37:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57574 DF PROTO=TCP SPT=35544 DPT=9101 SEQ=2250136101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E384EE0000000001030307) 
Dec 06 09:37:05 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11201 DF PROTO=TCP SPT=46252 DPT=9882 SEQ=2176481386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E385F10000000001030307) 
Dec 06 09:37:05 np0005548788.localdomain sudo[160565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwggwchtygxpitswelkazmkklmbjrrhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013825.1317267-173-18023498846064/AnsiballZ_systemd_service.py
Dec 06 09:37:05 np0005548788.localdomain sudo[160565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:05 np0005548788.localdomain python3.9[160567]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:05 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:37:05 np0005548788.localdomain systemd-sysv-generator[160596]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:37:05 np0005548788.localdomain systemd-rc-local-generator[160592]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:37:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:37:06 np0005548788.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Dec 06 09:37:06 np0005548788.localdomain sudo[160565]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:06 np0005548788.localdomain sudo[160696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrupsavxibhhrvpigscoyexfzmcfqyyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013826.2502205-173-248685165459209/AnsiballZ_systemd_service.py
Dec 06 09:37:06 np0005548788.localdomain sudo[160696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:06 np0005548788.localdomain python3.9[160698]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:06 np0005548788.localdomain sudo[160696]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:07 np0005548788.localdomain sudo[160789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jharnxxlmpujezvocochasebabzrckli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013826.9471722-173-263369035635354/AnsiballZ_systemd_service.py
Dec 06 09:37:07 np0005548788.localdomain sudo[160789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:07 np0005548788.localdomain python3.9[160791]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57576 DF PROTO=TCP SPT=35544 DPT=9101 SEQ=2250136101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E390F00000000001030307) 
Dec 06 09:37:08 np0005548788.localdomain sudo[160789]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:08 np0005548788.localdomain sudo[160882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgzzkaefzhpfpxzunpihvozqprldulwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013828.6818075-173-278204285875097/AnsiballZ_systemd_service.py
Dec 06 09:37:08 np0005548788.localdomain sudo[160882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:09 np0005548788.localdomain python3.9[160884]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:10 np0005548788.localdomain sudo[160882]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:10 np0005548788.localdomain sudo[160975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paflearsqmpwnvrsfbpxnallyduadpvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013830.406463-173-233034259928345/AnsiballZ_systemd_service.py
Dec 06 09:37:10 np0005548788.localdomain sudo[160975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8098 DF PROTO=TCP SPT=44924 DPT=9105 SEQ=721097922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E39BF10000000001030307) 
Dec 06 09:37:11 np0005548788.localdomain python3.9[160977]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:11 np0005548788.localdomain sudo[160975]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:11 np0005548788.localdomain sudo[161068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsqnvbqfreqwypcplpblcuzyhbiamrmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013831.1662383-173-138637794352574/AnsiballZ_systemd_service.py
Dec 06 09:37:11 np0005548788.localdomain sudo[161068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:11 np0005548788.localdomain python3.9[161070]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:11 np0005548788.localdomain sudo[161068]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:12 np0005548788.localdomain sudo[161161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olvctsxccjydxyfgysxjqyhjrstrspbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013831.9622014-173-60156405530802/AnsiballZ_systemd_service.py
Dec 06 09:37:12 np0005548788.localdomain sudo[161161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:12 np0005548788.localdomain python3.9[161163]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:12 np0005548788.localdomain sudo[161161]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:13 np0005548788.localdomain sudo[161254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igrzdlnedythkyrigcxwtuvekdzjppck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013832.9616694-329-42222462434759/AnsiballZ_file.py
Dec 06 09:37:13 np0005548788.localdomain sudo[161254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57318 DF PROTO=TCP SPT=54776 DPT=9100 SEQ=4085753135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E3A7100000000001030307) 
Dec 06 09:37:13 np0005548788.localdomain python3.9[161256]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:13 np0005548788.localdomain sudo[161254]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:14 np0005548788.localdomain sudo[161346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybyvkjrxcmetwsaxnobdpyzoazykjiic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013834.005045-329-165676489524482/AnsiballZ_file.py
Dec 06 09:37:14 np0005548788.localdomain sudo[161346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:14 np0005548788.localdomain python3.9[161348]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:14 np0005548788.localdomain sudo[161346]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:14 np0005548788.localdomain sudo[161438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipeebqmpnrjljwgvzoezkjtijqqkvoed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013834.610164-329-9124033373674/AnsiballZ_file.py
Dec 06 09:37:14 np0005548788.localdomain sudo[161438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:37:14 np0005548788.localdomain podman[161441]: 2025-12-06 09:37:14.982653248 +0000 UTC m=+0.095523868 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:37:15 np0005548788.localdomain podman[161441]: 2025-12-06 09:37:15.022683607 +0000 UTC m=+0.135554267 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:37:15 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:37:15 np0005548788.localdomain python3.9[161440]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:15 np0005548788.localdomain sudo[161438]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:16 np0005548788.localdomain sudo[161555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lddltnzlaojbbvddnlxsljwgvomfufaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013835.1815388-329-261785629627811/AnsiballZ_file.py
Dec 06 09:37:16 np0005548788.localdomain sudo[161555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:37:16 np0005548788.localdomain podman[161557]: 2025-12-06 09:37:16.200468808 +0000 UTC m=+0.096320973 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 06 09:37:16 np0005548788.localdomain podman[161557]: 2025-12-06 09:37:16.203937205 +0000 UTC m=+0.099789370 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:37:16 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:37:16 np0005548788.localdomain python3.9[161558]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:16 np0005548788.localdomain sudo[161555]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:16 np0005548788.localdomain sudo[161666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndvywzrzkctqlvidkqxtvdlxijafknqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013836.4187195-329-1063826752378/AnsiballZ_file.py
Dec 06 09:37:16 np0005548788.localdomain sudo[161666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57320 DF PROTO=TCP SPT=54776 DPT=9100 SEQ=4085753135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E3B3300000000001030307) 
Dec 06 09:37:16 np0005548788.localdomain python3.9[161668]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:16 np0005548788.localdomain sudo[161666]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:17 np0005548788.localdomain sudo[161758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huoghipomkxfvwixxpdezwklzrskvhjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013836.9893014-329-277101333944863/AnsiballZ_file.py
Dec 06 09:37:17 np0005548788.localdomain sudo[161758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:17 np0005548788.localdomain python3.9[161760]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:17 np0005548788.localdomain sudo[161758]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:17 np0005548788.localdomain sudo[161850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfivlzoovbodqjdzveeqognozowixsfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013837.6431637-329-69879952800836/AnsiballZ_file.py
Dec 06 09:37:17 np0005548788.localdomain sudo[161850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:18 np0005548788.localdomain python3.9[161852]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:18 np0005548788.localdomain sudo[161850]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:18 np0005548788.localdomain sudo[161942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keizzgvcqjehrbgnojbapznhusoqbhim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013838.291243-479-226361978286959/AnsiballZ_file.py
Dec 06 09:37:18 np0005548788.localdomain sudo[161942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:18 np0005548788.localdomain python3.9[161944]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:18 np0005548788.localdomain sudo[161942]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:19 np0005548788.localdomain sudo[162034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqxkdhclolgkazfshkhpbxhjmkmhfrum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013838.901704-479-212948025308881/AnsiballZ_file.py
Dec 06 09:37:19 np0005548788.localdomain sudo[162034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:19 np0005548788.localdomain python3.9[162036]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:19 np0005548788.localdomain sudo[162034]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60740 DF PROTO=TCP SPT=54254 DPT=9102 SEQ=1093479200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E3BE150000000001030307) 
Dec 06 09:37:19 np0005548788.localdomain sudo[162126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axxwsyitgfekitsordqjxeopfpwcakrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013839.5253632-479-105110858476760/AnsiballZ_file.py
Dec 06 09:37:19 np0005548788.localdomain sudo[162126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:19 np0005548788.localdomain python3.9[162128]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:20 np0005548788.localdomain sudo[162126]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:20 np0005548788.localdomain sudo[162218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqmwtffxgrudwmudtkcppjsxtfzksccv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013840.1200562-479-20773907986862/AnsiballZ_file.py
Dec 06 09:37:20 np0005548788.localdomain sudo[162218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:20 np0005548788.localdomain python3.9[162220]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:20 np0005548788.localdomain sudo[162218]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:20 np0005548788.localdomain sudo[162310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sscvwwvxjctdmsbeblvzfranojiebyyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013840.7378275-479-170729168622415/AnsiballZ_file.py
Dec 06 09:37:20 np0005548788.localdomain sudo[162310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:21 np0005548788.localdomain python3.9[162312]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:21 np0005548788.localdomain sudo[162310]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:21 np0005548788.localdomain sudo[162402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrgdyunsikwjqzhfcxboupbvjscevwie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013841.3160698-479-124017601173588/AnsiballZ_file.py
Dec 06 09:37:21 np0005548788.localdomain sudo[162402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:21 np0005548788.localdomain sshd[162405]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:37:21 np0005548788.localdomain python3.9[162404]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:21 np0005548788.localdomain sudo[162402]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:21 np0005548788.localdomain sshd[162405]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 09:37:21 np0005548788.localdomain sshd[162405]: Connection closed by 43.163.93.82 port 35918
Dec 06 09:37:22 np0005548788.localdomain sudo[162495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oauyhjsfpnkaaorsaetmgitgpsiamlsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013841.9286146-479-145794931769014/AnsiballZ_file.py
Dec 06 09:37:22 np0005548788.localdomain sudo[162495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:22 np0005548788.localdomain python3.9[162497]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:22 np0005548788.localdomain sudo[162495]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60742 DF PROTO=TCP SPT=54254 DPT=9102 SEQ=1093479200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E3CA300000000001030307) 
Dec 06 09:37:22 np0005548788.localdomain sudo[162587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvcogyxlatfafkgsvazscsbxngrknbqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013842.6452463-632-41613226544663/AnsiballZ_command.py
Dec 06 09:37:22 np0005548788.localdomain sudo[162587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:23 np0005548788.localdomain python3.9[162589]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:23 np0005548788.localdomain sudo[162587]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:23 np0005548788.localdomain python3.9[162681]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:37:24 np0005548788.localdomain sudo[162771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyxdrnnethkcelwvogcsjyzfrprdjmmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013844.214073-686-264859470258839/AnsiballZ_systemd_service.py
Dec 06 09:37:24 np0005548788.localdomain sudo[162771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:24 np0005548788.localdomain python3.9[162773]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:37:24 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:37:24 np0005548788.localdomain systemd-rc-local-generator[162801]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:37:24 np0005548788.localdomain systemd-sysv-generator[162804]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:37:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:37:25 np0005548788.localdomain sudo[162771]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:26 np0005548788.localdomain sudo[162899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhywthrmbjimrjgyteaikpfseefpxaru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013845.832977-710-223436126285972/AnsiballZ_command.py
Dec 06 09:37:26 np0005548788.localdomain sudo[162899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:26 np0005548788.localdomain python3.9[162901]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60743 DF PROTO=TCP SPT=54254 DPT=9102 SEQ=1093479200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E3D9F00000000001030307) 
Dec 06 09:37:26 np0005548788.localdomain sudo[162899]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:26 np0005548788.localdomain sudo[162992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuifpyumoupnjeiqkuizrcfxzanhpvtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013846.7331069-710-265223938763020/AnsiballZ_command.py
Dec 06 09:37:26 np0005548788.localdomain sudo[162992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:27 np0005548788.localdomain python3.9[162994]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:27 np0005548788.localdomain sudo[162992]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:28 np0005548788.localdomain sudo[163085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfswvvbspeactgmltetwnhrjafnqmivo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013847.3137019-710-180113098558222/AnsiballZ_command.py
Dec 06 09:37:28 np0005548788.localdomain sudo[163085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:28 np0005548788.localdomain python3.9[163087]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:28 np0005548788.localdomain sudo[163085]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:28 np0005548788.localdomain sudo[163178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drarqijvrbfmsalafszfdbgujmmksznr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013848.512743-710-32954428601028/AnsiballZ_command.py
Dec 06 09:37:28 np0005548788.localdomain sudo[163178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:28 np0005548788.localdomain python3.9[163180]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:29 np0005548788.localdomain sudo[163178]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57322 DF PROTO=TCP SPT=54776 DPT=9100 SEQ=4085753135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E3E3F00000000001030307) 
Dec 06 09:37:29 np0005548788.localdomain sudo[163271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjmdhqlztxkjnqdlvjyygrvojhyfqzlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013849.1128137-710-232286341712247/AnsiballZ_command.py
Dec 06 09:37:29 np0005548788.localdomain sudo[163271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:29 np0005548788.localdomain python3.9[163273]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:29 np0005548788.localdomain sudo[163271]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:29 np0005548788.localdomain sudo[163364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fysdilvaqkbqohqwbwcycwjkvvuxqnvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013849.7119713-710-273309648840644/AnsiballZ_command.py
Dec 06 09:37:29 np0005548788.localdomain sudo[163364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:30 np0005548788.localdomain python3.9[163366]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:30 np0005548788.localdomain sudo[163364]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:30 np0005548788.localdomain sudo[163457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irqccmhtmifpnohkqotojctlkarimihp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013850.2160041-710-248006920536570/AnsiballZ_command.py
Dec 06 09:37:30 np0005548788.localdomain sudo[163457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:30 np0005548788.localdomain sshd[163460]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:37:30 np0005548788.localdomain python3.9[163459]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:30 np0005548788.localdomain sudo[163457]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:31 np0005548788.localdomain sudo[163552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjvlxviaaxqietckhtisvnfhrtsnlich ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013851.3169038-872-116364711855469/AnsiballZ_getent.py
Dec 06 09:37:31 np0005548788.localdomain sudo[163552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:31 np0005548788.localdomain python3.9[163554]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 06 09:37:31 np0005548788.localdomain sudo[163552]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:32 np0005548788.localdomain sudo[163645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzqxdwevcmhibbeuhteucdknsxzrabka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013852.1404629-896-252916743366340/AnsiballZ_group.py
Dec 06 09:37:32 np0005548788.localdomain sudo[163645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:32 np0005548788.localdomain sshd[163460]: Connection closed by 101.47.142.76 port 46342 [preauth]
Dec 06 09:37:32 np0005548788.localdomain python3.9[163647]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:37:32 np0005548788.localdomain groupadd[163648]: group added to /etc/group: name=libvirt, GID=42473
Dec 06 09:37:32 np0005548788.localdomain groupadd[163648]: group added to /etc/gshadow: name=libvirt
Dec 06 09:37:32 np0005548788.localdomain groupadd[163648]: new group: name=libvirt, GID=42473
Dec 06 09:37:32 np0005548788.localdomain sudo[163645]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:33 np0005548788.localdomain sudo[163743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwnlimqvomckllphupqvkpwygxywyfit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013853.186042-920-104406330146000/AnsiballZ_user.py
Dec 06 09:37:33 np0005548788.localdomain sudo[163743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:33 np0005548788.localdomain python3.9[163745]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548788.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:37:33 np0005548788.localdomain useradd[163747]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 06 09:37:34 np0005548788.localdomain sudo[163743]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:34 np0005548788.localdomain sudo[163843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvgpdlutndmupewbfcrayovomeypmnde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013854.4180002-953-178001289167500/AnsiballZ_setup.py
Dec 06 09:37:34 np0005548788.localdomain sudo[163843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5390 DF PROTO=TCP SPT=52788 DPT=9882 SEQ=2342456572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E3F9F00000000001030307) 
Dec 06 09:37:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60744 DF PROTO=TCP SPT=54254 DPT=9102 SEQ=1093479200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E3F9F00000000001030307) 
Dec 06 09:37:34 np0005548788.localdomain python3.9[163845]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:37:35 np0005548788.localdomain sudo[163843]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:35 np0005548788.localdomain sudo[163897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivpnpsavexyupjkucmqgwwvkocleutnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013854.4180002-953-178001289167500/AnsiballZ_dnf.py
Dec 06 09:37:35 np0005548788.localdomain sudo[163897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:35 np0005548788.localdomain python3.9[163899]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:37:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18780 DF PROTO=TCP SPT=48856 DPT=9101 SEQ=2456623523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E406300000000001030307) 
Dec 06 09:37:41 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32862 DF PROTO=TCP SPT=50712 DPT=9105 SEQ=4256181132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E415F00000000001030307) 
Dec 06 09:37:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24104 DF PROTO=TCP SPT=42486 DPT=9100 SEQ=3831133331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E41C400000000001030307) 
Dec 06 09:37:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:37:45 np0005548788.localdomain podman[163971]: 2025-12-06 09:37:45.259359973 +0000 UTC m=+0.085360831 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:37:45 np0005548788.localdomain podman[163971]: 2025-12-06 09:37:45.298221386 +0000 UTC m=+0.124222214 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:37:45 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:37:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24106 DF PROTO=TCP SPT=42486 DPT=9100 SEQ=3831133331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E428300000000001030307) 
Dec 06 09:37:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:37:47 np0005548788.localdomain podman[163996]: 2025-12-06 09:37:47.338416871 +0000 UTC m=+0.165511180 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:37:47 np0005548788.localdomain podman[163996]: 2025-12-06 09:37:47.368815332 +0000 UTC m=+0.195909631 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:37:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:37:47.394 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:37:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:37:47.395 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:37:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:37:47.395 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:37:47 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:37:48 np0005548788.localdomain sudo[164014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:37:48 np0005548788.localdomain sudo[164014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:37:48 np0005548788.localdomain sudo[164014]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:48 np0005548788.localdomain sudo[164032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:37:48 np0005548788.localdomain sudo[164032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:37:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54577 DF PROTO=TCP SPT=46100 DPT=9102 SEQ=1186861243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E433450000000001030307) 
Dec 06 09:37:49 np0005548788.localdomain sudo[164032]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:50 np0005548788.localdomain sudo[164083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:37:50 np0005548788.localdomain sudo[164083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:37:50 np0005548788.localdomain sudo[164083]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54579 DF PROTO=TCP SPT=46100 DPT=9102 SEQ=1186861243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E43F300000000001030307) 
Dec 06 09:37:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54580 DF PROTO=TCP SPT=46100 DPT=9102 SEQ=1186861243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E44EF00000000001030307) 
Dec 06 09:37:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24108 DF PROTO=TCP SPT=42486 DPT=9100 SEQ=3831133331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E457F10000000001030307) 
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  Converting 2746 SID table entries...
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:03 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33409 DF PROTO=TCP SPT=36232 DPT=9101 SEQ=3617578314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E46F4E0000000001030307) 
Dec 06 09:38:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29893 DF PROTO=TCP SPT=44010 DPT=9882 SEQ=4083345310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E46FF00000000001030307) 
Dec 06 09:38:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33411 DF PROTO=TCP SPT=36232 DPT=9101 SEQ=3617578314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E47B700000000001030307) 
Dec 06 09:38:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32865 DF PROTO=TCP SPT=50712 DPT=9105 SEQ=4256181132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E485F00000000001030307) 
Dec 06 09:38:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59664 DF PROTO=TCP SPT=42030 DPT=9100 SEQ=3734221078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4916F0000000001030307) 
Dec 06 09:38:14 np0005548788.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Dec 06 09:38:14 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:14 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:14 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:14 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:14 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:14 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:14 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:16 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Dec 06 09:38:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:38:16 np0005548788.localdomain systemd[1]: tmp-crun.yheq2Z.mount: Deactivated successfully.
Dec 06 09:38:16 np0005548788.localdomain podman[165125]: 2025-12-06 09:38:16.29446114 +0000 UTC m=+0.107685222 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:38:16 np0005548788.localdomain podman[165125]: 2025-12-06 09:38:16.359853513 +0000 UTC m=+0.173077565 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:38:16 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:38:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59666 DF PROTO=TCP SPT=42030 DPT=9100 SEQ=3734221078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E49D700000000001030307) 
Dec 06 09:38:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:38:18 np0005548788.localdomain podman[165148]: 2025-12-06 09:38:18.242493125 +0000 UTC m=+0.068374066 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 06 09:38:18 np0005548788.localdomain podman[165148]: 2025-12-06 09:38:18.24848232 +0000 UTC m=+0.074363261 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:38:18 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:38:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6046 DF PROTO=TCP SPT=58966 DPT=9102 SEQ=767332756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4A8740000000001030307) 
Dec 06 09:38:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6048 DF PROTO=TCP SPT=58966 DPT=9102 SEQ=767332756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4B4700000000001030307) 
Dec 06 09:38:24 np0005548788.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Dec 06 09:38:24 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:24 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:24 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:24 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:24 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:24 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:24 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6049 DF PROTO=TCP SPT=58966 DPT=9102 SEQ=767332756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4C4310000000001030307) 
Dec 06 09:38:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59668 DF PROTO=TCP SPT=42030 DPT=9100 SEQ=3734221078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4CDF00000000001030307) 
Dec 06 09:38:31 np0005548788.localdomain sshd[165173]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:38:34 np0005548788.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Dec 06 09:38:34 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:34 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:34 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:34 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:34 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:34 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:34 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6050 DF PROTO=TCP SPT=58966 DPT=9102 SEQ=767332756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4E3F00000000001030307) 
Dec 06 09:38:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14592 DF PROTO=TCP SPT=34138 DPT=9101 SEQ=2035964180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4E47F0000000001030307) 
Dec 06 09:38:35 np0005548788.localdomain sshd[165173]: Received disconnect from 45.78.194.186 port 56774:11: Bye Bye [preauth]
Dec 06 09:38:35 np0005548788.localdomain sshd[165173]: Disconnected from authenticating user root 45.78.194.186 port 56774 [preauth]
Dec 06 09:38:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14594 DF PROTO=TCP SPT=34138 DPT=9101 SEQ=2035964180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4F0700000000001030307) 
Dec 06 09:38:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25556 DF PROTO=TCP SPT=53542 DPT=9105 SEQ=1756842038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E4FBF00000000001030307) 
Dec 06 09:38:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45041 DF PROTO=TCP SPT=59154 DPT=9100 SEQ=1279700625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E506A00000000001030307) 
Dec 06 09:38:45 np0005548788.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Dec 06 09:38:45 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:45 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:45 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:45 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:45 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:45 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:45 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45043 DF PROTO=TCP SPT=59154 DPT=9100 SEQ=1279700625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E512B00000000001030307) 
Dec 06 09:38:47 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=23 res=1
Dec 06 09:38:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:38:47 np0005548788.localdomain systemd[1]: tmp-crun.44NQ9l.mount: Deactivated successfully.
Dec 06 09:38:47 np0005548788.localdomain podman[165195]: 2025-12-06 09:38:47.290285446 +0000 UTC m=+0.107464887 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller)
Dec 06 09:38:47 np0005548788.localdomain podman[165195]: 2025-12-06 09:38:47.339726893 +0000 UTC m=+0.156906334 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:38:47 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:38:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:38:47.394 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:38:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:38:47.395 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:38:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:38:47.395 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:38:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:38:49 np0005548788.localdomain systemd[1]: tmp-crun.zyPTWN.mount: Deactivated successfully.
Dec 06 09:38:49 np0005548788.localdomain podman[165221]: 2025-12-06 09:38:49.227726687 +0000 UTC m=+0.059342228 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:38:49 np0005548788.localdomain podman[165221]: 2025-12-06 09:38:49.258262071 +0000 UTC m=+0.089877602 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:38:49 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:38:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17382 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2902023770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E51DA40000000001030307) 
Dec 06 09:38:50 np0005548788.localdomain sudo[165238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:38:50 np0005548788.localdomain sudo[165238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:38:50 np0005548788.localdomain sudo[165238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:50 np0005548788.localdomain sudo[165256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:38:50 np0005548788.localdomain sudo[165256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:38:51 np0005548788.localdomain sudo[165256]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:51 np0005548788.localdomain sudo[165307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:38:51 np0005548788.localdomain sudo[165307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:38:51 np0005548788.localdomain sudo[165307]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17384 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2902023770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E529B00000000001030307) 
Dec 06 09:38:55 np0005548788.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Dec 06 09:38:55 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:55 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:55 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:55 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:55 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:55 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:55 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17385 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2902023770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E539710000000001030307) 
Dec 06 09:38:56 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:38:56 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=24 res=1
Dec 06 09:38:56 np0005548788.localdomain systemd-sysv-generator[165364]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:38:56 np0005548788.localdomain systemd-rc-local-generator[165361]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:38:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:38:57 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:38:57 np0005548788.localdomain systemd-sysv-generator[165399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:38:57 np0005548788.localdomain systemd-rc-local-generator[165395]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:38:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:38:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45045 DF PROTO=TCP SPT=59154 DPT=9100 SEQ=1279700625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E541F00000000001030307) 
Dec 06 09:39:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59357 DF PROTO=TCP SPT=38606 DPT=9101 SEQ=1601037058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E559AE0000000001030307) 
Dec 06 09:39:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4416 DF PROTO=TCP SPT=60444 DPT=9882 SEQ=166951186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E559F00000000001030307) 
Dec 06 09:39:07 np0005548788.localdomain kernel: SELinux:  Converting 2750 SID table entries...
Dec 06 09:39:07 np0005548788.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:39:07 np0005548788.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:39:07 np0005548788.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:39:07 np0005548788.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:39:07 np0005548788.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:39:07 np0005548788.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:39:07 np0005548788.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:39:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59359 DF PROTO=TCP SPT=38606 DPT=9101 SEQ=1601037058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E565B00000000001030307) 
Dec 06 09:39:08 np0005548788.localdomain groupadd[165424]: group added to /etc/group: name=clevis, GID=985
Dec 06 09:39:08 np0005548788.localdomain groupadd[165424]: group added to /etc/gshadow: name=clevis
Dec 06 09:39:08 np0005548788.localdomain groupadd[165424]: new group: name=clevis, GID=985
Dec 06 09:39:08 np0005548788.localdomain useradd[165431]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 06 09:39:08 np0005548788.localdomain usermod[165441]: add 'clevis' to group 'tss'
Dec 06 09:39:08 np0005548788.localdomain usermod[165441]: add 'clevis' to shadow group 'tss'
Dec 06 09:39:08 np0005548788.localdomain sshd[165442]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:09 np0005548788.localdomain sshd[165442]: Received disconnect from 148.227.3.232 port 39784:11: Bye Bye [preauth]
Dec 06 09:39:09 np0005548788.localdomain sshd[165442]: Disconnected from authenticating user root 148.227.3.232 port 39784 [preauth]
Dec 06 09:39:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5385 DF PROTO=TCP SPT=38998 DPT=9105 SEQ=1664506709 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E56FF00000000001030307) 
Dec 06 09:39:12 np0005548788.localdomain groupadd[165465]: group added to /etc/group: name=dnsmasq, GID=984
Dec 06 09:39:12 np0005548788.localdomain groupadd[165465]: group added to /etc/gshadow: name=dnsmasq
Dec 06 09:39:12 np0005548788.localdomain groupadd[165465]: new group: name=dnsmasq, GID=984
Dec 06 09:39:12 np0005548788.localdomain useradd[165472]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 06 09:39:12 np0005548788.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 09:39:12 np0005548788.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=25 res=1
Dec 06 09:39:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42181 DF PROTO=TCP SPT=43186 DPT=9100 SEQ=1150603958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E57BD00000000001030307) 
Dec 06 09:39:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42183 DF PROTO=TCP SPT=43186 DPT=9100 SEQ=1150603958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E587F10000000001030307) 
Dec 06 09:39:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:39:18 np0005548788.localdomain podman[165486]: 2025-12-06 09:39:18.382377552 +0000 UTC m=+0.152059592 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 06 09:39:18 np0005548788.localdomain podman[165486]: 2025-12-06 09:39:18.448520828 +0000 UTC m=+0.218202888 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 09:39:18 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:39:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65216 DF PROTO=TCP SPT=38790 DPT=9102 SEQ=127652097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E592D50000000001030307) 
Dec 06 09:39:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:39:20 np0005548788.localdomain podman[165511]: 2025-12-06 09:39:20.274133546 +0000 UTC m=+0.092781287 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:39:20 np0005548788.localdomain podman[165511]: 2025-12-06 09:39:20.280467251 +0000 UTC m=+0.099115012 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 06 09:39:20 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:39:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65218 DF PROTO=TCP SPT=38790 DPT=9102 SEQ=127652097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E59EF00000000001030307) 
Dec 06 09:39:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65219 DF PROTO=TCP SPT=38790 DPT=9102 SEQ=127652097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E5AEB10000000001030307) 
Dec 06 09:39:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42185 DF PROTO=TCP SPT=43186 DPT=9100 SEQ=1150603958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E5B7F00000000001030307) 
Dec 06 09:39:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65300 DF PROTO=TCP SPT=44256 DPT=9101 SEQ=1243036238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E5CEDE0000000001030307) 
Dec 06 09:39:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65220 DF PROTO=TCP SPT=38790 DPT=9102 SEQ=127652097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E5CFF00000000001030307) 
Dec 06 09:39:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65302 DF PROTO=TCP SPT=44256 DPT=9101 SEQ=1243036238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E5DAF00000000001030307) 
Dec 06 09:39:39 np0005548788.localdomain sshd[174194]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21550 DF PROTO=TCP SPT=38530 DPT=9105 SEQ=3633167463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E5E5F00000000001030307) 
Dec 06 09:39:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30778 DF PROTO=TCP SPT=36308 DPT=9100 SEQ=2092101870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E5F1000000000001030307) 
Dec 06 09:39:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30780 DF PROTO=TCP SPT=36308 DPT=9100 SEQ=2092101870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E5FCF00000000001030307) 
Dec 06 09:39:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:39:47.396 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:39:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:39:47.398 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:39:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:39:47.398 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:39:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:39:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60973 DF PROTO=TCP SPT=35808 DPT=9102 SEQ=1017983968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E608050000000001030307) 
Dec 06 09:39:50 np0005548788.localdomain systemd[1]: tmp-crun.AaHfm3.mount: Deactivated successfully.
Dec 06 09:39:50 np0005548788.localdomain podman[180013]: 2025-12-06 09:39:50.112074292 +0000 UTC m=+0.927424598 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:39:50 np0005548788.localdomain podman[180013]: 2025-12-06 09:39:50.15575683 +0000 UTC m=+0.971107096 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 09:39:50 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:39:50 np0005548788.localdomain sshd[174194]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:39:50 np0005548788.localdomain sshd[174194]: banner exchange: Connection from 14.103.115.212 port 36114: Connection timed out
Dec 06 09:39:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:39:51 np0005548788.localdomain podman[181397]: 2025-12-06 09:39:51.264636454 +0000 UTC m=+0.083818834 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:39:51 np0005548788.localdomain podman[181397]: 2025-12-06 09:39:51.29862644 +0000 UTC m=+0.117808770 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:39:51 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:39:51 np0005548788.localdomain sudo[181809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:51 np0005548788.localdomain sudo[181809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:39:51 np0005548788.localdomain sudo[181809]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:51 np0005548788.localdomain sudo[181893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:39:51 np0005548788.localdomain sudo[181893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:39:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60975 DF PROTO=TCP SPT=35808 DPT=9102 SEQ=1017983968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E613F10000000001030307) 
Dec 06 09:39:52 np0005548788.localdomain sudo[181893]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:53 np0005548788.localdomain sudo[182517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:39:53 np0005548788.localdomain sudo[182517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:39:53 np0005548788.localdomain sudo[182517]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Reloading rules
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Collecting garbage unconditionally...
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Finished loading, compiling and executing 5 rules
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Reloading rules
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Collecting garbage unconditionally...
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 09:39:53 np0005548788.localdomain polkitd[1037]: Finished loading, compiling and executing 5 rules
Dec 06 09:39:55 np0005548788.localdomain groupadd[182715]: group added to /etc/group: name=ceph, GID=167
Dec 06 09:39:55 np0005548788.localdomain groupadd[182715]: group added to /etc/gshadow: name=ceph
Dec 06 09:39:55 np0005548788.localdomain groupadd[182715]: new group: name=ceph, GID=167
Dec 06 09:39:55 np0005548788.localdomain useradd[182721]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 06 09:39:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60976 DF PROTO=TCP SPT=35808 DPT=9102 SEQ=1017983968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E623B00000000001030307) 
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 06 09:39:58 np0005548788.localdomain sshd[119013]: Received signal 15; terminating.
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: sshd.service: Consumed 1.811s CPU time, read 32.0K from disk, written 0B to disk.
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 06 09:39:58 np0005548788.localdomain sshd[183358]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:58 np0005548788.localdomain sshd[183358]: Server listening on 0.0.0.0 port 22.
Dec 06 09:39:58 np0005548788.localdomain sshd[183358]: Server listening on :: port 22.
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30782 DF PROTO=TCP SPT=36308 DPT=9100 SEQ=2092101870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E62DF00000000001030307) 
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:40:00 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:00 np0005548788.localdomain systemd-rc-local-generator[183585]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:00 np0005548788.localdomain systemd-sysv-generator[183588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:40:01 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:40:01 np0005548788.localdomain auditd[728]: Error receiving audit netlink packet (No buffer space available)
Dec 06 09:40:03 np0005548788.localdomain sudo[163897]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:04 np0005548788.localdomain sudo[187540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqryjtlewqczawekytnsdyswycwpuwxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014003.9609802-989-154380763871424/AnsiballZ_systemd.py
Dec 06 09:40:04 np0005548788.localdomain sudo[187540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27909 DF PROTO=TCP SPT=44898 DPT=9882 SEQ=3453933152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E643F00000000001030307) 
Dec 06 09:40:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60977 DF PROTO=TCP SPT=35808 DPT=9102 SEQ=1017983968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E643F00000000001030307) 
Dec 06 09:40:04 np0005548788.localdomain python3.9[187565]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:04 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:05 np0005548788.localdomain systemd-rc-local-generator[187846]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:05 np0005548788.localdomain systemd-sysv-generator[187851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548788.localdomain sudo[187540]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:05 np0005548788.localdomain sudo[188396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nybxqghmejjwlnqvkhlxsolzsworrork ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014005.3989413-989-148422010556368/AnsiballZ_systemd.py
Dec 06 09:40:05 np0005548788.localdomain sudo[188396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:05 np0005548788.localdomain python3.9[188416]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:06 np0005548788.localdomain systemd-rc-local-generator[188744]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:06 np0005548788.localdomain systemd-sysv-generator[188750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548788.localdomain sudo[188396]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:06 np0005548788.localdomain sudo[189318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsbvnrdmblyndgabshkvfhvyljuivukp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014006.5459285-989-141186251426828/AnsiballZ_systemd.py
Dec 06 09:40:06 np0005548788.localdomain sudo[189318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:07 np0005548788.localdomain python3.9[189351]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:07 np0005548788.localdomain systemd-rc-local-generator[189631]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:07 np0005548788.localdomain systemd-sysv-generator[189634]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548788.localdomain sudo[189318]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2033 DF PROTO=TCP SPT=54604 DPT=9101 SEQ=1855133639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E650300000000001030307) 
Dec 06 09:40:08 np0005548788.localdomain sudo[190033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnjhcnndfapjsbepjmxnbmwsqpertwsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014008.1986098-989-19799939713315/AnsiballZ_systemd.py
Dec 06 09:40:08 np0005548788.localdomain sudo[190033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:08 np0005548788.localdomain python3.9[190049]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:08 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:08 np0005548788.localdomain systemd-sysv-generator[190217]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:08 np0005548788.localdomain systemd-rc-local-generator[190213]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548788.localdomain sudo[190033]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:40:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 5302 writes, 23K keys, 5302 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5302 writes, 773 syncs, 6.86 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d3569610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x5648d35682d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 09:40:09 np0005548788.localdomain sudo[190588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azlavyiaojetwzjcsrzljetblsvakkho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014009.3841443-1076-250571768292260/AnsiballZ_systemd.py
Dec 06 09:40:09 np0005548788.localdomain sudo[190588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:09 np0005548788.localdomain python3.9[190606]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:10 np0005548788.localdomain systemd-rc-local-generator[190834]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:10 np0005548788.localdomain systemd-sysv-generator[190838]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548788.localdomain sudo[190588]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:10 np0005548788.localdomain sshd[191088]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:40:10 np0005548788.localdomain sudo[191159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfaemdvbgmrmtfvmojzwojkyfgpoqlzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014010.5190003-1076-145930840063764/AnsiballZ_systemd.py
Dec 06 09:40:10 np0005548788.localdomain sudo[191159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:11 np0005548788.localdomain python3.9[191176]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:11 np0005548788.localdomain systemd-sysv-generator[191390]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:11 np0005548788.localdomain systemd-rc-local-generator[191384]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548788.localdomain sudo[191159]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:11 np0005548788.localdomain sudo[191755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynyxujffwbtjobnywbhsyihieosszphj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014011.6382735-1076-31774485864523/AnsiballZ_systemd.py
Dec 06 09:40:11 np0005548788.localdomain sudo[191755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:11 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2034 DF PROTO=TCP SPT=54604 DPT=9101 SEQ=1855133639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E65FF10000000001030307) 
Dec 06 09:40:12 np0005548788.localdomain python3.9[191772]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:12 np0005548788.localdomain systemd-rc-local-generator[192000]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:12 np0005548788.localdomain systemd-sysv-generator[192009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548788.localdomain sudo[191755]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:13 np0005548788.localdomain sudo[192347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-begblxgmyrnsbvhiqubjxcnglogiuhys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014012.8239405-1076-24089069356786/AnsiballZ_systemd.py
Dec 06 09:40:13 np0005548788.localdomain sudo[192347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:13 np0005548788.localdomain python3.9[192363]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:13 np0005548788.localdomain sudo[192347]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7517 DF PROTO=TCP SPT=34240 DPT=9100 SEQ=1111503314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6662F0000000001030307) 
Dec 06 09:40:13 np0005548788.localdomain sudo[192778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgpdczcuuodkezvodebaosawgyewgvvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014013.5722144-1076-118017290761579/AnsiballZ_systemd.py
Dec 06 09:40:13 np0005548788.localdomain sudo[192778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:40:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.2 total, 600.0 interval
                                                          Cumulative writes: 5340 writes, 23K keys, 5340 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5340 writes, 664 syncs, 8.04 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f03610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55c536f022d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 09:40:14 np0005548788.localdomain python3.9[192788]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:15 np0005548788.localdomain systemd-rc-local-generator[193338]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:15 np0005548788.localdomain systemd-sysv-generator[193342]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Consumed 17.554s CPU time.
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: run-r3ac2b93b28c04f1783d70f98a17c88c7.service: Deactivated successfully.
Dec 06 09:40:15 np0005548788.localdomain systemd[1]: run-r31c064539ff842658bd24a0bffb21e4f.service: Deactivated successfully.
Dec 06 09:40:15 np0005548788.localdomain sudo[192778]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7519 DF PROTO=TCP SPT=34240 DPT=9100 SEQ=1111503314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E672310000000001030307) 
Dec 06 09:40:17 np0005548788.localdomain sudo[193468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrunlakrjncxsxytihvlweazodinvuoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014017.5417705-1184-169758440829217/AnsiballZ_systemd.py
Dec 06 09:40:17 np0005548788.localdomain sudo[193468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:18 np0005548788.localdomain python3.9[193470]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:40:18 np0005548788.localdomain systemd-sysv-generator[193500]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:18 np0005548788.localdomain systemd-rc-local-generator[193497]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548788.localdomain sshd[191088]: Connection closed by 101.47.142.76 port 40352 [preauth]
Dec 06 09:40:18 np0005548788.localdomain sudo[193468]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26067 DF PROTO=TCP SPT=32886 DPT=9102 SEQ=1113233810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E67D350000000001030307) 
Dec 06 09:40:19 np0005548788.localdomain sudo[193617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owfngabisuaovfrthjevphhdqxuureuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014019.6746013-1208-15025325245965/AnsiballZ_systemd.py
Dec 06 09:40:19 np0005548788.localdomain sudo[193617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:20 np0005548788.localdomain python3.9[193619]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:40:20 np0005548788.localdomain sudo[193617]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:20 np0005548788.localdomain podman[193622]: 2025-12-06 09:40:20.412074968 +0000 UTC m=+0.099519240 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:40:20 np0005548788.localdomain podman[193622]: 2025-12-06 09:40:20.449592246 +0000 UTC m=+0.137036488 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 09:40:20 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:40:20 np0005548788.localdomain sudo[193753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sljkziasarssnzknglfwlfdijbjfvkju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014020.4973304-1208-196331823426967/AnsiballZ_systemd.py
Dec 06 09:40:20 np0005548788.localdomain sudo[193753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:21 np0005548788.localdomain python3.9[193755]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:21 np0005548788.localdomain sudo[193753]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:21 np0005548788.localdomain sudo[193866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehrplgppysqivzmgokultofsgqqkuwri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014021.6191022-1208-22258661554268/AnsiballZ_systemd.py
Dec 06 09:40:21 np0005548788.localdomain sudo[193866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:40:22 np0005548788.localdomain systemd[1]: tmp-crun.WLnc9R.mount: Deactivated successfully.
Dec 06 09:40:22 np0005548788.localdomain podman[193869]: 2025-12-06 09:40:22.015298429 +0000 UTC m=+0.084681849 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:40:22 np0005548788.localdomain podman[193869]: 2025-12-06 09:40:22.049792661 +0000 UTC m=+0.119176031 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:40:22 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:40:22 np0005548788.localdomain python3.9[193868]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:22 np0005548788.localdomain sudo[193866]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26069 DF PROTO=TCP SPT=32886 DPT=9102 SEQ=1113233810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E689300000000001030307) 
Dec 06 09:40:22 np0005548788.localdomain sudo[193997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvxkzgifpjabqzucpzizpcamuhluduxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014022.4028587-1208-120281341837795/AnsiballZ_systemd.py
Dec 06 09:40:22 np0005548788.localdomain sudo[193997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:22 np0005548788.localdomain python3.9[193999]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:23 np0005548788.localdomain sudo[193997]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:23 np0005548788.localdomain sudo[194110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txvhfacvfgyfpsoxsjjfqbinjrzhdnfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014023.138503-1208-34637918696587/AnsiballZ_systemd.py
Dec 06 09:40:23 np0005548788.localdomain sudo[194110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:23 np0005548788.localdomain python3.9[194112]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:23 np0005548788.localdomain sudo[194110]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:24 np0005548788.localdomain sudo[194223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abzfwrnbhidytvuccwldxtlbavhegndr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014023.8510337-1208-184303900392063/AnsiballZ_systemd.py
Dec 06 09:40:24 np0005548788.localdomain sudo[194223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:24 np0005548788.localdomain python3.9[194225]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:25 np0005548788.localdomain sudo[194223]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:25 np0005548788.localdomain sudo[194336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxgtunlbpxdgoupxilkhomycaltbtgqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014025.6284466-1208-33884308885408/AnsiballZ_systemd.py
Dec 06 09:40:25 np0005548788.localdomain sudo[194336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:26 np0005548788.localdomain python3.9[194338]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26070 DF PROTO=TCP SPT=32886 DPT=9102 SEQ=1113233810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E698F00000000001030307) 
Dec 06 09:40:27 np0005548788.localdomain sudo[194336]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:27 np0005548788.localdomain sudo[194449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbwxisraqiyhjnrjcebdeupmyuhrodwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014027.470357-1208-246585145489506/AnsiballZ_systemd.py
Dec 06 09:40:27 np0005548788.localdomain sudo[194449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:28 np0005548788.localdomain python3.9[194451]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:28 np0005548788.localdomain sudo[194449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:28 np0005548788.localdomain sudo[194562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnbwmzrvxzdqafwzxwrwhijrbcwnvswr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014028.3057191-1208-271873233633304/AnsiballZ_systemd.py
Dec 06 09:40:28 np0005548788.localdomain sudo[194562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7521 DF PROTO=TCP SPT=34240 DPT=9100 SEQ=1111503314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6A1F00000000001030307) 
Dec 06 09:40:28 np0005548788.localdomain python3.9[194564]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:28 np0005548788.localdomain sudo[194562]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:29 np0005548788.localdomain sudo[194675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xynhumsweoumnfncyriwdnlgbnqnfcyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014029.0878143-1208-177678966382928/AnsiballZ_systemd.py
Dec 06 09:40:29 np0005548788.localdomain sudo[194675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:29 np0005548788.localdomain python3.9[194677]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:29 np0005548788.localdomain sudo[194675]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:30 np0005548788.localdomain sudo[194788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlzfbrodpcjkeoxsiiidstivuxdepgnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014029.8580244-1208-232183150394440/AnsiballZ_systemd.py
Dec 06 09:40:30 np0005548788.localdomain sudo[194788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:30 np0005548788.localdomain python3.9[194790]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:30 np0005548788.localdomain sudo[194788]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:31 np0005548788.localdomain sudo[194901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkyazgdlaceryxlegcqemeosydfonmvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014030.6760774-1208-108851014019639/AnsiballZ_systemd.py
Dec 06 09:40:31 np0005548788.localdomain sudo[194901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:32 np0005548788.localdomain python3.9[194903]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:32 np0005548788.localdomain sudo[194901]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:32 np0005548788.localdomain sudo[195014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eaanmzauhuxicnssozcgsymtwudkasjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014032.2624478-1208-107584241068747/AnsiballZ_systemd.py
Dec 06 09:40:32 np0005548788.localdomain sudo[195014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:32 np0005548788.localdomain python3.9[195016]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:33 np0005548788.localdomain sudo[195014]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:34 np0005548788.localdomain sudo[195127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jctmnfeddyrgpnwblcqxisvzwxbvlftk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014034.3435752-1208-107335077831354/AnsiballZ_systemd.py
Dec 06 09:40:34 np0005548788.localdomain sudo[195127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=227 DF PROTO=TCP SPT=50032 DPT=9101 SEQ=3413720136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6B93F0000000001030307) 
Dec 06 09:40:34 np0005548788.localdomain python3.9[195129]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16454 DF PROTO=TCP SPT=33028 DPT=9882 SEQ=3070406702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6B9F00000000001030307) 
Dec 06 09:40:34 np0005548788.localdomain sudo[195127]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:35 np0005548788.localdomain sudo[195240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmoczxjtwtqiivanqxtceujnzmfznfaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014035.4860797-1514-184277033698519/AnsiballZ_file.py
Dec 06 09:40:35 np0005548788.localdomain sudo[195240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:35 np0005548788.localdomain python3.9[195242]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:35 np0005548788.localdomain sudo[195240]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:36 np0005548788.localdomain sudo[195350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neuixsfeggbnobpycmhctrzfneuaktcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014036.111857-1514-75658617065805/AnsiballZ_file.py
Dec 06 09:40:36 np0005548788.localdomain sudo[195350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:36 np0005548788.localdomain python3.9[195352]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:36 np0005548788.localdomain sudo[195350]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:37 np0005548788.localdomain sudo[195460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fahkkmpjhreeecejbedryoifpldgwita ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014036.7741425-1514-145494661383161/AnsiballZ_file.py
Dec 06 09:40:37 np0005548788.localdomain sudo[195460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:37 np0005548788.localdomain python3.9[195462]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:37 np0005548788.localdomain sudo[195460]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:37 np0005548788.localdomain sudo[195570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhrgajseihukrxwqviwbzkvtpxzxyojr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014037.3791397-1514-33072895924089/AnsiballZ_file.py
Dec 06 09:40:37 np0005548788.localdomain sudo[195570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=229 DF PROTO=TCP SPT=50032 DPT=9101 SEQ=3413720136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6C5300000000001030307) 
Dec 06 09:40:37 np0005548788.localdomain python3.9[195572]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:37 np0005548788.localdomain sudo[195570]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:38 np0005548788.localdomain sudo[195680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgynohaueagmogskmuxeusfpegpjtlsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014038.0448723-1514-144449827803192/AnsiballZ_file.py
Dec 06 09:40:38 np0005548788.localdomain sudo[195680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:38 np0005548788.localdomain python3.9[195682]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:38 np0005548788.localdomain sudo[195680]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:38 np0005548788.localdomain sudo[195790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjkqajkcpuivepmgzfwhtejoelbyjzdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014038.6977048-1514-144554818259859/AnsiballZ_file.py
Dec 06 09:40:38 np0005548788.localdomain sudo[195790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:39 np0005548788.localdomain python3.9[195792]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:39 np0005548788.localdomain sudo[195790]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:39 np0005548788.localdomain sudo[195900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnlegtehgpbadewncpfieehibrvatmgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014039.3421903-1643-235133601998218/AnsiballZ_stat.py
Dec 06 09:40:39 np0005548788.localdomain sudo[195900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:39 np0005548788.localdomain python3.9[195902]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:40 np0005548788.localdomain sudo[195900]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:40 np0005548788.localdomain sudo[195990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvgnahoajzdubyffebaypvqoxbslagaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014039.3421903-1643-235133601998218/AnsiballZ_copy.py
Dec 06 09:40:40 np0005548788.localdomain sudo[195990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5357 DF PROTO=TCP SPT=38464 DPT=9105 SEQ=1322947357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6CFF00000000001030307) 
Dec 06 09:40:40 np0005548788.localdomain python3.9[195992]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014039.3421903-1643-235133601998218/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:40 np0005548788.localdomain sudo[195990]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:41 np0005548788.localdomain sudo[196100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtgvsprlfjbpbufiyzigaiqobbqmqrmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014040.848167-1643-2981795963699/AnsiballZ_stat.py
Dec 06 09:40:41 np0005548788.localdomain sudo[196100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:41 np0005548788.localdomain python3.9[196102]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:41 np0005548788.localdomain sudo[196100]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:41 np0005548788.localdomain sudo[196190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjskzekzwqaruhqwburebvpswfpgtseo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014040.848167-1643-2981795963699/AnsiballZ_copy.py
Dec 06 09:40:41 np0005548788.localdomain sudo[196190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:41 np0005548788.localdomain python3.9[196192]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014040.848167-1643-2981795963699/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:41 np0005548788.localdomain sudo[196190]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:42 np0005548788.localdomain sudo[196300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsdlahesufwlkwvhhiaswondibshwjhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014042.0361996-1643-198568423048300/AnsiballZ_stat.py
Dec 06 09:40:42 np0005548788.localdomain sudo[196300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:42 np0005548788.localdomain python3.9[196302]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:42 np0005548788.localdomain sudo[196300]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:42 np0005548788.localdomain sudo[196390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piurwllbrjgnhdzoqlkdkrbjlpcytlvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014042.0361996-1643-198568423048300/AnsiballZ_copy.py
Dec 06 09:40:42 np0005548788.localdomain sudo[196390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:43 np0005548788.localdomain python3.9[196392]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014042.0361996-1643-198568423048300/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:43 np0005548788.localdomain sudo[196390]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:43 np0005548788.localdomain sudo[196500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opecmnibilcrrdnbiqwhxdcgbxuxsdrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014043.2859209-1643-221983521855523/AnsiballZ_stat.py
Dec 06 09:40:43 np0005548788.localdomain sudo[196500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14228 DF PROTO=TCP SPT=33394 DPT=9100 SEQ=1082742886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6DB680000000001030307) 
Dec 06 09:40:43 np0005548788.localdomain python3.9[196502]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:43 np0005548788.localdomain sudo[196500]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:44 np0005548788.localdomain sudo[196590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jenjcitsrylszedccakikmrgggylroju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014043.2859209-1643-221983521855523/AnsiballZ_copy.py
Dec 06 09:40:44 np0005548788.localdomain sudo[196590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:44 np0005548788.localdomain python3.9[196592]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014043.2859209-1643-221983521855523/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:44 np0005548788.localdomain sudo[196590]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:44 np0005548788.localdomain sudo[196700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opqoluklvmmqbqqcnuabcvtcnrtxoxmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014044.465137-1643-70663692628358/AnsiballZ_stat.py
Dec 06 09:40:44 np0005548788.localdomain sudo[196700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:44 np0005548788.localdomain python3.9[196702]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:44 np0005548788.localdomain sudo[196700]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:45 np0005548788.localdomain sudo[196790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrmutyetcancavcfizluhheyxuenbrkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014044.465137-1643-70663692628358/AnsiballZ_copy.py
Dec 06 09:40:45 np0005548788.localdomain sudo[196790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:45 np0005548788.localdomain python3.9[196792]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014044.465137-1643-70663692628358/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:45 np0005548788.localdomain sudo[196790]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:45 np0005548788.localdomain sudo[196900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtwrjyrvkgtsllmknrgqufcsinpqsswr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014045.6597233-1643-117902440856349/AnsiballZ_stat.py
Dec 06 09:40:45 np0005548788.localdomain sudo[196900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:46 np0005548788.localdomain python3.9[196902]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:46 np0005548788.localdomain sudo[196900]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:46 np0005548788.localdomain sudo[196990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdbzlwxapxjuqfoimegazftiiqpxpevn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014045.6597233-1643-117902440856349/AnsiballZ_copy.py
Dec 06 09:40:46 np0005548788.localdomain sudo[196990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14230 DF PROTO=TCP SPT=33394 DPT=9100 SEQ=1082742886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6E7700000000001030307) 
Dec 06 09:40:46 np0005548788.localdomain python3.9[196992]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014045.6597233-1643-117902440856349/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:46 np0005548788.localdomain sudo[196990]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:47 np0005548788.localdomain sudo[197100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjeahxfovnaeyiupvpcbjogsgvabkcwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014046.793152-1643-183419689935802/AnsiballZ_stat.py
Dec 06 09:40:47 np0005548788.localdomain sudo[197100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:47 np0005548788.localdomain python3.9[197102]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:47 np0005548788.localdomain sudo[197100]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:40:47.396 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:40:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:40:47.397 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:40:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:40:47.397 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:40:47 np0005548788.localdomain sudo[197188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyjypzyuutliwinsceqqackduydrfept ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014046.793152-1643-183419689935802/AnsiballZ_copy.py
Dec 06 09:40:47 np0005548788.localdomain sudo[197188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:47 np0005548788.localdomain python3.9[197190]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014046.793152-1643-183419689935802/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:47 np0005548788.localdomain sudo[197188]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:48 np0005548788.localdomain sudo[197298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erdsefpjqdcgrawjrtjruoiduhegyjkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014047.9723926-1643-208061350208537/AnsiballZ_stat.py
Dec 06 09:40:48 np0005548788.localdomain sudo[197298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:48 np0005548788.localdomain python3.9[197300]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:48 np0005548788.localdomain sudo[197298]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:48 np0005548788.localdomain sudo[197388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycmgvpjdgmnbuvsqzgzclotrhrbyfbjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014047.9723926-1643-208061350208537/AnsiballZ_copy.py
Dec 06 09:40:48 np0005548788.localdomain sudo[197388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:48 np0005548788.localdomain python3.9[197390]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014047.9723926-1643-208061350208537/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:48 np0005548788.localdomain sudo[197388]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9406 DF PROTO=TCP SPT=42512 DPT=9102 SEQ=1178424866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6F2640000000001030307) 
Dec 06 09:40:49 np0005548788.localdomain sudo[197498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbidlnionxejztbvsjgqjugwkmbschae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014049.2102706-1985-164426083035038/AnsiballZ_file.py
Dec 06 09:40:49 np0005548788.localdomain sudo[197498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:49 np0005548788.localdomain python3.9[197500]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:49 np0005548788.localdomain sudo[197498]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:50 np0005548788.localdomain sudo[197608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-miipfrasvodarhudbrpvsgtkyvlycxxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014049.8531115-2009-195600471296895/AnsiballZ_file.py
Dec 06 09:40:50 np0005548788.localdomain sudo[197608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:50 np0005548788.localdomain python3.9[197610]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:50 np0005548788.localdomain sudo[197608]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:50 np0005548788.localdomain sudo[197718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpokcaejoplesaonfudtitmisuzysptd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014050.467306-2009-199466364966759/AnsiballZ_file.py
Dec 06 09:40:50 np0005548788.localdomain sudo[197718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:40:50 np0005548788.localdomain podman[197721]: 2025-12-06 09:40:50.930603621 +0000 UTC m=+0.088982952 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 09:40:50 np0005548788.localdomain podman[197721]: 2025-12-06 09:40:50.96518119 +0000 UTC m=+0.123560591 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:40:50 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:40:51 np0005548788.localdomain python3.9[197720]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:51 np0005548788.localdomain sudo[197718]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:51 np0005548788.localdomain sudo[197853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-foizpacwsofiwmzlqnbaqunuuzcuqzpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014051.1592112-2009-266275064106432/AnsiballZ_file.py
Dec 06 09:40:51 np0005548788.localdomain sudo[197853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:51 np0005548788.localdomain python3.9[197855]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:51 np0005548788.localdomain sudo[197853]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:52 np0005548788.localdomain sudo[197963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkjrozunxxmhyvfwuwfntbngbyutgile ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014051.8171268-2009-257194681883964/AnsiballZ_file.py
Dec 06 09:40:52 np0005548788.localdomain sudo[197963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:40:52 np0005548788.localdomain podman[197966]: 2025-12-06 09:40:52.179305666 +0000 UTC m=+0.087761018 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:40:52 np0005548788.localdomain systemd[1]: tmp-crun.FVB9v7.mount: Deactivated successfully.
Dec 06 09:40:52 np0005548788.localdomain podman[197966]: 2025-12-06 09:40:52.213727721 +0000 UTC m=+0.122183013 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 09:40:52 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:40:52 np0005548788.localdomain python3.9[197965]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:52 np0005548788.localdomain sudo[197963]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9408 DF PROTO=TCP SPT=42512 DPT=9102 SEQ=1178424866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E6FE700000000001030307) 
Dec 06 09:40:52 np0005548788.localdomain sudo[198089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oomrnvwfypidjjntgdpakorvdqvzfnum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014052.416729-2009-260171659464276/AnsiballZ_file.py
Dec 06 09:40:52 np0005548788.localdomain sudo[198089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:52 np0005548788.localdomain python3.9[198091]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:52 np0005548788.localdomain sudo[198089]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:53 np0005548788.localdomain sudo[198199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efdytnjmgbwosnduslqjznmiusyunafb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014053.0062761-2009-198284150128196/AnsiballZ_file.py
Dec 06 09:40:53 np0005548788.localdomain sudo[198199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:53 np0005548788.localdomain sudo[198202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:40:53 np0005548788.localdomain sudo[198202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:40:53 np0005548788.localdomain sudo[198202]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:53 np0005548788.localdomain python3.9[198201]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:53 np0005548788.localdomain sudo[198220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:40:53 np0005548788.localdomain sudo[198220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:40:53 np0005548788.localdomain sudo[198199]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:53 np0005548788.localdomain sudo[198359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcqabaezqtsxzqtboeynltsuowrdazvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014053.623301-2009-106617916703170/AnsiballZ_file.py
Dec 06 09:40:53 np0005548788.localdomain sudo[198359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:54 np0005548788.localdomain python3.9[198361]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:54 np0005548788.localdomain sudo[198359]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:54 np0005548788.localdomain sudo[198220]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:54 np0005548788.localdomain sudo[198486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvbbrlnzrzhdymlblwacpbkksgswbyqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014054.174922-2009-36715104542754/AnsiballZ_file.py
Dec 06 09:40:54 np0005548788.localdomain sudo[198486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:54 np0005548788.localdomain python3.9[198488]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:54 np0005548788.localdomain sudo[198486]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:54 np0005548788.localdomain sudo[198489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:40:54 np0005548788.localdomain sudo[198489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:40:54 np0005548788.localdomain sudo[198489]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:55 np0005548788.localdomain sudo[198614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abarcedrgxkfghpeecyzjzigedqcaxlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014054.7793658-2009-48483769670150/AnsiballZ_file.py
Dec 06 09:40:55 np0005548788.localdomain sudo[198614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:55 np0005548788.localdomain python3.9[198616]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:55 np0005548788.localdomain sudo[198614]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:55 np0005548788.localdomain sudo[198724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhhvqzhjlawjfaebocwlnfoydexywcxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014055.385767-2009-40920902226158/AnsiballZ_file.py
Dec 06 09:40:55 np0005548788.localdomain sudo[198724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:55 np0005548788.localdomain python3.9[198726]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:55 np0005548788.localdomain sudo[198724]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:56 np0005548788.localdomain sudo[198834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gexvurqzijghsqbzpyvovznthynomzpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014056.0370533-2009-201867410779014/AnsiballZ_file.py
Dec 06 09:40:56 np0005548788.localdomain sudo[198834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:56 np0005548788.localdomain python3.9[198836]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9409 DF PROTO=TCP SPT=42512 DPT=9102 SEQ=1178424866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E70E300000000001030307) 
Dec 06 09:40:56 np0005548788.localdomain sudo[198834]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:57 np0005548788.localdomain sudo[198944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhnaapotxwfvgftmaislqtinmawxruth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014056.7107613-2009-277212988553956/AnsiballZ_file.py
Dec 06 09:40:57 np0005548788.localdomain sudo[198944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:57 np0005548788.localdomain python3.9[198946]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:57 np0005548788.localdomain sudo[198944]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:57 np0005548788.localdomain sudo[199054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zobeuwtnjxvjiqvsntczqqmavovautvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014057.3498776-2009-47690483869027/AnsiballZ_file.py
Dec 06 09:40:57 np0005548788.localdomain sudo[199054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:57 np0005548788.localdomain python3.9[199056]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:57 np0005548788.localdomain sudo[199054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:58 np0005548788.localdomain sshd[199128]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:40:58 np0005548788.localdomain sudo[199165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjeebtiwlthypoehspjubujazvavklgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014058.0022473-2009-112810142898604/AnsiballZ_file.py
Dec 06 09:40:58 np0005548788.localdomain sudo[199165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14232 DF PROTO=TCP SPT=33394 DPT=9100 SEQ=1082742886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E717F10000000001030307) 
Dec 06 09:40:59 np0005548788.localdomain python3.9[199167]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:59 np0005548788.localdomain sudo[199165]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:59 np0005548788.localdomain sudo[199275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojhpjunjnbjocgymsjrvhxtkfjjgqels ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014059.3268006-2306-280960117483165/AnsiballZ_stat.py
Dec 06 09:40:59 np0005548788.localdomain sudo[199275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:00 np0005548788.localdomain python3.9[199277]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:00 np0005548788.localdomain sudo[199275]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:01 np0005548788.localdomain sudo[199363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nefojqegalumxyiafhwniyllwcvmmkwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014059.3268006-2306-280960117483165/AnsiballZ_copy.py
Dec 06 09:41:01 np0005548788.localdomain sudo[199363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:01 np0005548788.localdomain python3.9[199365]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014059.3268006-2306-280960117483165/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:01 np0005548788.localdomain sudo[199363]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:01 np0005548788.localdomain sudo[199473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itfkmpztpwvtoyecjzduktqqukgtgkue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014061.4311104-2306-204695209741639/AnsiballZ_stat.py
Dec 06 09:41:01 np0005548788.localdomain sudo[199473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:01 np0005548788.localdomain python3.9[199475]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:01 np0005548788.localdomain sudo[199473]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:02 np0005548788.localdomain sudo[199561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zibouewdcigegkexheydhjlvnybhepfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014061.4311104-2306-204695209741639/AnsiballZ_copy.py
Dec 06 09:41:02 np0005548788.localdomain sudo[199561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:02 np0005548788.localdomain python3.9[199563]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014061.4311104-2306-204695209741639/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:02 np0005548788.localdomain sudo[199561]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:02 np0005548788.localdomain sudo[199671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msddzdfmxgyaomcfgzktpxwibctcorom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014062.6287906-2306-178842606870367/AnsiballZ_stat.py
Dec 06 09:41:02 np0005548788.localdomain sudo[199671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:03 np0005548788.localdomain python3.9[199673]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:03 np0005548788.localdomain sudo[199671]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:03 np0005548788.localdomain sudo[199759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clegqkzihokrglrulsthatrpmuyfaeqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014062.6287906-2306-178842606870367/AnsiballZ_copy.py
Dec 06 09:41:03 np0005548788.localdomain sudo[199759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:03 np0005548788.localdomain python3.9[199761]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014062.6287906-2306-178842606870367/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:03 np0005548788.localdomain sudo[199759]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:04 np0005548788.localdomain sudo[199869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-encrppibirhigizvltukggafzecdrhfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014063.908726-2306-131031812661806/AnsiballZ_stat.py
Dec 06 09:41:04 np0005548788.localdomain sudo[199869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:04 np0005548788.localdomain python3.9[199871]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:04 np0005548788.localdomain sudo[199869]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9410 DF PROTO=TCP SPT=42512 DPT=9102 SEQ=1178424866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E72DF00000000001030307) 
Dec 06 09:41:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40195 DF PROTO=TCP SPT=43894 DPT=9101 SEQ=240555840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E72E6E0000000001030307) 
Dec 06 09:41:04 np0005548788.localdomain sudo[199957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmjmfuxlkpswbhfayubjkqjllmsqyozy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014063.908726-2306-131031812661806/AnsiballZ_copy.py
Dec 06 09:41:04 np0005548788.localdomain sudo[199957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:05 np0005548788.localdomain python3.9[199959]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014063.908726-2306-131031812661806/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:05 np0005548788.localdomain sudo[199957]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:05 np0005548788.localdomain sudo[200067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jggxuqgcrvwtlimmcsvreofltehksmnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014065.2818186-2306-56987972123331/AnsiballZ_stat.py
Dec 06 09:41:05 np0005548788.localdomain sudo[200067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:05 np0005548788.localdomain python3.9[200069]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:05 np0005548788.localdomain sudo[200067]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:06 np0005548788.localdomain sudo[200155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwhbihkojmttkggkaidsxqbhddvwutap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014065.2818186-2306-56987972123331/AnsiballZ_copy.py
Dec 06 09:41:06 np0005548788.localdomain sudo[200155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:06 np0005548788.localdomain python3.9[200157]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014065.2818186-2306-56987972123331/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:06 np0005548788.localdomain sudo[200155]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:06 np0005548788.localdomain sudo[200265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiwdtwxjzliywyhouwaowoipfcphrkjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014066.467766-2306-278873903962819/AnsiballZ_stat.py
Dec 06 09:41:06 np0005548788.localdomain sudo[200265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:06 np0005548788.localdomain python3.9[200267]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:07 np0005548788.localdomain sudo[200265]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:07 np0005548788.localdomain sudo[200353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bitgowaxjpobpbbvxttgsihtwbhstghg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014066.467766-2306-278873903962819/AnsiballZ_copy.py
Dec 06 09:41:07 np0005548788.localdomain sudo[200353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:07 np0005548788.localdomain python3.9[200355]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014066.467766-2306-278873903962819/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:07 np0005548788.localdomain sudo[200353]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40197 DF PROTO=TCP SPT=43894 DPT=9101 SEQ=240555840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E73A710000000001030307) 
Dec 06 09:41:08 np0005548788.localdomain sudo[200463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvaufdischbulezrplqdntazxlgfwdgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014067.7175543-2306-269768371323876/AnsiballZ_stat.py
Dec 06 09:41:08 np0005548788.localdomain sudo[200463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:08 np0005548788.localdomain python3.9[200465]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:08 np0005548788.localdomain sudo[200463]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:08 np0005548788.localdomain sudo[200551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewgltegnzqddsbdtxnkqzagjmxiwydkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014067.7175543-2306-269768371323876/AnsiballZ_copy.py
Dec 06 09:41:08 np0005548788.localdomain sudo[200551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:08 np0005548788.localdomain python3.9[200553]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014067.7175543-2306-269768371323876/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:08 np0005548788.localdomain sudo[200551]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:09 np0005548788.localdomain sudo[200661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxvvbhmcfeijpfztdfgemrivsetoyuoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014068.8598506-2306-214251627551430/AnsiballZ_stat.py
Dec 06 09:41:09 np0005548788.localdomain sudo[200661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:09 np0005548788.localdomain sshd[199128]: Connection closed by 45.78.194.186 port 33544 [preauth]
Dec 06 09:41:09 np0005548788.localdomain python3.9[200663]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:09 np0005548788.localdomain sudo[200661]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:09 np0005548788.localdomain sudo[200750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcxtazfqzchydwrgbgorkysxscdsfpaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014068.8598506-2306-214251627551430/AnsiballZ_copy.py
Dec 06 09:41:09 np0005548788.localdomain sudo[200750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:09 np0005548788.localdomain python3.9[200752]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014068.8598506-2306-214251627551430/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:09 np0005548788.localdomain sudo[200750]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:10 np0005548788.localdomain sudo[200860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkvmvjxgawwbrxofjmvlncerwtrwwdqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014070.2557828-2306-177526989797306/AnsiballZ_stat.py
Dec 06 09:41:10 np0005548788.localdomain sudo[200860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:10 np0005548788.localdomain python3.9[200862]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:10 np0005548788.localdomain sudo[200860]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40840 DF PROTO=TCP SPT=48568 DPT=9105 SEQ=261571998 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E745F00000000001030307) 
Dec 06 09:41:11 np0005548788.localdomain sudo[200948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxbjyxfbklajjkbmopgnzcmjmbbufjmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014070.2557828-2306-177526989797306/AnsiballZ_copy.py
Dec 06 09:41:11 np0005548788.localdomain sudo[200948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:11 np0005548788.localdomain python3.9[200950]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014070.2557828-2306-177526989797306/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:11 np0005548788.localdomain sudo[200948]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:11 np0005548788.localdomain sudo[201058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnkdibjeebuavvhaijyfurbalcvezuls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014071.446986-2306-149166586610904/AnsiballZ_stat.py
Dec 06 09:41:11 np0005548788.localdomain sudo[201058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:11 np0005548788.localdomain python3.9[201060]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:11 np0005548788.localdomain sudo[201058]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:12 np0005548788.localdomain sudo[201146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlyvzrrskizytfnabwoucyfmetqgvnkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014071.446986-2306-149166586610904/AnsiballZ_copy.py
Dec 06 09:41:12 np0005548788.localdomain sudo[201146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:12 np0005548788.localdomain python3.9[201148]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014071.446986-2306-149166586610904/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:12 np0005548788.localdomain sudo[201146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:12 np0005548788.localdomain sudo[201256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdufypwzupfxmlojjabqkydjojimsstq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014072.640457-2306-88857510406029/AnsiballZ_stat.py
Dec 06 09:41:12 np0005548788.localdomain sudo[201256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:13 np0005548788.localdomain python3.9[201258]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:13 np0005548788.localdomain sudo[201256]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32574 DF PROTO=TCP SPT=41318 DPT=9100 SEQ=1342159501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7508F0000000001030307) 
Dec 06 09:41:13 np0005548788.localdomain sudo[201344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myqauilrdlxwremzbnjtxrcmsbmvdnia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014072.640457-2306-88857510406029/AnsiballZ_copy.py
Dec 06 09:41:13 np0005548788.localdomain sudo[201344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:13 np0005548788.localdomain python3.9[201346]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014072.640457-2306-88857510406029/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:13 np0005548788.localdomain sudo[201344]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 np0005548788.localdomain sudo[201454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwmxnydvtfphsfmmrmkctrtedttuxkjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014073.916737-2306-213063196396097/AnsiballZ_stat.py
Dec 06 09:41:14 np0005548788.localdomain sudo[201454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:14 np0005548788.localdomain python3.9[201456]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:14 np0005548788.localdomain sudo[201454]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 np0005548788.localdomain sudo[201542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdvrheeemtiomlxijaqtpmeyauwgiqyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014073.916737-2306-213063196396097/AnsiballZ_copy.py
Dec 06 09:41:14 np0005548788.localdomain sudo[201542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:14 np0005548788.localdomain python3.9[201544]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014073.916737-2306-213063196396097/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:14 np0005548788.localdomain sudo[201542]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 np0005548788.localdomain sudo[201652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piktdgbwbrzqowboarpcslnmiukaajxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014075.1042414-2306-31886528568961/AnsiballZ_stat.py
Dec 06 09:41:15 np0005548788.localdomain sudo[201652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:15 np0005548788.localdomain python3.9[201654]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:15 np0005548788.localdomain sudo[201652]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:16 np0005548788.localdomain sudo[201740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcilsyukkzssfnvofgcoqmwvgkbwdzfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014075.1042414-2306-31886528568961/AnsiballZ_copy.py
Dec 06 09:41:16 np0005548788.localdomain sudo[201740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:16 np0005548788.localdomain python3.9[201742]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014075.1042414-2306-31886528568961/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:16 np0005548788.localdomain sudo[201740]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32576 DF PROTO=TCP SPT=41318 DPT=9100 SEQ=1342159501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E75CB00000000001030307) 
Dec 06 09:41:16 np0005548788.localdomain sudo[201850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fygmbulwfrvxehibgfehbizrxqgspplr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014076.4680316-2306-109032820031903/AnsiballZ_stat.py
Dec 06 09:41:16 np0005548788.localdomain sudo[201850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:16 np0005548788.localdomain python3.9[201852]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:16 np0005548788.localdomain sudo[201850]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:17 np0005548788.localdomain sudo[201938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snjgtcqcqwryopumicenjpjwueutevyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014076.4680316-2306-109032820031903/AnsiballZ_copy.py
Dec 06 09:41:17 np0005548788.localdomain sudo[201938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:17 np0005548788.localdomain python3.9[201940]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014076.4680316-2306-109032820031903/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:17 np0005548788.localdomain sudo[201938]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:18 np0005548788.localdomain python3.9[202048]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31460 DF PROTO=TCP SPT=60224 DPT=9102 SEQ=3419504907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E767950000000001030307) 
Dec 06 09:41:19 np0005548788.localdomain sudo[202159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqzndafupzbbcpdwzegwfffihqugaiqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014079.1468577-2924-19863766876138/AnsiballZ_seboolean.py
Dec 06 09:41:19 np0005548788.localdomain sudo[202159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:19 np0005548788.localdomain python3.9[202161]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 06 09:41:20 np0005548788.localdomain sudo[202159]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:41:21 np0005548788.localdomain podman[202201]: 2025-12-06 09:41:21.273150423 +0000 UTC m=+0.089390123 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:41:21 np0005548788.localdomain podman[202201]: 2025-12-06 09:41:21.316655091 +0000 UTC m=+0.132894821 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:41:21 np0005548788.localdomain sudo[202293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlqiyzixxawxnbuyazdvyqdjkozfhziq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014081.1487596-2954-179377650546790/AnsiballZ_systemd.py
Dec 06 09:41:21 np0005548788.localdomain sudo[202293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:21 np0005548788.localdomain python3.9[202295]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:41:21 np0005548788.localdomain systemd-rc-local-generator[202316]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:21 np0005548788.localdomain systemd-sysv-generator[202323]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: Starting libvirt logging daemon socket...
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: Starting libvirt logging daemon...
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: Started libvirt logging daemon.
Dec 06 09:41:22 np0005548788.localdomain sudo[202293]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31462 DF PROTO=TCP SPT=60224 DPT=9102 SEQ=3419504907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E773B00000000001030307) 
Dec 06 09:41:22 np0005548788.localdomain sudo[202445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siyvbwitzbmmdcgpwqikauowptxsoovm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014082.3518417-2954-206785533988020/AnsiballZ_systemd.py
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:41:22 np0005548788.localdomain sudo[202445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: tmp-crun.NJqWYQ.mount: Deactivated successfully.
Dec 06 09:41:22 np0005548788.localdomain podman[202447]: 2025-12-06 09:41:22.766075097 +0000 UTC m=+0.094551686 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:41:22 np0005548788.localdomain podman[202447]: 2025-12-06 09:41:22.800557435 +0000 UTC m=+0.129034024 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:41:22 np0005548788.localdomain python3.9[202448]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:22 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:41:23 np0005548788.localdomain systemd-sysv-generator[202494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:23 np0005548788.localdomain systemd-rc-local-generator[202490]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 06 09:41:23 np0005548788.localdomain systemd[1]: Started libvirt nodedev daemon.
Dec 06 09:41:23 np0005548788.localdomain sudo[202445]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:23 np0005548788.localdomain sudo[202636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lshltdngywnoyvjmzobzokoequmxfpei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014083.585426-2954-163546367885465/AnsiballZ_systemd.py
Dec 06 09:41:23 np0005548788.localdomain sudo[202636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:24 np0005548788.localdomain python3.9[202638]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:41:24 np0005548788.localdomain systemd-rc-local-generator[202664]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:24 np0005548788.localdomain systemd-sysv-generator[202668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Started libvirt proxy daemon.
Dec 06 09:41:24 np0005548788.localdomain sudo[202636]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:24 np0005548788.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 06 09:41:25 np0005548788.localdomain sudo[202810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geltpbrtsoavhwkwvkkdjgfnjikcthri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014084.7731566-2954-5260961734397/AnsiballZ_systemd.py
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 06 09:41:25 np0005548788.localdomain sudo[202810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:25 np0005548788.localdomain python3.9[202816]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:41:25 np0005548788.localdomain systemd-rc-local-generator[202842]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:25 np0005548788.localdomain systemd-sysv-generator[202845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 06 09:41:25 np0005548788.localdomain systemd[1]: Started libvirt QEMU daemon.
Dec 06 09:41:25 np0005548788.localdomain sudo[202810]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:25 np0005548788.localdomain setroubleshoot[202675]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e959d589-23c5-4def-84b6-ad813de64ac3
Dec 06 09:41:25 np0005548788.localdomain setroubleshoot[202675]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Dec 06 09:41:25 np0005548788.localdomain setroubleshoot[202675]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e959d589-23c5-4def-84b6-ad813de64ac3
Dec 06 09:41:25 np0005548788.localdomain setroubleshoot[202675]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Dec 06 09:41:26 np0005548788.localdomain sudo[202991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngecktmlpogoqtyifwxqnbiznisivvqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014085.939089-2954-279309553126058/AnsiballZ_systemd.py
Dec 06 09:41:26 np0005548788.localdomain sudo[202991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:26 np0005548788.localdomain python3.9[202993]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:41:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31463 DF PROTO=TCP SPT=60224 DPT=9102 SEQ=3419504907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E783710000000001030307) 
Dec 06 09:41:26 np0005548788.localdomain systemd-rc-local-generator[203020]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:26 np0005548788.localdomain systemd-sysv-generator[203024]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: Starting libvirt secret daemon socket...
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 06 09:41:26 np0005548788.localdomain systemd[1]: Started libvirt secret daemon.
Dec 06 09:41:26 np0005548788.localdomain sudo[202991]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:27 np0005548788.localdomain sudo[203162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxhspcmdpomgcjiwqkakjgpffxypxhep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014087.231036-3065-275986129894097/AnsiballZ_file.py
Dec 06 09:41:27 np0005548788.localdomain sudo[203162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:27 np0005548788.localdomain python3.9[203164]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:27 np0005548788.localdomain sudo[203162]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:28 np0005548788.localdomain sudo[203272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhwgkaypzypsodbwwwdvkakjabkrxhic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014087.918577-3089-40051814049700/AnsiballZ_find.py
Dec 06 09:41:28 np0005548788.localdomain sudo[203272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:28 np0005548788.localdomain python3.9[203274]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:41:28 np0005548788.localdomain sudo[203272]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32578 DF PROTO=TCP SPT=41318 DPT=9100 SEQ=1342159501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E78BF00000000001030307) 
Dec 06 09:41:28 np0005548788.localdomain sudo[203382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iguyhpkskglzaqgjtdxndnsjcwqllqzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014088.6321707-3113-112651826458619/AnsiballZ_command.py
Dec 06 09:41:28 np0005548788.localdomain sudo[203382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:29 np0005548788.localdomain python3.9[203384]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:29 np0005548788.localdomain sudo[203382]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:29 np0005548788.localdomain python3.9[203496]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:41:30 np0005548788.localdomain python3.9[203604]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:31 np0005548788.localdomain python3.9[203690]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014090.3302913-3170-191359086870818/.source.xml follow=False _original_basename=secret.xml.j2 checksum=9621e6cf70c8e0de93f1c73ff2a387c8c3ac4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:32 np0005548788.localdomain sudo[203798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqgwaspjalwanzilrfxbqftjcxlyyumu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014091.4192154-3215-8277085765566/AnsiballZ_command.py
Dec 06 09:41:32 np0005548788.localdomain sudo[203798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:32 np0005548788.localdomain python3.9[203800]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 1939e851-b10c-5c3b-9bb7-8e7f380233e8
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:32 np0005548788.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:203802:1051290 (system bus name :1.2835 [pkttyagent --process 203802 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 06 09:41:32 np0005548788.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:203802:1051290 (system bus name :1.2835, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 06 09:41:32 np0005548788.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:203801:1051290 (system bus name :1.2836 [pkttyagent --process 203801 --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 06 09:41:32 np0005548788.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:203801:1051290 (system bus name :1.2836, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 06 09:41:32 np0005548788.localdomain sudo[203798]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 np0005548788.localdomain python3.9[203920]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:34 np0005548788.localdomain sudo[204028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aebpsjrxvgfxvhlkchllkpivbnyfqeys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014093.7408645-3263-38068055700361/AnsiballZ_command.py
Dec 06 09:41:34 np0005548788.localdomain sudo[204028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:34 np0005548788.localdomain sudo[204028]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3873 DF PROTO=TCP SPT=54006 DPT=9101 SEQ=3588668121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7A39E0000000001030307) 
Dec 06 09:41:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14021 DF PROTO=TCP SPT=60960 DPT=9882 SEQ=2621926078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7A3F00000000001030307) 
Dec 06 09:41:35 np0005548788.localdomain sudo[204139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdujnnaissuyhghgbrtqtuzjjyrbnvhc ; FSID=1939e851-b10c-5c3b-9bb7-8e7f380233e8 KEY=AQC14jNpAAAAABAAVDrRWQiDxWIwal0FbWGWhA== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014094.9797091-3287-42277174976631/AnsiballZ_command.py
Dec 06 09:41:35 np0005548788.localdomain sudo[204139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:35 np0005548788.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:204142:1051567 (system bus name :1.2839 [pkttyagent --process 204142 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 06 09:41:35 np0005548788.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:204142:1051567 (system bus name :1.2839, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 06 09:41:35 np0005548788.localdomain sudo[204139]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:36 np0005548788.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 06 09:41:36 np0005548788.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 06 09:41:37 np0005548788.localdomain sudo[204256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyttgvcryuyyjsniwqyxmwshwdrdbujp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014096.7891958-3311-102400537860343/AnsiballZ_copy.py
Dec 06 09:41:37 np0005548788.localdomain sudo[204256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:37 np0005548788.localdomain python3.9[204258]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:37 np0005548788.localdomain sudo[204256]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:37 np0005548788.localdomain sudo[204366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikszfbvvycqmgrtzfqvqbbdtpawflmli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014097.4838836-3335-2769254375214/AnsiballZ_stat.py
Dec 06 09:41:37 np0005548788.localdomain sudo[204366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3875 DF PROTO=TCP SPT=54006 DPT=9101 SEQ=3588668121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7AFB10000000001030307) 
Dec 06 09:41:37 np0005548788.localdomain python3.9[204368]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:37 np0005548788.localdomain sudo[204366]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:38 np0005548788.localdomain sudo[204454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qngnbtvbupxcurhtuvkuiqbpsbbzecjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014097.4838836-3335-2769254375214/AnsiballZ_copy.py
Dec 06 09:41:38 np0005548788.localdomain sudo[204454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:38 np0005548788.localdomain python3.9[204456]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014097.4838836-3335-2769254375214/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:38 np0005548788.localdomain sudo[204454]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:39 np0005548788.localdomain sudo[204564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suhzcyahzxyijvjooobakjyjbssrrtez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014098.8525784-3383-217523195514021/AnsiballZ_file.py
Dec 06 09:41:39 np0005548788.localdomain sudo[204564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:39 np0005548788.localdomain python3.9[204566]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:39 np0005548788.localdomain sudo[204564]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:39 np0005548788.localdomain sudo[204674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqclqojixzdotlansdfxqcvnjiekarpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014099.5502877-3407-107391721945385/AnsiballZ_stat.py
Dec 06 09:41:39 np0005548788.localdomain sudo[204674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:39 np0005548788.localdomain sshd[204677]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:40 np0005548788.localdomain python3.9[204676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:40 np0005548788.localdomain sudo[204674]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:40 np0005548788.localdomain sudo[204733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilxroehfnazlncdqyobnwntqzleknryx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014099.5502877-3407-107391721945385/AnsiballZ_file.py
Dec 06 09:41:40 np0005548788.localdomain sudo[204733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36631 DF PROTO=TCP SPT=47562 DPT=9105 SEQ=2224188633 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7B9F00000000001030307) 
Dec 06 09:41:40 np0005548788.localdomain sshd[204677]: Received disconnect from 148.227.3.232 port 60784:11: Bye Bye [preauth]
Dec 06 09:41:40 np0005548788.localdomain sshd[204677]: Disconnected from authenticating user root 148.227.3.232 port 60784 [preauth]
Dec 06 09:41:40 np0005548788.localdomain python3.9[204735]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:40 np0005548788.localdomain sudo[204733]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:41 np0005548788.localdomain sudo[204843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spblmsxirjfbhbfrpgpvokifowocyyth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014100.796028-3443-193456134452184/AnsiballZ_stat.py
Dec 06 09:41:41 np0005548788.localdomain sudo[204843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:41 np0005548788.localdomain python3.9[204845]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:41 np0005548788.localdomain sudo[204843]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:41 np0005548788.localdomain sudo[204900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgjjfiflpoufsqvnsdxlczrtcrywticz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014100.796028-3443-193456134452184/AnsiballZ_file.py
Dec 06 09:41:41 np0005548788.localdomain sudo[204900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:41 np0005548788.localdomain python3.9[204902]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.2aw2g98o recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:41 np0005548788.localdomain sudo[204900]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:42 np0005548788.localdomain sudo[205010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcldatyrcqmjdkfpwnncvrsybpumvxax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014101.9667983-3479-170281191590511/AnsiballZ_stat.py
Dec 06 09:41:42 np0005548788.localdomain sudo[205010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:42 np0005548788.localdomain python3.9[205012]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:42 np0005548788.localdomain sudo[205010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:42 np0005548788.localdomain sudo[205067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwvrrmaeutjshiklgbufxkrpyxysfosa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014101.9667983-3479-170281191590511/AnsiballZ_file.py
Dec 06 09:41:42 np0005548788.localdomain sudo[205067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:42 np0005548788.localdomain python3.9[205069]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:42 np0005548788.localdomain sudo[205067]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:43 np0005548788.localdomain sudo[205177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujubaxfpftlhzdllvculsfomywqfndgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014103.1567743-3518-20785863288914/AnsiballZ_command.py
Dec 06 09:41:43 np0005548788.localdomain sudo[205177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44021 DF PROTO=TCP SPT=34368 DPT=9100 SEQ=279885983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7C5BF0000000001030307) 
Dec 06 09:41:43 np0005548788.localdomain python3.9[205179]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:43 np0005548788.localdomain sudo[205177]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:44 np0005548788.localdomain sudo[205288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loehzlamntsdbdotwniqtuecglznyhks ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014103.8976636-3542-31053650020669/AnsiballZ_edpm_nftables_from_files.py
Dec 06 09:41:44 np0005548788.localdomain sudo[205288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:44 np0005548788.localdomain python3[205290]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:41:44 np0005548788.localdomain sudo[205288]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:45 np0005548788.localdomain sudo[205398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdhpzzayxwbudfycqjtjgravtlevqcif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014104.7177129-3566-35053022177700/AnsiballZ_stat.py
Dec 06 09:41:45 np0005548788.localdomain sudo[205398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:45 np0005548788.localdomain python3.9[205400]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:45 np0005548788.localdomain sudo[205398]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:45 np0005548788.localdomain sudo[205455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvpfnzkoamjtuoowxqkopferhogitqlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014104.7177129-3566-35053022177700/AnsiballZ_file.py
Dec 06 09:41:45 np0005548788.localdomain sudo[205455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:45 np0005548788.localdomain python3.9[205457]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:45 np0005548788.localdomain sudo[205455]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44023 DF PROTO=TCP SPT=34368 DPT=9100 SEQ=279885983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7D1B00000000001030307) 
Dec 06 09:41:47 np0005548788.localdomain sudo[205565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohmyhjgzmgmpplkftwshskatnlhhannb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014106.6494298-3602-63422392544107/AnsiballZ_stat.py
Dec 06 09:41:47 np0005548788.localdomain sudo[205565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:47 np0005548788.localdomain python3.9[205567]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:47 np0005548788.localdomain sudo[205565]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:41:47.397 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:41:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:41:47.398 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:41:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:41:47.398 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:41:47 np0005548788.localdomain sudo[205622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-noqukvehxadmzysvqmdpljkgmaqovfjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014106.6494298-3602-63422392544107/AnsiballZ_file.py
Dec 06 09:41:47 np0005548788.localdomain sudo[205622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:47 np0005548788.localdomain python3.9[205624]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:47 np0005548788.localdomain sudo[205622]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:48 np0005548788.localdomain sudo[205732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbmiumvmiuecjyxkdflawlfxwmdohowj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014107.8763418-3638-115989146076564/AnsiballZ_stat.py
Dec 06 09:41:48 np0005548788.localdomain sudo[205732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:48 np0005548788.localdomain python3.9[205734]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:48 np0005548788.localdomain sudo[205732]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:49 np0005548788.localdomain sudo[205789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzdynhsgrtxtmzjgkhjtxjkxdgzevkgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014107.8763418-3638-115989146076564/AnsiballZ_file.py
Dec 06 09:41:49 np0005548788.localdomain sudo[205789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:49 np0005548788.localdomain python3.9[205791]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:49 np0005548788.localdomain sudo[205789]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54465 DF PROTO=TCP SPT=47808 DPT=9102 SEQ=3914807378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7DCC40000000001030307) 
Dec 06 09:41:49 np0005548788.localdomain sudo[205899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btfzpulljckjvtpsrejxmveiilrgcgtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014109.4980242-3674-212566301841651/AnsiballZ_stat.py
Dec 06 09:41:49 np0005548788.localdomain sudo[205899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:50 np0005548788.localdomain python3.9[205901]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:50 np0005548788.localdomain sudo[205899]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:50 np0005548788.localdomain sudo[205956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxyyngrxakaiexvqlfzltqczqkyvwsob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014109.4980242-3674-212566301841651/AnsiballZ_file.py
Dec 06 09:41:50 np0005548788.localdomain sudo[205956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:50 np0005548788.localdomain python3.9[205958]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:50 np0005548788.localdomain sudo[205956]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:51 np0005548788.localdomain sudo[206066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dptqtzdbmiictyxfawebnnvjjgvezvrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014110.7972302-3710-105062840659678/AnsiballZ_stat.py
Dec 06 09:41:51 np0005548788.localdomain sudo[206066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:51 np0005548788.localdomain python3.9[206068]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:51 np0005548788.localdomain sudo[206066]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:51 np0005548788.localdomain sudo[206156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hurkevvaafyktqzptnleejwljvuimnho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014110.7972302-3710-105062840659678/AnsiballZ_copy.py
Dec 06 09:41:51 np0005548788.localdomain sudo[206156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:41:51 np0005548788.localdomain podman[206159]: 2025-12-06 09:41:51.872416842 +0000 UTC m=+0.097378561 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:41:51 np0005548788.localdomain podman[206159]: 2025-12-06 09:41:51.915745983 +0000 UTC m=+0.140707702 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 06 09:41:51 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:41:51 np0005548788.localdomain python3.9[206158]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014110.7972302-3710-105062840659678/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:51 np0005548788.localdomain sudo[206156]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54467 DF PROTO=TCP SPT=47808 DPT=9102 SEQ=3914807378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7E8B00000000001030307) 
Dec 06 09:41:52 np0005548788.localdomain sudo[206291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzcvxdtivihiixavjuyhgrgctpbperdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014112.2584898-3755-276594977914155/AnsiballZ_file.py
Dec 06 09:41:52 np0005548788.localdomain sudo[206291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:52 np0005548788.localdomain python3.9[206293]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:52 np0005548788.localdomain sudo[206291]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:41:53 np0005548788.localdomain sudo[206401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-namiizxlbdjujttjdadzjabztjyggnwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014112.9162977-3779-237317047678722/AnsiballZ_command.py
Dec 06 09:41:53 np0005548788.localdomain sudo[206401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:53 np0005548788.localdomain podman[206402]: 2025-12-06 09:41:53.267282344 +0000 UTC m=+0.079555526 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:41:53 np0005548788.localdomain podman[206402]: 2025-12-06 09:41:53.298511415 +0000 UTC m=+0.110784547 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:41:53 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:41:53 np0005548788.localdomain python3.9[206412]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:53 np0005548788.localdomain sudo[206401]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:54 np0005548788.localdomain sudo[206532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaphsliupugisxjyqskzupmmnpnwnzrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014113.637427-3803-106173891455750/AnsiballZ_blockinfile.py
Dec 06 09:41:54 np0005548788.localdomain sudo[206532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:54 np0005548788.localdomain python3.9[206534]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:54 np0005548788.localdomain sudo[206532]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:54 np0005548788.localdomain sudo[206606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:54 np0005548788.localdomain sudo[206606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:41:54 np0005548788.localdomain sudo[206606]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:55 np0005548788.localdomain sudo[206641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:41:55 np0005548788.localdomain sudo[206641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:41:55 np0005548788.localdomain sudo[206677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvqdflzkxixewrpvaynpeihwhcizwkpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014114.7065697-3830-14865880050277/AnsiballZ_command.py
Dec 06 09:41:55 np0005548788.localdomain sudo[206677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:55 np0005548788.localdomain python3.9[206680]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:55 np0005548788.localdomain sudo[206677]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:55 np0005548788.localdomain sudo[206641]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:55 np0005548788.localdomain sudo[206821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hltgkxeygylkoduwtivhpdogbyhvhymp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014115.4222069-3854-246242347450847/AnsiballZ_stat.py
Dec 06 09:41:55 np0005548788.localdomain sudo[206821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:55 np0005548788.localdomain python3.9[206823]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:41:55 np0005548788.localdomain sudo[206821]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54468 DF PROTO=TCP SPT=47808 DPT=9102 SEQ=3914807378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E7F8700000000001030307) 
Dec 06 09:41:56 np0005548788.localdomain sudo[206933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elxneqmaqamgarddpbsefobvpfbtduiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014116.2849336-3878-254852574364896/AnsiballZ_command.py
Dec 06 09:41:56 np0005548788.localdomain sudo[206933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:56 np0005548788.localdomain python3.9[206935]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:56 np0005548788.localdomain sudo[206933]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:57 np0005548788.localdomain sudo[207046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iucrntzwlrtfgfwtqbbzkgmyoxdhhbve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014117.065419-3902-229445812814786/AnsiballZ_file.py
Dec 06 09:41:57 np0005548788.localdomain sudo[207046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:57 np0005548788.localdomain python3.9[207048]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:57 np0005548788.localdomain sudo[207046]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:58 np0005548788.localdomain sudo[207156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzoaxlmcpjiovzjpcjubuurxhcjbvyfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014117.7945828-3926-248806655963428/AnsiballZ_stat.py
Dec 06 09:41:58 np0005548788.localdomain sudo[207156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:58 np0005548788.localdomain sudo[207159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:41:58 np0005548788.localdomain sudo[207159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:41:58 np0005548788.localdomain sudo[207159]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:58 np0005548788.localdomain python3.9[207158]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:58 np0005548788.localdomain sudo[207156]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:58 np0005548788.localdomain sudo[207262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggegscvjihdifypzjescwbzcxswlqchg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014117.7945828-3926-248806655963428/AnsiballZ_copy.py
Dec 06 09:41:58 np0005548788.localdomain sudo[207262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:58 np0005548788.localdomain python3.9[207264]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014117.7945828-3926-248806655963428/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:58 np0005548788.localdomain sudo[207262]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44025 DF PROTO=TCP SPT=34368 DPT=9100 SEQ=279885983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E801F00000000001030307) 
Dec 06 09:41:59 np0005548788.localdomain sudo[207372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrdufhmhhkyobtkhnuapbefdjnpjwcll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014119.062275-3971-264007244050973/AnsiballZ_stat.py
Dec 06 09:41:59 np0005548788.localdomain sudo[207372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:59 np0005548788.localdomain python3.9[207374]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:59 np0005548788.localdomain sudo[207372]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:59 np0005548788.localdomain sudo[207460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fichpdgiyyztmynagjwcdfiqheiqmbax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014119.062275-3971-264007244050973/AnsiballZ_copy.py
Dec 06 09:41:59 np0005548788.localdomain sudo[207460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:00 np0005548788.localdomain python3.9[207462]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014119.062275-3971-264007244050973/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:00 np0005548788.localdomain sudo[207460]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:00 np0005548788.localdomain sudo[207570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eboslckdmbmebwyxiutughjtqukyypkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014120.3247972-4016-141173313040180/AnsiballZ_stat.py
Dec 06 09:42:00 np0005548788.localdomain sudo[207570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:00 np0005548788.localdomain python3.9[207572]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:42:00 np0005548788.localdomain sudo[207570]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:01 np0005548788.localdomain sudo[207658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqucecxayvywofhwjsllpxbmjxywbbcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014120.3247972-4016-141173313040180/AnsiballZ_copy.py
Dec 06 09:42:01 np0005548788.localdomain sudo[207658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:01 np0005548788.localdomain python3.9[207660]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014120.3247972-4016-141173313040180/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:01 np0005548788.localdomain sudo[207658]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:01 np0005548788.localdomain sudo[207768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxfpabymizjryxgxdxpicctnjdahbaqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014121.6875944-4061-75028710402751/AnsiballZ_systemd.py
Dec 06 09:42:01 np0005548788.localdomain sudo[207768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:02 np0005548788.localdomain python3.9[207770]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:42:02 np0005548788.localdomain systemd-sysv-generator[207795]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:02 np0005548788.localdomain systemd-rc-local-generator[207791]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548788.localdomain systemd[1]: Reached target edpm_libvirt.target.
Dec 06 09:42:02 np0005548788.localdomain sudo[207768]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:03 np0005548788.localdomain sudo[207918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lorgcgacdfjokzpuxvlxzudgliqnjddp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014122.9272463-4085-236001886362707/AnsiballZ_systemd.py
Dec 06 09:42:03 np0005548788.localdomain sudo[207918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:03 np0005548788.localdomain python3.9[207920]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:42:03 np0005548788.localdomain systemd-rc-local-generator[207948]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:03 np0005548788.localdomain systemd-sysv-generator[207951]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:42:04 np0005548788.localdomain systemd-sysv-generator[207985]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:04 np0005548788.localdomain systemd-rc-local-generator[207982]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548788.localdomain sudo[207918]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54469 DF PROTO=TCP SPT=47808 DPT=9102 SEQ=3914807378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E817F00000000001030307) 
Dec 06 09:42:04 np0005548788.localdomain sshd[159805]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Dec 06 09:42:04 np0005548788.localdomain systemd[1]: session-53.scope: Consumed 3min 59.531s CPU time.
Dec 06 09:42:04 np0005548788.localdomain systemd-logind[765]: Session 53 logged out. Waiting for processes to exit.
Dec 06 09:42:04 np0005548788.localdomain systemd-logind[765]: Removed session 53.
Dec 06 09:42:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44843 DF PROTO=TCP SPT=52428 DPT=9101 SEQ=2780856532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E818CE0000000001030307) 
Dec 06 09:42:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44845 DF PROTO=TCP SPT=52428 DPT=9101 SEQ=2780856532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E824F00000000001030307) 
Dec 06 09:42:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43127 DF PROTO=TCP SPT=60770 DPT=9105 SEQ=3123975679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E82FF00000000001030307) 
Dec 06 09:42:11 np0005548788.localdomain sshd[208012]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:42:11 np0005548788.localdomain sshd[208012]: Accepted publickey for zuul from 192.168.122.30 port 33470 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:42:11 np0005548788.localdomain systemd-logind[765]: New session 54 of user zuul.
Dec 06 09:42:11 np0005548788.localdomain systemd[1]: Started Session 54 of User zuul.
Dec 06 09:42:11 np0005548788.localdomain sshd[208012]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:42:12 np0005548788.localdomain python3.9[208123]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:42:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4770 DF PROTO=TCP SPT=53944 DPT=9100 SEQ=18690279 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E83AF00000000001030307) 
Dec 06 09:42:14 np0005548788.localdomain python3.9[208235]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:42:14 np0005548788.localdomain network[208252]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:42:14 np0005548788.localdomain network[208253]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:42:14 np0005548788.localdomain network[208254]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:42:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4772 DF PROTO=TCP SPT=53944 DPT=9100 SEQ=18690279 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E846F00000000001030307) 
Dec 06 09:42:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18517 DF PROTO=TCP SPT=53080 DPT=9102 SEQ=3747486367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E851F50000000001030307) 
Dec 06 09:42:20 np0005548788.localdomain sudo[208484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysrynahyyrzsrboiluiupeumvebnbcfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014139.8329792-101-193712753823136/AnsiballZ_setup.py
Dec 06 09:42:20 np0005548788.localdomain sudo[208484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:21 np0005548788.localdomain python3.9[208486]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:42:21 np0005548788.localdomain sudo[208484]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:21 np0005548788.localdomain sudo[208547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edxvwmglqibzsbvsmanzwawwiokriuth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014139.8329792-101-193712753823136/AnsiballZ_dnf.py
Dec 06 09:42:21 np0005548788.localdomain sudo[208547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:22 np0005548788.localdomain python3.9[208549]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:42:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:42:22 np0005548788.localdomain systemd[1]: tmp-crun.qp4Wgw.mount: Deactivated successfully.
Dec 06 09:42:22 np0005548788.localdomain podman[208551]: 2025-12-06 09:42:22.289499248 +0000 UTC m=+0.113474845 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 06 09:42:22 np0005548788.localdomain podman[208551]: 2025-12-06 09:42:22.330252924 +0000 UTC m=+0.154228451 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 09:42:22 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:42:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18519 DF PROTO=TCP SPT=53080 DPT=9102 SEQ=3747486367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E85DF00000000001030307) 
Dec 06 09:42:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:42:24 np0005548788.localdomain systemd[1]: tmp-crun.SeccdJ.mount: Deactivated successfully.
Dec 06 09:42:24 np0005548788.localdomain podman[208579]: 2025-12-06 09:42:24.26966358 +0000 UTC m=+0.095843107 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Dec 06 09:42:24 np0005548788.localdomain podman[208579]: 2025-12-06 09:42:24.27590476 +0000 UTC m=+0.102084297 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:42:24 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:42:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18520 DF PROTO=TCP SPT=53080 DPT=9102 SEQ=3747486367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E86DB00000000001030307) 
Dec 06 09:42:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4774 DF PROTO=TCP SPT=53944 DPT=9100 SEQ=18690279 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E877F00000000001030307) 
Dec 06 09:42:29 np0005548788.localdomain sudo[208547]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:30 np0005548788.localdomain sudo[208704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkszcdkfdfwakcczbrcpcwdqeyvxpmdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014149.8393776-137-27220333911014/AnsiballZ_stat.py
Dec 06 09:42:30 np0005548788.localdomain sudo[208704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:30 np0005548788.localdomain python3.9[208706]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:42:30 np0005548788.localdomain sudo[208704]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:31 np0005548788.localdomain sudo[208816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixleuwejsvnfoigbfmprrkldejueappg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014150.6979604-161-56905319791750/AnsiballZ_copy.py
Dec 06 09:42:31 np0005548788.localdomain sudo[208816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:31 np0005548788.localdomain python3.9[208818]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:31 np0005548788.localdomain sudo[208816]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:31 np0005548788.localdomain sudo[208926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgnyvrnnknrnoqjcgbfxkxngkrrfixgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014151.5423908-185-281421193860923/AnsiballZ_command.py
Dec 06 09:42:31 np0005548788.localdomain sudo[208926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:32 np0005548788.localdomain python3.9[208928]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:42:32 np0005548788.localdomain sudo[208926]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:33 np0005548788.localdomain sudo[209037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yecylcebiuyizrryyqhmcdsagwhxcwxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014152.4128509-209-235225764549298/AnsiballZ_command.py
Dec 06 09:42:33 np0005548788.localdomain sudo[209037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:33 np0005548788.localdomain python3.9[209039]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:42:33 np0005548788.localdomain sudo[209037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:34 np0005548788.localdomain sudo[209148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agcqapqwlklcbgiilowjqitsfpuuaknj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014153.8134212-233-74409241334387/AnsiballZ_command.py
Dec 06 09:42:34 np0005548788.localdomain sudo[209148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:34 np0005548788.localdomain python3.9[209150]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:42:34 np0005548788.localdomain sudo[209148]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18521 DF PROTO=TCP SPT=53080 DPT=9102 SEQ=3747486367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E88DF00000000001030307) 
Dec 06 09:42:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32771 DF PROTO=TCP SPT=39860 DPT=9882 SEQ=4057479245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E88DF10000000001030307) 
Dec 06 09:42:34 np0005548788.localdomain sudo[209259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ribagaukrtpniksplnmmzxsxjxhrzekf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014154.6703637-260-23368195575923/AnsiballZ_stat.py
Dec 06 09:42:34 np0005548788.localdomain sudo[209259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:35 np0005548788.localdomain python3.9[209261]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:42:35 np0005548788.localdomain sudo[209259]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:36 np0005548788.localdomain sudo[209371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkdvqclpmiyjlqqqfbwohvhlnzkfqgjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014156.0694058-293-43244946471272/AnsiballZ_lineinfile.py
Dec 06 09:42:36 np0005548788.localdomain sudo[209371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:36 np0005548788.localdomain python3.9[209373]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:36 np0005548788.localdomain sudo[209371]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:37 np0005548788.localdomain sudo[209481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqrlsoqmetzdnxuihwoavnixzfdsfswv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014157.0445826-320-227081856229197/AnsiballZ_systemd_service.py
Dec 06 09:42:37 np0005548788.localdomain sudo[209481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:37 np0005548788.localdomain sshd[209484]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:42:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34627 DF PROTO=TCP SPT=60158 DPT=9101 SEQ=512525184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E899F00000000001030307) 
Dec 06 09:42:37 np0005548788.localdomain python3.9[209483]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:42:38 np0005548788.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 06 09:42:38 np0005548788.localdomain sudo[209481]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:38 np0005548788.localdomain sudo[209597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bddguoexcscfufwpbinzhmjfmjodsojq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014158.4242332-344-38481990397200/AnsiballZ_systemd_service.py
Dec 06 09:42:38 np0005548788.localdomain sudo[209597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:39 np0005548788.localdomain python3.9[209599]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:42:39 np0005548788.localdomain systemd-rc-local-generator[209623]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:39 np0005548788.localdomain systemd-sysv-generator[209626]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: Starting Open-iSCSI...
Dec 06 09:42:39 np0005548788.localdomain iscsid[209640]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Dec 06 09:42:39 np0005548788.localdomain iscsid[209640]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Dec 06 09:42:39 np0005548788.localdomain iscsid[209640]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Dec 06 09:42:39 np0005548788.localdomain iscsid[209640]: If using hardware iscsi like qla4xxx this message can be ignored.
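The warning above spells out exactly what iscsid is looking for. A minimal, hedged sketch of creating that file (the target path is parameterized here so it can run unprivileged; on a real host it would be /etc/iscsi/initiatorname.iscsi, written as root, and the IQN is just the log's own example value):

```shell
# Sketch: create the InitiatorName file the iscsid warning asks for.
# ISCSI_NAME_FILE is an assumed override variable for illustration only;
# a real host writes /etc/iscsi/initiatorname.iscsi and should use its
# own iqn.yyyy-mm.<reversed domain>[:identifier] value.
target="${ISCSI_NAME_FILE:-./initiatorname.iscsi}"
printf 'InitiatorName=iqn.2001-04.com.redhat:fc6\n' > "$target"
# Sanity-check the format iscsid expects (InitiatorName=iqn....).
grep -q '^InitiatorName=iqn\.' "$target" && echo "initiator name written"
```

After creating the file, restarting iscsid (systemctl restart iscsid) would make it re-read the initiator name.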
Dec 06 09:42:39 np0005548788.localdomain iscsid[209640]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Dec 06 09:42:39 np0005548788.localdomain iscsid[209640]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Dec 06 09:42:39 np0005548788.localdomain iscsid[209640]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: Started Open-iSCSI.
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: Starting Logout of all iSCSI sessions on shutdown...
Dec 06 09:42:39 np0005548788.localdomain systemd[1]: Finished Logout of all iSCSI sessions on shutdown.
Dec 06 09:42:39 np0005548788.localdomain sudo[209597]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:39 np0005548788.localdomain sshd[209484]: Received disconnect from 45.78.219.195 port 39002:11: Bye Bye [preauth]
Dec 06 09:42:39 np0005548788.localdomain sshd[209484]: Disconnected from authenticating user root 45.78.219.195 port 39002 [preauth]
Dec 06 09:42:41 np0005548788.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 06 09:42:41 np0005548788.localdomain sudo[209750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epqxrgwxsblytuzhckmbjchcazhkxpsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014161.030602-377-153369002992327/AnsiballZ_service_facts.py
Dec 06 09:42:41 np0005548788.localdomain sudo[209750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:41 np0005548788.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 06 09:42:41 np0005548788.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Dec 06 09:42:41 np0005548788.localdomain python3.9[209752]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:42:41 np0005548788.localdomain network[209782]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:42:41 np0005548788.localdomain network[209783]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:42:41 np0005548788.localdomain network[209784]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:42:41 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38913 DF PROTO=TCP SPT=38496 DPT=9105 SEQ=2995768066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E8A9B00000000001030307) 
Dec 06 09:42:42 np0005548788.localdomain setroubleshoot[209675]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 2b7e2aaa-6963-49d2-85c9-a5350bd880d5
Dec 06 09:42:42 np0005548788.localdomain setroubleshoot[209675]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default,
                                                                 then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Dec 06 09:42:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52211 DF PROTO=TCP SPT=56690 DPT=9100 SEQ=244687763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E8B01F0000000001030307) 
Dec 06 09:42:44 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:45 np0005548788.localdomain sudo[209750]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52213 DF PROTO=TCP SPT=56690 DPT=9100 SEQ=244687763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E8BC300000000001030307) 
Dec 06 09:42:46 np0005548788.localdomain sudo[210016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbmqtnevafbhxcrqndsqslrhyhaeatpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014166.1458964-407-183498198072059/AnsiballZ_file.py
Dec 06 09:42:46 np0005548788.localdomain sudo[210016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:46 np0005548788.localdomain python3.9[210018]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:42:46 np0005548788.localdomain sudo[210016]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:42:47.399 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:42:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:42:47.400 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:42:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:42:47.400 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:42:47 np0005548788.localdomain sudo[210126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqxprvxydyarawbiwijnjapdlixzedls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014167.082305-431-2928476821062/AnsiballZ_modprobe.py
Dec 06 09:42:47 np0005548788.localdomain sudo[210126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:47 np0005548788.localdomain python3.9[210128]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 06 09:42:47 np0005548788.localdomain sudo[210126]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:48 np0005548788.localdomain sudo[210240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvfcwfwkbhkkhqakdwpsgkfurcmpayef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014167.9875002-455-112390013649758/AnsiballZ_stat.py
Dec 06 09:42:48 np0005548788.localdomain sudo[210240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:48 np0005548788.localdomain python3.9[210242]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:42:48 np0005548788.localdomain sudo[210240]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:48 np0005548788.localdomain sudo[210328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzsavrynbmudiomctqbdaziozeigcsnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014167.9875002-455-112390013649758/AnsiballZ_copy.py
Dec 06 09:42:48 np0005548788.localdomain sudo[210328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:49 np0005548788.localdomain python3.9[210330]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014167.9875002-455-112390013649758/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:49 np0005548788.localdomain sudo[210328]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15520 DF PROTO=TCP SPT=48744 DPT=9102 SEQ=3265566061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E8C7250000000001030307) 
Dec 06 09:42:49 np0005548788.localdomain sudo[210438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-liwprgzwidaikskooanllxklowoitiky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014169.3678691-503-182418933426813/AnsiballZ_lineinfile.py
Dec 06 09:42:49 np0005548788.localdomain sudo[210438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:49 np0005548788.localdomain python3.9[210440]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:49 np0005548788.localdomain sudo[210438]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:50 np0005548788.localdomain sudo[210548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iundlfcoiprecdnlyxeypskkeilvebtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014170.069016-527-258619119225052/AnsiballZ_systemd.py
Dec 06 09:42:50 np0005548788.localdomain sudo[210548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:51 np0005548788.localdomain python3.9[210550]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:42:51 np0005548788.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 09:42:51 np0005548788.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 06 09:42:51 np0005548788.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 06 09:42:51 np0005548788.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 06 09:42:51 np0005548788.localdomain systemd-modules-load[210554]: Module 'msr' is built in
Dec 06 09:42:51 np0005548788.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 06 09:42:51 np0005548788.localdomain sudo[210548]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:51 np0005548788.localdomain sudo[210664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwaffkeoexgpmgnowydkewpbuzscsotu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014171.2687945-551-158975140318244/AnsiballZ_file.py
Dec 06 09:42:51 np0005548788.localdomain sudo[210664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:51 np0005548788.localdomain python3.9[210666]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:42:51 np0005548788.localdomain sudo[210664]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:51 np0005548788.localdomain sshd[210667]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:42:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15522 DF PROTO=TCP SPT=48744 DPT=9102 SEQ=3265566061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E8D3300000000001030307) 
Dec 06 09:42:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:42:52 np0005548788.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Dec 06 09:42:52 np0005548788.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 06 09:42:52 np0005548788.localdomain podman[210684]: 2025-12-06 09:42:52.646948995 +0000 UTC m=+0.091452243 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 09:42:52 np0005548788.localdomain podman[210684]: 2025-12-06 09:42:52.691715506 +0000 UTC m=+0.136218734 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:42:52 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:42:53 np0005548788.localdomain sudo[210803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-liuojjkvxcoisjedhzohjyxmbbppjeky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014172.798573-578-240581395203423/AnsiballZ_stat.py
Dec 06 09:42:53 np0005548788.localdomain sudo[210803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:53 np0005548788.localdomain python3.9[210805]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:42:53 np0005548788.localdomain sudo[210803]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:53 np0005548788.localdomain sudo[210913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtcsdbvwfrcffbmiahjaroovqurjfsrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014173.5768957-605-54745893510073/AnsiballZ_stat.py
Dec 06 09:42:53 np0005548788.localdomain sudo[210913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:54 np0005548788.localdomain python3.9[210915]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:42:54 np0005548788.localdomain sudo[210913]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:55 np0005548788.localdomain sudo[211023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdwkaishylvpdxzfbffklhwwkshhwjxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014174.8628397-629-281214454193901/AnsiballZ_stat.py
Dec 06 09:42:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:42:55 np0005548788.localdomain sudo[211023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:55 np0005548788.localdomain systemd[1]: tmp-crun.WxCItE.mount: Deactivated successfully.
Dec 06 09:42:55 np0005548788.localdomain podman[211025]: 2025-12-06 09:42:55.289675558 +0000 UTC m=+0.105296936 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 06 09:42:55 np0005548788.localdomain podman[211025]: 2025-12-06 09:42:55.298581581 +0000 UTC m=+0.114202989 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:42:55 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:42:55 np0005548788.localdomain python3.9[211026]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:42:55 np0005548788.localdomain sshd[210667]: Received disconnect from 101.47.142.76 port 39952:11: Bye Bye [preauth]
Dec 06 09:42:55 np0005548788.localdomain sshd[210667]: Disconnected from authenticating user root 101.47.142.76 port 39952 [preauth]
Dec 06 09:42:55 np0005548788.localdomain sudo[211023]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:55 np0005548788.localdomain sudo[211130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpjjcvjtotxkvpauslqyocaplubinkba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014174.8628397-629-281214454193901/AnsiballZ_copy.py
Dec 06 09:42:55 np0005548788.localdomain sudo[211130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:55 np0005548788.localdomain python3.9[211132]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014174.8628397-629-281214454193901/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:56 np0005548788.localdomain sudo[211130]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:56 np0005548788.localdomain sudo[211240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjmenulvmlnqntwwmphslvwxsxlpncts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014176.19742-674-266245222441924/AnsiballZ_command.py
Dec 06 09:42:56 np0005548788.localdomain sudo[211240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15523 DF PROTO=TCP SPT=48744 DPT=9102 SEQ=3265566061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E8E2F00000000001030307) 
Dec 06 09:42:56 np0005548788.localdomain python3.9[211242]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:42:56 np0005548788.localdomain sudo[211240]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:57 np0005548788.localdomain sudo[211351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isstqztidoaxudpldyrrxkzgfpiubaih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014176.871661-698-145442568835499/AnsiballZ_lineinfile.py
Dec 06 09:42:57 np0005548788.localdomain sudo[211351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:57 np0005548788.localdomain python3.9[211353]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:57 np0005548788.localdomain sudo[211351]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:58 np0005548788.localdomain sudo[211461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osnhzsbdbmitqswlrolkezjiccxcnkwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014177.7248487-722-204000229627989/AnsiballZ_replace.py
Dec 06 09:42:58 np0005548788.localdomain sudo[211461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:58 np0005548788.localdomain python3.9[211463]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:58 np0005548788.localdomain sudo[211461]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:58 np0005548788.localdomain sudo[211469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:42:58 np0005548788.localdomain sudo[211469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:42:58 np0005548788.localdomain sudo[211469]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:58 np0005548788.localdomain sudo[211502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:42:58 np0005548788.localdomain sudo[211502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:42:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52215 DF PROTO=TCP SPT=56690 DPT=9100 SEQ=244687763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E8EBF00000000001030307) 
Dec 06 09:42:58 np0005548788.localdomain sudo[211607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sisqmomploghvdizmnvdqpkxeocxajck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014178.5754476-746-190944324641426/AnsiballZ_replace.py
Dec 06 09:42:58 np0005548788.localdomain sudo[211607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:59 np0005548788.localdomain python3.9[211609]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:59 np0005548788.localdomain sudo[211502]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:59 np0005548788.localdomain sudo[211607]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:59 np0005548788.localdomain sudo[211649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:42:59 np0005548788.localdomain sudo[211649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:42:59 np0005548788.localdomain sudo[211649]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:59 np0005548788.localdomain sudo[211684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:42:59 np0005548788.localdomain sudo[211684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:42:59 np0005548788.localdomain sudo[211775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxwyagwcghypabcafbmkobhcmfeixbzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014179.3432295-773-273361591279974/AnsiballZ_lineinfile.py
Dec 06 09:42:59 np0005548788.localdomain sudo[211775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:59 np0005548788.localdomain python3.9[211777]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:59 np0005548788.localdomain sudo[211775]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:00 np0005548788.localdomain sudo[211684]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:00 np0005548788.localdomain sudo[211917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgfgcvzqtvstlscnwcdppancyzmblsaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014179.993206-773-190467746722515/AnsiballZ_lineinfile.py
Dec 06 09:43:00 np0005548788.localdomain sudo[211917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:00 np0005548788.localdomain python3.9[211919]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:00 np0005548788.localdomain sudo[211917]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:00 np0005548788.localdomain sudo[211920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:43:00 np0005548788.localdomain sudo[211920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:43:00 np0005548788.localdomain sudo[211920]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:00 np0005548788.localdomain sudo[212045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aamsxdtxmlzxtnlasmcobxflwpatbqxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014180.618533-773-36679345920518/AnsiballZ_lineinfile.py
Dec 06 09:43:00 np0005548788.localdomain sudo[212045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:01 np0005548788.localdomain python3.9[212047]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:01 np0005548788.localdomain sudo[212045]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:01 np0005548788.localdomain sudo[212155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxkmqdmtxgcblaheijhyqqkmvrmzftug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014181.3298428-773-91245303230484/AnsiballZ_lineinfile.py
Dec 06 09:43:01 np0005548788.localdomain sudo[212155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:01 np0005548788.localdomain python3.9[212157]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:01 np0005548788.localdomain sudo[212155]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:02 np0005548788.localdomain sudo[212265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odqoiuodzinazztetokpmofqridbathr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014181.994104-860-145575330808117/AnsiballZ_stat.py
Dec 06 09:43:02 np0005548788.localdomain sudo[212265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:02 np0005548788.localdomain python3.9[212267]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:02 np0005548788.localdomain sudo[212265]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:03 np0005548788.localdomain sudo[212377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqszyhwljginkyhhkrzfvactwreooptb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014182.6841073-884-202513566335687/AnsiballZ_file.py
Dec 06 09:43:03 np0005548788.localdomain sudo[212377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:03 np0005548788.localdomain python3.9[212379]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:03 np0005548788.localdomain sudo[212377]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:03 np0005548788.localdomain sudo[212487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psxfvyvcqvdeqckjaduyrbfbnkehejbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014183.5109053-911-42583035267618/AnsiballZ_file.py
Dec 06 09:43:03 np0005548788.localdomain sudo[212487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:03 np0005548788.localdomain python3.9[212489]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:04 np0005548788.localdomain sudo[212487]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56736 DF PROTO=TCP SPT=56748 DPT=9101 SEQ=1242055555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9032E0000000001030307) 
Dec 06 09:43:05 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15524 DF PROTO=TCP SPT=48744 DPT=9102 SEQ=3265566061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E903F00000000001030307) 
Dec 06 09:43:05 np0005548788.localdomain sudo[212597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-septwnreiccuoadswvshxqjkbsztiwrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014184.814746-935-258020441894997/AnsiballZ_stat.py
Dec 06 09:43:05 np0005548788.localdomain sudo[212597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:05 np0005548788.localdomain python3.9[212599]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:05 np0005548788.localdomain sudo[212597]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:05 np0005548788.localdomain sudo[212654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnzkxhntzqzqzdsaycdozgviyhoexpyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014184.814746-935-258020441894997/AnsiballZ_file.py
Dec 06 09:43:05 np0005548788.localdomain sudo[212654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:05 np0005548788.localdomain python3.9[212656]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:05 np0005548788.localdomain sudo[212654]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:06 np0005548788.localdomain sudo[212764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxjzgbtaefxelqkygzorxjxitqyccmsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014185.8942635-935-47445433858378/AnsiballZ_stat.py
Dec 06 09:43:06 np0005548788.localdomain sudo[212764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:06 np0005548788.localdomain python3.9[212766]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:06 np0005548788.localdomain sudo[212764]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:07 np0005548788.localdomain sudo[212821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-poslkcyubmamdnnyqkmknnltuvdtcoyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014185.8942635-935-47445433858378/AnsiballZ_file.py
Dec 06 09:43:07 np0005548788.localdomain sudo[212821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:07 np0005548788.localdomain python3.9[212823]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:07 np0005548788.localdomain sudo[212821]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56738 DF PROTO=TCP SPT=56748 DPT=9101 SEQ=1242055555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E90F300000000001030307) 
Dec 06 09:43:08 np0005548788.localdomain sudo[212931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psptqlztcbabrmknezhdujzukhnmncic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014187.8622417-1004-274671020748132/AnsiballZ_file.py
Dec 06 09:43:08 np0005548788.localdomain sudo[212931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:08 np0005548788.localdomain python3.9[212933]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:08 np0005548788.localdomain sudo[212931]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:08 np0005548788.localdomain sudo[213041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqkizbqqnmpowaguqrbvmozluoiptmlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014188.5595074-1028-5997977798824/AnsiballZ_stat.py
Dec 06 09:43:08 np0005548788.localdomain sudo[213041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:09 np0005548788.localdomain python3.9[213043]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:09 np0005548788.localdomain sudo[213041]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:09 np0005548788.localdomain sudo[213098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tudeuyvsgxuudmpuagyrzvdowjbpkisa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014188.5595074-1028-5997977798824/AnsiballZ_file.py
Dec 06 09:43:09 np0005548788.localdomain sudo[213098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:09 np0005548788.localdomain python3.9[213100]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:09 np0005548788.localdomain sudo[213098]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:10 np0005548788.localdomain sudo[213208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etndfrpqiltwaxytxknadsezeohhmjlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014189.7112484-1064-81707281403942/AnsiballZ_stat.py
Dec 06 09:43:10 np0005548788.localdomain sudo[213208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:10 np0005548788.localdomain python3.9[213210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:10 np0005548788.localdomain sudo[213208]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:10 np0005548788.localdomain sudo[213265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vilkqqltocxrvhemwwmuummeopjtqcrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014189.7112484-1064-81707281403942/AnsiballZ_file.py
Dec 06 09:43:10 np0005548788.localdomain sudo[213265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38916 DF PROTO=TCP SPT=38496 DPT=9105 SEQ=2995768066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E919F00000000001030307) 
Dec 06 09:43:10 np0005548788.localdomain python3.9[213267]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:10 np0005548788.localdomain sudo[213265]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:11 np0005548788.localdomain sudo[213375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfwgwtehcthzmppmoyevaulhmafdcrqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014190.8892574-1100-189594481889282/AnsiballZ_systemd.py
Dec 06 09:43:11 np0005548788.localdomain sudo[213375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:11 np0005548788.localdomain python3.9[213377]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:43:11 np0005548788.localdomain systemd-rc-local-generator[213401]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:11 np0005548788.localdomain systemd-sysv-generator[213405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548788.localdomain sudo[213375]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:12 np0005548788.localdomain sudo[213523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyzcgepugozplmexfecnxcvvzpjuuhlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014192.2080848-1124-90695665659476/AnsiballZ_stat.py
Dec 06 09:43:12 np0005548788.localdomain sudo[213523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:12 np0005548788.localdomain python3.9[213525]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:12 np0005548788.localdomain sudo[213523]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:12 np0005548788.localdomain sudo[213580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeshxctxgexfhytvulakdaofqnngmboe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014192.2080848-1124-90695665659476/AnsiballZ_file.py
Dec 06 09:43:12 np0005548788.localdomain sudo[213580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:13 np0005548788.localdomain python3.9[213582]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:13 np0005548788.localdomain sudo[213580]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35950 DF PROTO=TCP SPT=60816 DPT=9100 SEQ=3142154869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E925500000000001030307) 
Dec 06 09:43:13 np0005548788.localdomain sudo[213690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueqnxefnsxficppjmyzphdubcrqnrahl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014193.3951807-1160-170294335792395/AnsiballZ_stat.py
Dec 06 09:43:13 np0005548788.localdomain sudo[213690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:13 np0005548788.localdomain python3.9[213692]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:13 np0005548788.localdomain sudo[213690]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:14 np0005548788.localdomain sudo[213747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaehrneortbsfbdiucmmccebmygchxxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014193.3951807-1160-170294335792395/AnsiballZ_file.py
Dec 06 09:43:14 np0005548788.localdomain sudo[213747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:14 np0005548788.localdomain python3.9[213749]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:14 np0005548788.localdomain sudo[213747]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:15 np0005548788.localdomain sudo[213857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-leyrzgicrhtwrqjgkwfthrmihcwjyyqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014194.591184-1196-270170948391811/AnsiballZ_systemd.py
Dec 06 09:43:15 np0005548788.localdomain sudo[213857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:15 np0005548788.localdomain python3.9[213859]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:43:15 np0005548788.localdomain systemd-rc-local-generator[213884]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:15 np0005548788.localdomain systemd-sysv-generator[213888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:43:15 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:43:15 np0005548788.localdomain sudo[213857]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:16 np0005548788.localdomain sudo[214009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdxtwdgcbxvjzbvmhmbakncbnvwxplcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014196.1492484-1226-193463437095896/AnsiballZ_file.py
Dec 06 09:43:16 np0005548788.localdomain sudo[214009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:16 np0005548788.localdomain python3.9[214011]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35952 DF PROTO=TCP SPT=60816 DPT=9100 SEQ=3142154869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E931700000000001030307) 
Dec 06 09:43:16 np0005548788.localdomain sudo[214009]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:17 np0005548788.localdomain sudo[214119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yapdqdstdmgcghmbbxazrybxlastzndr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014196.892547-1250-75190114752555/AnsiballZ_stat.py
Dec 06 09:43:17 np0005548788.localdomain sudo[214119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:17 np0005548788.localdomain python3.9[214121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:17 np0005548788.localdomain sudo[214119]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:17 np0005548788.localdomain sudo[214207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssrbxtnppsvtupftcxmspirptiupffdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014196.892547-1250-75190114752555/AnsiballZ_copy.py
Dec 06 09:43:17 np0005548788.localdomain sudo[214207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:17 np0005548788.localdomain python3.9[214209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014196.892547-1250-75190114752555/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:17 np0005548788.localdomain sudo[214207]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:18 np0005548788.localdomain sudo[214317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfxnideftwpjjhmymyyfklxeoxdrbfyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014198.5024242-1301-35663345591680/AnsiballZ_file.py
Dec 06 09:43:18 np0005548788.localdomain sudo[214317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:18 np0005548788.localdomain python3.9[214319]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:18 np0005548788.localdomain sudo[214317]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24974 DF PROTO=TCP SPT=47364 DPT=9102 SEQ=471064487 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E93C540000000001030307) 
Dec 06 09:43:19 np0005548788.localdomain sudo[214427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kquatkshgeqraeklabqqkomplrqbfoul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014199.24516-1325-214179481756804/AnsiballZ_stat.py
Dec 06 09:43:19 np0005548788.localdomain sudo[214427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:19 np0005548788.localdomain python3.9[214429]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:19 np0005548788.localdomain sudo[214427]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:20 np0005548788.localdomain sudo[214515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebqbdrrlapgqrgmdoxixrwyljikyxzrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014199.24516-1325-214179481756804/AnsiballZ_copy.py
Dec 06 09:43:20 np0005548788.localdomain sudo[214515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:20 np0005548788.localdomain python3.9[214517]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014199.24516-1325-214179481756804/.source.json _original_basename=.u1bhixea follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:20 np0005548788.localdomain sudo[214515]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:20 np0005548788.localdomain sudo[214625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrwbykdellwbtsflirhihvzdmasmptjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014200.5322824-1370-55567565948403/AnsiballZ_file.py
Dec 06 09:43:20 np0005548788.localdomain sudo[214625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:21 np0005548788.localdomain python3.9[214627]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:21 np0005548788.localdomain sudo[214625]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:21 np0005548788.localdomain sudo[214735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxgtzqvrwkgbynbaoljbceityczmulvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014201.2343469-1394-48121800211699/AnsiballZ_stat.py
Dec 06 09:43:21 np0005548788.localdomain sudo[214735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:21 np0005548788.localdomain sudo[214735]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:22 np0005548788.localdomain sudo[214823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibgyjeeamudhbthsdfovnevbziyrlpar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014201.2343469-1394-48121800211699/AnsiballZ_copy.py
Dec 06 09:43:22 np0005548788.localdomain sudo[214823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:22 np0005548788.localdomain sshd[214826]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:43:22 np0005548788.localdomain sudo[214823]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24976 DF PROTO=TCP SPT=47364 DPT=9102 SEQ=471064487 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E948700000000001030307) 
Dec 06 09:43:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:43:23 np0005548788.localdomain sudo[214935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwtheojliujjaqkdmpvnwzdhpgqifrro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014202.7081115-1445-146078065617809/AnsiballZ_container_config_data.py
Dec 06 09:43:23 np0005548788.localdomain sudo[214935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:23 np0005548788.localdomain podman[214937]: 2025-12-06 09:43:23.260732966 +0000 UTC m=+0.086180611 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 09:43:23 np0005548788.localdomain podman[214937]: 2025-12-06 09:43:23.304294639 +0000 UTC m=+0.129742274 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:43:23 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:43:23 np0005548788.localdomain python3.9[214938]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 06 09:43:23 np0005548788.localdomain sudo[214935]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:23 np0005548788.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 06 09:43:24 np0005548788.localdomain sudo[215071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aipudcecmpjpxvymmvgjfvslzkdsacke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014203.7692435-1472-18235869847487/AnsiballZ_container_config_hash.py
Dec 06 09:43:24 np0005548788.localdomain sudo[215071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:24 np0005548788.localdomain python3.9[215073]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:43:24 np0005548788.localdomain sudo[215071]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:24 np0005548788.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 06 09:43:25 np0005548788.localdomain sudo[215182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdkaebpkfzxiwmejwwbklypalrybmpia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014204.8294272-1499-43258843955876/AnsiballZ_podman_container_info.py
Dec 06 09:43:25 np0005548788.localdomain sudo[215182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:43:25 np0005548788.localdomain podman[215185]: 2025-12-06 09:43:25.431804885 +0000 UTC m=+0.082512409 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:43:25 np0005548788.localdomain podman[215185]: 2025-12-06 09:43:25.43784893 +0000 UTC m=+0.088556494 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:43:25 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:43:25 np0005548788.localdomain python3.9[215184]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:43:25 np0005548788.localdomain sudo[215182]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24977 DF PROTO=TCP SPT=47364 DPT=9102 SEQ=471064487 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E958300000000001030307) 
Dec 06 09:43:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35954 DF PROTO=TCP SPT=60816 DPT=9100 SEQ=3142154869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E961F00000000001030307) 
Dec 06 09:43:29 np0005548788.localdomain sudo[215338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riamydyeaijitncyeprxsrfvylwuymab ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014209.1087916-1538-146134996390428/AnsiballZ_edpm_container_manage.py
Dec 06 09:43:29 np0005548788.localdomain sudo[215338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:29 np0005548788.localdomain python3[215340]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:43:31 np0005548788.localdomain podman[215353]: 2025-12-06 09:43:29.970979167 +0000 UTC m=+0.046184336 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 06 09:43:31 np0005548788.localdomain podman[215401]: 
Dec 06 09:43:31 np0005548788.localdomain podman[215401]: 2025-12-06 09:43:31.781837764 +0000 UTC m=+0.078694721 container create 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:43:31 np0005548788.localdomain podman[215401]: 2025-12-06 09:43:31.748730031 +0000 UTC m=+0.045586998 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 06 09:43:31 np0005548788.localdomain python3[215340]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 06 09:43:31 np0005548788.localdomain sudo[215338]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 np0005548788.localdomain sudo[215543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldqytnvhkwawupovqhsmcouxeimvvhcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014212.8774354-1562-153951984739022/AnsiballZ_stat.py
Dec 06 09:43:33 np0005548788.localdomain sudo[215543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:33 np0005548788.localdomain python3.9[215545]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:33 np0005548788.localdomain sudo[215543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 np0005548788.localdomain sudo[215655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfigrjwvyfpojdmxiivsexmhzqtrfjry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014213.8734062-1589-161617055494582/AnsiballZ_file.py
Dec 06 09:43:34 np0005548788.localdomain sudo[215655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:34 np0005548788.localdomain python3.9[215657]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:34 np0005548788.localdomain sudo[215655]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24978 DF PROTO=TCP SPT=47364 DPT=9102 SEQ=471064487 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E977F00000000001030307) 
Dec 06 09:43:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45538 DF PROTO=TCP SPT=41132 DPT=9101 SEQ=2958517121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E978600000000001030307) 
Dec 06 09:43:35 np0005548788.localdomain sudo[215710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cltwhdyuiqtpjpbxuzagrajzdnitnewl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014213.8734062-1589-161617055494582/AnsiballZ_stat.py
Dec 06 09:43:35 np0005548788.localdomain sudo[215710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:35 np0005548788.localdomain python3.9[215712]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:35 np0005548788.localdomain sudo[215710]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:35 np0005548788.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 06 09:43:35 np0005548788.localdomain systemd[1]: virtqemud.service: Deactivated successfully.
Dec 06 09:43:35 np0005548788.localdomain sudo[215821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgamrmeanwfroenedluhvbixfaflekll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014215.5222409-1589-119666713369763/AnsiballZ_copy.py
Dec 06 09:43:35 np0005548788.localdomain sudo[215821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:36 np0005548788.localdomain python3.9[215823]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014215.5222409-1589-119666713369763/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:36 np0005548788.localdomain sudo[215821]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:36 np0005548788.localdomain sudo[215876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctjemjymuzplohnwhjlflaqmpswbrfoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014215.5222409-1589-119666713369763/AnsiballZ_systemd.py
Dec 06 09:43:36 np0005548788.localdomain sudo[215876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:36 np0005548788.localdomain python3.9[215878]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:43:36 np0005548788.localdomain systemd-rc-local-generator[215901]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:36 np0005548788.localdomain systemd-sysv-generator[215907]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548788.localdomain sudo[215876]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:37 np0005548788.localdomain sudo[215967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnrgjpnjcvdbbegvfmvptkognlgzdkzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014215.5222409-1589-119666713369763/AnsiballZ_systemd.py
Dec 06 09:43:37 np0005548788.localdomain sudo[215967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:37 np0005548788.localdomain python3.9[215969]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:43:37 np0005548788.localdomain systemd-rc-local-generator[215998]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:37 np0005548788.localdomain systemd-sysv-generator[216002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45540 DF PROTO=TCP SPT=41132 DPT=9101 SEQ=2958517121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E984710000000001030307) 
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:38 np0005548788.localdomain systemd[1]: Starting multipathd container...
Dec 06 09:43:38 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:43:38 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d542f240e22de4612c28ad96e0dabf0498601fa40049fdec6b6acee5a7bb25d6/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:38 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d542f240e22de4612c28ad96e0dabf0498601fa40049fdec6b6acee5a7bb25d6/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:43:38 np0005548788.localdomain podman[216010]: 2025-12-06 09:43:38.241943066 +0000 UTC m=+0.152777180 container init 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + sudo -E kolla_set_configs
Dec 06 09:43:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:43:38 np0005548788.localdomain podman[216010]: 2025-12-06 09:43:38.27863497 +0000 UTC m=+0.189469044 container start 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Dec 06 09:43:38 np0005548788.localdomain podman[216010]: multipathd
Dec 06 09:43:38 np0005548788.localdomain systemd[1]: Started multipathd container.
Dec 06 09:43:38 np0005548788.localdomain sudo[216030]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:43:38 np0005548788.localdomain sudo[216030]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:43:38 np0005548788.localdomain sudo[216030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:43:38 np0005548788.localdomain sudo[215967]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: INFO:__main__:Validating config file
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: INFO:__main__:Writing out command to execute
Dec 06 09:43:38 np0005548788.localdomain sudo[216030]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: ++ cat /run_command
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + CMD='/usr/sbin/multipathd -d'
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + ARGS=
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + sudo kolla_copy_cacerts
Dec 06 09:43:38 np0005548788.localdomain sudo[216048]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:43:38 np0005548788.localdomain podman[216031]: 2025-12-06 09:43:38.377182458 +0000 UTC m=+0.093858026 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:43:38 np0005548788.localdomain sudo[216048]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:43:38 np0005548788.localdomain sudo[216048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:43:38 np0005548788.localdomain sudo[216048]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:38 np0005548788.localdomain podman[216031]: 2025-12-06 09:43:38.385762211 +0000 UTC m=+0.102437749 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + [[ ! -n '' ]]
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + . kolla_extend_start
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: Running command: '/usr/sbin/multipathd -d'
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + umask 0022
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: + exec /usr/sbin/multipathd -d
Dec 06 09:43:38 np0005548788.localdomain podman[216031]: unhealthy
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: 10638.576979 | --------start up--------
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: 10638.577006 | read /etc/multipath.conf
Dec 06 09:43:38 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:43:38 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Failed with result 'exit-code'.
Dec 06 09:43:38 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Dec 06 09:43:38 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:43:38 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:43:38 np0005548788.localdomain multipathd[216024]: 10638.581980 | path checkers start up
Dec 06 09:43:38 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:43:40 np0005548788.localdomain python3.9[216170]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43901 DF PROTO=TCP SPT=37548 DPT=9105 SEQ=4247580543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E98FF00000000001030307) 
Dec 06 09:43:41 np0005548788.localdomain sudo[216280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lruscfvgpdqsyfwrckneyvhzhbxuzqkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014221.385623-1697-227731041840156/AnsiballZ_command.py
Dec 06 09:43:41 np0005548788.localdomain sudo[216280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:41 np0005548788.localdomain python3.9[216282]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:43:41 np0005548788.localdomain sudo[216280]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:42 np0005548788.localdomain sudo[216403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deplbosxkygmiigrhmtepjwlfvpnidkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014222.199241-1721-232553630847112/AnsiballZ_systemd.py
Dec 06 09:43:42 np0005548788.localdomain sudo[216403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:42 np0005548788.localdomain python3.9[216405]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:43:42 np0005548788.localdomain systemd[1]: Stopping multipathd container...
Dec 06 09:43:43 np0005548788.localdomain multipathd[216024]: 10643.183708 | exit (signal)
Dec 06 09:43:43 np0005548788.localdomain multipathd[216024]: 10643.184392 | --------shut down-------
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: libpod-6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.scope: Deactivated successfully.
Dec 06 09:43:43 np0005548788.localdomain podman[216409]: 2025-12-06 09:43:43.043585996 +0000 UTC m=+0.117625673 container died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.timer: Deactivated successfully.
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3-userdata-shm.mount: Deactivated successfully.
Dec 06 09:43:43 np0005548788.localdomain podman[216409]: 2025-12-06 09:43:43.265127062 +0000 UTC m=+0.339166709 container cleanup 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 09:43:43 np0005548788.localdomain podman[216409]: multipathd
Dec 06 09:43:43 np0005548788.localdomain podman[216435]: 2025-12-06 09:43:43.387577541 +0000 UTC m=+0.077454312 container cleanup 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:43:43 np0005548788.localdomain podman[216435]: multipathd
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: Stopped multipathd container.
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: Starting multipathd container...
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:43:43 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d542f240e22de4612c28ad96e0dabf0498601fa40049fdec6b6acee5a7bb25d6/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:43 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d542f240e22de4612c28ad96e0dabf0498601fa40049fdec6b6acee5a7bb25d6/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:43:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52893 DF PROTO=TCP SPT=37684 DPT=9100 SEQ=4226942430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E99A7F0000000001030307) 
Dec 06 09:43:43 np0005548788.localdomain podman[216449]: 2025-12-06 09:43:43.553473891 +0000 UTC m=+0.134361366 container init 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + sudo -E kolla_set_configs
Dec 06 09:43:43 np0005548788.localdomain sudo[216469]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:43:43 np0005548788.localdomain sudo[216469]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:43:43 np0005548788.localdomain sudo[216469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:43:43 np0005548788.localdomain podman[216449]: 2025-12-06 09:43:43.600271605 +0000 UTC m=+0.181159010 container start 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 06 09:43:43 np0005548788.localdomain podman[216449]: multipathd
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: Started multipathd container.
Dec 06 09:43:43 np0005548788.localdomain sudo[216403]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: INFO:__main__:Validating config file
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: INFO:__main__:Writing out command to execute
Dec 06 09:43:43 np0005548788.localdomain sudo[216469]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: ++ cat /run_command
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + CMD='/usr/sbin/multipathd -d'
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + ARGS=
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + sudo kolla_copy_cacerts
Dec 06 09:43:43 np0005548788.localdomain sudo[216485]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:43:43 np0005548788.localdomain sudo[216485]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:43:43 np0005548788.localdomain sudo[216485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:43:43 np0005548788.localdomain sudo[216485]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + [[ ! -n '' ]]
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + . kolla_extend_start
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: Running command: '/usr/sbin/multipathd -d'
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + umask 0022
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: + exec /usr/sbin/multipathd -d
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: 10643.884819 | --------start up--------
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: 10643.884840 | read /etc/multipath.conf
Dec 06 09:43:43 np0005548788.localdomain multipathd[216463]: 10643.888919 | path checkers start up
Dec 06 09:43:43 np0005548788.localdomain podman[216472]: 2025-12-06 09:43:43.712407759 +0000 UTC m=+0.114047044 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 06 09:43:43 np0005548788.localdomain podman[216472]: 2025-12-06 09:43:43.720838237 +0000 UTC m=+0.122477552 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:43:43 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:43:44 np0005548788.localdomain sudo[216609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azqxsghvvkuvkudsttlitceuzoemyrmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014223.841697-1745-221161711604966/AnsiballZ_file.py
Dec 06 09:43:44 np0005548788.localdomain sudo[216609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:44 np0005548788.localdomain python3.9[216611]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:44 np0005548788.localdomain sudo[216609]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:45 np0005548788.localdomain sudo[216719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptiuhczqskvktucsvylbdrexbowmxquq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014225.3455558-1781-244488580511339/AnsiballZ_file.py
Dec 06 09:43:45 np0005548788.localdomain sudo[216719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:45 np0005548788.localdomain python3.9[216721]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:43:45 np0005548788.localdomain sudo[216719]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:46 np0005548788.localdomain sudo[216829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unezrhngayckmnsaonzyhufdwpjtqgjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014226.0725622-1805-82011467872987/AnsiballZ_modprobe.py
Dec 06 09:43:46 np0005548788.localdomain sudo[216829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:46 np0005548788.localdomain python3.9[216831]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 06 09:43:46 np0005548788.localdomain sudo[216829]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52895 DF PROTO=TCP SPT=37684 DPT=9100 SEQ=4226942430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9A6700000000001030307) 
Dec 06 09:43:47 np0005548788.localdomain sshd[214826]: Received disconnect from 45.78.194.186 port 37798:11: Bye Bye [preauth]
Dec 06 09:43:47 np0005548788.localdomain sshd[214826]: Disconnected from authenticating user root 45.78.194.186 port 37798 [preauth]
Dec 06 09:43:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:43:47.400 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:43:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:43:47.401 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:43:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:43:47.401 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:43:47 np0005548788.localdomain sudo[216947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aderghynlgchuahtpxpdahhzntjalbmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014227.4872012-1829-129530903886402/AnsiballZ_stat.py
Dec 06 09:43:47 np0005548788.localdomain sudo[216947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:48 np0005548788.localdomain python3.9[216949]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:48 np0005548788.localdomain sudo[216947]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:48 np0005548788.localdomain sudo[217035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjvbybmoksjqflpgpggampjvkczbeoxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014227.4872012-1829-129530903886402/AnsiballZ_copy.py
Dec 06 09:43:48 np0005548788.localdomain sudo[217035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:48 np0005548788.localdomain python3.9[217037]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014227.4872012-1829-129530903886402/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:48 np0005548788.localdomain sudo[217035]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:49 np0005548788.localdomain sudo[217145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maouqfaxwdafmuevjranhrjmrnbzkufp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014229.13816-1877-162052387434086/AnsiballZ_lineinfile.py
Dec 06 09:43:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44273 DF PROTO=TCP SPT=59040 DPT=9102 SEQ=2761998328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9B1850000000001030307) 
Dec 06 09:43:49 np0005548788.localdomain sudo[217145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:49 np0005548788.localdomain python3.9[217147]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:49 np0005548788.localdomain sudo[217145]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:50 np0005548788.localdomain sudo[217255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iekakkuepixxdziwvxowqutwzqqtmtba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014229.8329804-1901-119854682997741/AnsiballZ_systemd.py
Dec 06 09:43:50 np0005548788.localdomain sudo[217255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:50 np0005548788.localdomain python3.9[217257]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:43:50 np0005548788.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 09:43:50 np0005548788.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 06 09:43:50 np0005548788.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 06 09:43:50 np0005548788.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 06 09:43:50 np0005548788.localdomain systemd-modules-load[217261]: Module 'msr' is built in
Dec 06 09:43:50 np0005548788.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 06 09:43:50 np0005548788.localdomain sudo[217255]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:51 np0005548788.localdomain sudo[217369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhbghctokoqhvjxmpqvjfvxcknrsofvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014231.0828118-1925-198708510339179/AnsiballZ_dnf.py
Dec 06 09:43:51 np0005548788.localdomain sudo[217369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:51 np0005548788.localdomain python3.9[217371]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:43:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44275 DF PROTO=TCP SPT=59040 DPT=9102 SEQ=2761998328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9BD710000000001030307) 
Dec 06 09:43:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:43:54 np0005548788.localdomain podman[217374]: 2025-12-06 09:43:54.270961251 +0000 UTC m=+0.085994947 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 09:43:54 np0005548788.localdomain podman[217374]: 2025-12-06 09:43:54.343682918 +0000 UTC m=+0.158716594 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:43:54 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:43:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:43:55 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:43:56 np0005548788.localdomain podman[217407]: 2025-12-06 09:43:56.037157239 +0000 UTC m=+0.091186523 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 09:43:56 np0005548788.localdomain systemd-rc-local-generator[217445]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:56 np0005548788.localdomain systemd-sysv-generator[217450]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:56 np0005548788.localdomain podman[217407]: 2025-12-06 09:43:56.093824284 +0000 UTC m=+0.147853528 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:43:56 np0005548788.localdomain systemd-rc-local-generator[217484]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:56 np0005548788.localdomain systemd-sysv-generator[217488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44276 DF PROTO=TCP SPT=59040 DPT=9102 SEQ=2761998328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9CD300000000001030307) 
Dec 06 09:43:56 np0005548788.localdomain systemd-logind[765]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 06 09:43:56 np0005548788.localdomain systemd-logind[765]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 06 09:43:56 np0005548788.localdomain lvm[217533]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:43:56 np0005548788.localdomain lvm[217532]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 09:43:56 np0005548788.localdomain lvm[217533]: VG ceph_vg0 finished
Dec 06 09:43:56 np0005548788.localdomain lvm[217532]: VG ceph_vg1 finished
Dec 06 09:43:56 np0005548788.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:43:57 np0005548788.localdomain systemd-rc-local-generator[217583]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:57 np0005548788.localdomain systemd-sysv-generator[217587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548788.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:43:58 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:43:58 np0005548788.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:43:58 np0005548788.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.399s CPU time.
Dec 06 09:43:58 np0005548788.localdomain systemd[1]: run-r6df1c7401e8448a587e80938990ff135.service: Deactivated successfully.
Dec 06 09:43:58 np0005548788.localdomain sudo[217369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52897 DF PROTO=TCP SPT=37684 DPT=9100 SEQ=4226942430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9D5F00000000001030307) 
Dec 06 09:43:59 np0005548788.localdomain python3.9[218828]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:44:00 np0005548788.localdomain sudo[218940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdihtdtynnjpglivqqmrngftfckqpxrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014240.3414218-1977-142909104665497/AnsiballZ_file.py
Dec 06 09:44:00 np0005548788.localdomain sudo[218940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:00 np0005548788.localdomain sudo[218943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:44:00 np0005548788.localdomain sudo[218943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:44:00 np0005548788.localdomain sudo[218943]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:00 np0005548788.localdomain sudo[218961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:44:00 np0005548788.localdomain sudo[218961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:44:00 np0005548788.localdomain python3.9[218942]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:00 np0005548788.localdomain sudo[218940]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:01 np0005548788.localdomain sudo[218961]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:02 np0005548788.localdomain sudo[219122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjxfyprqmeyuafinhtylksjqqkeutbtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014241.6789799-2010-229527838404018/AnsiballZ_systemd_service.py
Dec 06 09:44:02 np0005548788.localdomain sudo[219122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:02 np0005548788.localdomain sudo[219115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:44:02 np0005548788.localdomain sudo[219115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:44:02 np0005548788.localdomain sudo[219115]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:02 np0005548788.localdomain python3.9[219136]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:44:02 np0005548788.localdomain systemd-sysv-generator[219167]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:44:02 np0005548788.localdomain systemd-rc-local-generator[219164]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548788.localdomain sudo[219122]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:03 np0005548788.localdomain python3.9[219280]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:44:03 np0005548788.localdomain network[219297]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:44:03 np0005548788.localdomain network[219298]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:44:03 np0005548788.localdomain network[219299]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:44:04 np0005548788.localdomain sshd[219327]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:44:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29027 DF PROTO=TCP SPT=51156 DPT=9101 SEQ=2885977469 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9ED8E0000000001030307) 
Dec 06 09:44:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44277 DF PROTO=TCP SPT=59040 DPT=9102 SEQ=2761998328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9EDF00000000001030307) 
Dec 06 09:44:05 np0005548788.localdomain sshd[219327]: Received disconnect from 148.227.3.232 port 43768:11: Bye Bye [preauth]
Dec 06 09:44:05 np0005548788.localdomain sshd[219327]: Disconnected from authenticating user root 148.227.3.232 port 43768 [preauth]
Dec 06 09:44:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:44:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29029 DF PROTO=TCP SPT=51156 DPT=9101 SEQ=2885977469 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5E9F9B00000000001030307) 
Dec 06 09:44:08 np0005548788.localdomain sudo[219534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irgmabmloeehkrtwcuzkotimuypwsogf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014248.1191838-2067-269424121186958/AnsiballZ_systemd_service.py
Dec 06 09:44:08 np0005548788.localdomain sudo[219534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:08 np0005548788.localdomain python3.9[219536]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:08 np0005548788.localdomain sudo[219534]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:09 np0005548788.localdomain sudo[219645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ropjeejwcibmuytyfpthpzlobyxjylhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014249.5854635-2067-71421263806223/AnsiballZ_systemd_service.py
Dec 06 09:44:09 np0005548788.localdomain sudo[219645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:10 np0005548788.localdomain python3.9[219647]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41774 DF PROTO=TCP SPT=33176 DPT=9105 SEQ=1453123160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA03F00000000001030307) 
Dec 06 09:44:11 np0005548788.localdomain sudo[219645]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:11 np0005548788.localdomain sudo[219756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlhlyvrdvcjhswamwfceqtftameqthmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014251.3722658-2067-270020777167614/AnsiballZ_systemd_service.py
Dec 06 09:44:11 np0005548788.localdomain sudo[219756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:11 np0005548788.localdomain python3.9[219758]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:13 np0005548788.localdomain sudo[219756]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:13 np0005548788.localdomain sudo[219867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxulttljfskjqgmggmcoolxtlkfadwzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014253.1554105-2067-271225032708217/AnsiballZ_systemd_service.py
Dec 06 09:44:13 np0005548788.localdomain sudo[219867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36128 DF PROTO=TCP SPT=46436 DPT=9100 SEQ=1463691848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA0FAF0000000001030307) 
Dec 06 09:44:13 np0005548788.localdomain python3.9[219869]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:44:13 np0005548788.localdomain sudo[219867]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:13 np0005548788.localdomain podman[219871]: 2025-12-06 09:44:13.98732685 +0000 UTC m=+0.092845808 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 09:44:14 np0005548788.localdomain podman[219871]: 2025-12-06 09:44:14.008638171 +0000 UTC m=+0.114157109 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 09:44:14 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:44:14 np0005548788.localdomain sudo[219996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnlfgxszkwvznfudhrqburqikbhkiade ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014254.058265-2067-70008874890046/AnsiballZ_systemd_service.py
Dec 06 09:44:14 np0005548788.localdomain sudo[219996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:14 np0005548788.localdomain python3.9[219998]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:14 np0005548788.localdomain sudo[219996]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:15 np0005548788.localdomain sudo[220107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsxaihvgclqoiitwzwfpdweyvrkpesyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014254.8778274-2067-239560389669191/AnsiballZ_systemd_service.py
Dec 06 09:44:15 np0005548788.localdomain sudo[220107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:15 np0005548788.localdomain python3.9[220109]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:15 np0005548788.localdomain sudo[220107]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:16 np0005548788.localdomain sudo[220218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jolbbdtcptnraklqmnvmpqwnfwawgyky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014255.6612062-2067-128968254500953/AnsiballZ_systemd_service.py
Dec 06 09:44:16 np0005548788.localdomain sudo[220218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:16 np0005548788.localdomain python3.9[220220]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:16 np0005548788.localdomain sudo[220218]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36130 DF PROTO=TCP SPT=46436 DPT=9100 SEQ=1463691848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA1BB00000000001030307) 
Dec 06 09:44:16 np0005548788.localdomain sudo[220329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywmfskgzmtkyliagetwxdcskqetzojxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014256.6149921-2067-172486567088149/AnsiballZ_systemd_service.py
Dec 06 09:44:16 np0005548788.localdomain sudo[220329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:17 np0005548788.localdomain python3.9[220331]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:18 np0005548788.localdomain sudo[220329]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:19 np0005548788.localdomain sudo[220440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnoyoktrlsutilbkertubecthpjugzja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014258.7612402-2244-178702038550890/AnsiballZ_file.py
Dec 06 09:44:19 np0005548788.localdomain sudo[220440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:19 np0005548788.localdomain python3.9[220442]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:19 np0005548788.localdomain sudo[220440]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65057 DF PROTO=TCP SPT=35002 DPT=9102 SEQ=3259797286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA26B60000000001030307) 
Dec 06 09:44:19 np0005548788.localdomain sudo[220550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgtxkgjjrbbbcqorfrqzlvskykbihmqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014259.3728468-2244-161059669937839/AnsiballZ_file.py
Dec 06 09:44:19 np0005548788.localdomain sudo[220550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:19 np0005548788.localdomain python3.9[220552]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:19 np0005548788.localdomain sudo[220550]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:20 np0005548788.localdomain sudo[220660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqaeueqxoguzoumdwqmstksnsdobttjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014259.9830449-2244-200261563602583/AnsiballZ_file.py
Dec 06 09:44:20 np0005548788.localdomain sudo[220660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:20 np0005548788.localdomain python3.9[220662]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:20 np0005548788.localdomain sudo[220660]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:20 np0005548788.localdomain sudo[220770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvswgvrzcgagvukwlkhdxipgkdjzpcip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014260.674376-2244-129684706400836/AnsiballZ_file.py
Dec 06 09:44:20 np0005548788.localdomain sudo[220770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:21 np0005548788.localdomain python3.9[220772]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:21 np0005548788.localdomain sudo[220770]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:22 np0005548788.localdomain sudo[220880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcjcinrzhdmrqmuvhbjpwsfvkisvxudp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014261.3819106-2244-119750007213771/AnsiballZ_file.py
Dec 06 09:44:22 np0005548788.localdomain sudo[220880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:22 np0005548788.localdomain python3.9[220882]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:22 np0005548788.localdomain sudo[220880]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65059 DF PROTO=TCP SPT=35002 DPT=9102 SEQ=3259797286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA32B00000000001030307) 
Dec 06 09:44:22 np0005548788.localdomain sudo[220990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdstjmdxfcclmjzbjrylnzuxgqhjjpmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014262.5754683-2244-146986372855643/AnsiballZ_file.py
Dec 06 09:44:22 np0005548788.localdomain sudo[220990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:23 np0005548788.localdomain python3.9[220992]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:23 np0005548788.localdomain sudo[220990]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:23 np0005548788.localdomain sudo[221100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ieawfdrpkwnxhtnjfrffjnoiddhszvoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014263.2919-2244-86647198890928/AnsiballZ_file.py
Dec 06 09:44:23 np0005548788.localdomain sudo[221100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:23 np0005548788.localdomain python3.9[221102]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:23 np0005548788.localdomain sudo[221100]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:24 np0005548788.localdomain sudo[221210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrpihvmufoekwaixuaaoburrxaycbeip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014263.9713862-2244-123656571273035/AnsiballZ_file.py
Dec 06 09:44:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:44:25 np0005548788.localdomain sudo[221210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:25 np0005548788.localdomain systemd[1]: tmp-crun.K9EduX.mount: Deactivated successfully.
Dec 06 09:44:25 np0005548788.localdomain podman[221212]: 2025-12-06 09:44:25.115244565 +0000 UTC m=+0.092083603 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 06 09:44:25 np0005548788.localdomain podman[221212]: 2025-12-06 09:44:25.155687881 +0000 UTC m=+0.132526929 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Dec 06 09:44:25 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:44:25 np0005548788.localdomain python3.9[221213]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:25 np0005548788.localdomain sudo[221210]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:25 np0005548788.localdomain sudo[221345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hetvvdsiuiavwxmlifwmgqaenuyoavnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014265.4611568-2415-60474839480006/AnsiballZ_file.py
Dec 06 09:44:25 np0005548788.localdomain sudo[221345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:25 np0005548788.localdomain python3.9[221347]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:26 np0005548788.localdomain sudo[221345]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:26 np0005548788.localdomain sudo[221455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-serqdaijtbwdibgjvozotjvcjuezuwxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014266.1555998-2415-90682299210618/AnsiballZ_file.py
Dec 06 09:44:26 np0005548788.localdomain sudo[221455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:44:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65060 DF PROTO=TCP SPT=35002 DPT=9102 SEQ=3259797286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA42700000000001030307) 
Dec 06 09:44:26 np0005548788.localdomain systemd[1]: tmp-crun.0SrBNI.mount: Deactivated successfully.
Dec 06 09:44:26 np0005548788.localdomain podman[221458]: 2025-12-06 09:44:26.567022397 +0000 UTC m=+0.092891270 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 09:44:26 np0005548788.localdomain podman[221458]: 2025-12-06 09:44:26.600692095 +0000 UTC m=+0.126560928 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:44:26 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:44:26 np0005548788.localdomain python3.9[221457]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:26 np0005548788.localdomain sudo[221455]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:27 np0005548788.localdomain sudo[221584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghzxyoapbwlnrqkyfyscpjghmwbwnqew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014266.847464-2415-91579456450479/AnsiballZ_file.py
Dec 06 09:44:27 np0005548788.localdomain sudo[221584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:27 np0005548788.localdomain python3.9[221586]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:27 np0005548788.localdomain sudo[221584]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:27 np0005548788.localdomain sudo[221694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mawnnfufsdbqojzhhlcjppqwvvyjoolz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014267.4833791-2415-112146471149643/AnsiballZ_file.py
Dec 06 09:44:28 np0005548788.localdomain sudo[221694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:28 np0005548788.localdomain python3.9[221696]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:28 np0005548788.localdomain sudo[221694]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:28 np0005548788.localdomain sudo[221804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txrudmlsgvtgnmudrrojmiupjfvfezwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014268.3731728-2415-62812026809626/AnsiballZ_file.py
Dec 06 09:44:28 np0005548788.localdomain sudo[221804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:28 np0005548788.localdomain python3.9[221806]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:28 np0005548788.localdomain sudo[221804]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36132 DF PROTO=TCP SPT=46436 DPT=9100 SEQ=1463691848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA4BF10000000001030307) 
Dec 06 09:44:29 np0005548788.localdomain sudo[221914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjierdadpirejykfahjriopudvwmwcun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014269.0332167-2415-188181649311999/AnsiballZ_file.py
Dec 06 09:44:29 np0005548788.localdomain sudo[221914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:29 np0005548788.localdomain python3.9[221916]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:29 np0005548788.localdomain sudo[221914]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:29 np0005548788.localdomain sudo[222024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcyvmvgfkgxunygrmrzcdfjlqxoxkcfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014269.6727188-2415-121792141990572/AnsiballZ_file.py
Dec 06 09:44:29 np0005548788.localdomain sudo[222024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:30 np0005548788.localdomain python3.9[222026]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:30 np0005548788.localdomain sudo[222024]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:30 np0005548788.localdomain sudo[222134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyahbkqbkbbllimhtedbvczmytxkgrkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014270.3340375-2415-16565714017533/AnsiballZ_file.py
Dec 06 09:44:30 np0005548788.localdomain sudo[222134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:30 np0005548788.localdomain python3.9[222136]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:30 np0005548788.localdomain sudo[222134]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:31 np0005548788.localdomain sudo[222244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbyylscigqdtlnxwrelorncrepwxkhlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014271.1025224-2589-94249901089330/AnsiballZ_command.py
Dec 06 09:44:31 np0005548788.localdomain sudo[222244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:31 np0005548788.localdomain python3.9[222246]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:31 np0005548788.localdomain sudo[222244]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:32 np0005548788.localdomain python3.9[222356]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:44:33 np0005548788.localdomain sudo[222464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eukgldpslzgppgvhgeketctdlzvllslr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014272.8067944-2643-32506084998628/AnsiballZ_systemd_service.py
Dec 06 09:44:33 np0005548788.localdomain sudo[222464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:33 np0005548788.localdomain python3.9[222466]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:44:33 np0005548788.localdomain systemd-sysv-generator[222497]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:44:33 np0005548788.localdomain systemd-rc-local-generator[222494]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548788.localdomain sudo[222464]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65061 DF PROTO=TCP SPT=35002 DPT=9102 SEQ=3259797286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA61F00000000001030307) 
Dec 06 09:44:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50868 DF PROTO=TCP SPT=51410 DPT=9101 SEQ=3889365284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA62BD0000000001030307) 
Dec 06 09:44:35 np0005548788.localdomain sudo[222610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvwdlxxeyleccrmjuzrrncmjlvklxcmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014273.977169-2667-95146862923796/AnsiballZ_command.py
Dec 06 09:44:35 np0005548788.localdomain sudo[222610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:35 np0005548788.localdomain python3.9[222612]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:35 np0005548788.localdomain sudo[222610]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:36 np0005548788.localdomain sudo[222721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxczzddwaghogsxrypuzifpvikcsjmkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014275.7023194-2667-25173919889244/AnsiballZ_command.py
Dec 06 09:44:36 np0005548788.localdomain sudo[222721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:36 np0005548788.localdomain python3.9[222723]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:36 np0005548788.localdomain sudo[222721]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:36 np0005548788.localdomain sudo[222832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imwrnwbouwiqezqouvufjzeyrqfezufe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014276.3860888-2667-198550690117580/AnsiballZ_command.py
Dec 06 09:44:36 np0005548788.localdomain sudo[222832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:36 np0005548788.localdomain python3.9[222834]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:36 np0005548788.localdomain sudo[222832]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50870 DF PROTO=TCP SPT=51410 DPT=9101 SEQ=3889365284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA6EB10000000001030307) 
Dec 06 09:44:38 np0005548788.localdomain sudo[222943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckhhjknzhkcidiltgyzrxryepbydmbcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014277.7063253-2667-269625237337478/AnsiballZ_command.py
Dec 06 09:44:38 np0005548788.localdomain sudo[222943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:38 np0005548788.localdomain python3.9[222945]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:38 np0005548788.localdomain sudo[222943]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:38 np0005548788.localdomain sudo[223054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chtuvdxjywrzwavgxkayunqdmspmatdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014278.3964694-2667-269587004880852/AnsiballZ_command.py
Dec 06 09:44:38 np0005548788.localdomain sudo[223054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:38 np0005548788.localdomain python3.9[223056]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:38 np0005548788.localdomain sudo[223054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:39 np0005548788.localdomain sudo[223165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rynxagithxxlintnzldhrqkyydfrdrir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014279.0575724-2667-48901185481755/AnsiballZ_command.py
Dec 06 09:44:39 np0005548788.localdomain sudo[223165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:39 np0005548788.localdomain python3.9[223167]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:39 np0005548788.localdomain sudo[223165]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:39 np0005548788.localdomain sudo[223276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogypwsiijvpoyjuleigfgrzlwuqfvkgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014279.7031565-2667-176866456483577/AnsiballZ_command.py
Dec 06 09:44:39 np0005548788.localdomain sudo[223276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:40 np0005548788.localdomain python3.9[223278]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:40 np0005548788.localdomain sudo[223276]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:40 np0005548788.localdomain sudo[223387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veavvfulzinupgyycsfmhwoofjqjhizh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014280.3991187-2667-76262142252552/AnsiballZ_command.py
Dec 06 09:44:40 np0005548788.localdomain sudo[223387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64152 DF PROTO=TCP SPT=37358 DPT=9105 SEQ=1143106598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA79F00000000001030307) 
Dec 06 09:44:40 np0005548788.localdomain python3.9[223389]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:40 np0005548788.localdomain sudo[223387]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:42 np0005548788.localdomain sudo[223498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pafqmiegfhgbtwrfsjenfniukqrgpqho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014282.3143604-2874-158029403339080/AnsiballZ_file.py
Dec 06 09:44:42 np0005548788.localdomain sudo[223498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:42 np0005548788.localdomain python3.9[223500]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:42 np0005548788.localdomain sudo[223498]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:43 np0005548788.localdomain sudo[223608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqtpcrumyhhbliiexpkqwkzczspmtdpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014282.9292123-2874-205662243629309/AnsiballZ_file.py
Dec 06 09:44:43 np0005548788.localdomain sudo[223608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:43 np0005548788.localdomain python3.9[223610]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:43 np0005548788.localdomain sudo[223608]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61704 DF PROTO=TCP SPT=57746 DPT=9100 SEQ=3498847918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA84E00000000001030307) 
Dec 06 09:44:43 np0005548788.localdomain sudo[223718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbgrcszcijwalwsomjumssdehcpqixhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014283.571421-2874-146114458551841/AnsiballZ_file.py
Dec 06 09:44:43 np0005548788.localdomain sudo[223718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:44 np0005548788.localdomain python3.9[223720]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:44 np0005548788.localdomain sudo[223718]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:44:44 np0005548788.localdomain systemd[1]: tmp-crun.u9ILCY.mount: Deactivated successfully.
Dec 06 09:44:44 np0005548788.localdomain podman[223738]: 2025-12-06 09:44:44.278574823 +0000 UTC m=+0.094997251 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:44:44 np0005548788.localdomain podman[223738]: 2025-12-06 09:44:44.322651953 +0000 UTC m=+0.139074341 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 09:44:44 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:44:44 np0005548788.localdomain sudo[223845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvzmrbximnxrmyrlljlinkqwjfaduvcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014284.3093064-2940-42245840223017/AnsiballZ_file.py
Dec 06 09:44:44 np0005548788.localdomain sudo[223845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:44 np0005548788.localdomain python3.9[223847]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:44 np0005548788.localdomain sudo[223845]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:45 np0005548788.localdomain sudo[223955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgtntalbovwscogkpcgvsobpmcndympb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014285.0040977-2940-250327439181302/AnsiballZ_file.py
Dec 06 09:44:45 np0005548788.localdomain sudo[223955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:45 np0005548788.localdomain python3.9[223957]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:45 np0005548788.localdomain sudo[223955]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:45 np0005548788.localdomain sudo[224065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilwrvaqcnlzteilntlfqtihghsyyekld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014285.677969-2940-89004557178100/AnsiballZ_file.py
Dec 06 09:44:45 np0005548788.localdomain sudo[224065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:46 np0005548788.localdomain python3.9[224067]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:46 np0005548788.localdomain sudo[224065]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61706 DF PROTO=TCP SPT=57746 DPT=9100 SEQ=3498847918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA90F10000000001030307) 
Dec 06 09:44:46 np0005548788.localdomain sudo[224175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iryqofvznnyeefcdzvnlncylqvrzonvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014286.3397884-2940-101046959721023/AnsiballZ_file.py
Dec 06 09:44:46 np0005548788.localdomain sudo[224175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:46 np0005548788.localdomain python3.9[224177]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:46 np0005548788.localdomain sudo[224175]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:47 np0005548788.localdomain sudo[224285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-miyqncociuurwtbjjuheodxbrosisant ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014287.0135942-2940-182897747907414/AnsiballZ_file.py
Dec 06 09:44:47 np0005548788.localdomain sudo[224285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:44:47.402 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:44:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:44:47.402 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:44:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:44:47.402 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:44:47 np0005548788.localdomain python3.9[224287]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:47 np0005548788.localdomain sudo[224285]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:48 np0005548788.localdomain sudo[224395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siylkxwbdcummcmxehnhdtgcpdvmidod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014287.6568666-2940-234156620479906/AnsiballZ_file.py
Dec 06 09:44:48 np0005548788.localdomain sudo[224395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:49 np0005548788.localdomain python3.9[224397]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:49 np0005548788.localdomain sudo[224395]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53917 DF PROTO=TCP SPT=40560 DPT=9102 SEQ=2030306770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EA9BE40000000001030307) 
Dec 06 09:44:49 np0005548788.localdomain sudo[224505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpnrgtfpwvmxjzjwuutpxgajgnvniwmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014289.27557-2940-50953407280793/AnsiballZ_file.py
Dec 06 09:44:49 np0005548788.localdomain sudo[224505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:49 np0005548788.localdomain python3.9[224507]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:49 np0005548788.localdomain sudo[224505]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53919 DF PROTO=TCP SPT=40560 DPT=9102 SEQ=2030306770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EAA7F00000000001030307) 
Dec 06 09:44:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:44:56 np0005548788.localdomain podman[224563]: 2025-12-06 09:44:56.267846689 +0000 UTC m=+0.085807020 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 09:44:56 np0005548788.localdomain podman[224563]: 2025-12-06 09:44:56.303247252 +0000 UTC m=+0.121207563 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 09:44:56 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:44:56 np0005548788.localdomain sudo[224639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbozaifjddjypciznkknjzdffnwhjlhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014295.9581857-3265-199096107090333/AnsiballZ_getent.py
Dec 06 09:44:56 np0005548788.localdomain sudo[224639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53920 DF PROTO=TCP SPT=40560 DPT=9102 SEQ=2030306770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EAB7B00000000001030307) 
Dec 06 09:44:56 np0005548788.localdomain python3.9[224641]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 06 09:44:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:44:56 np0005548788.localdomain sudo[224639]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:56 np0005548788.localdomain systemd[1]: tmp-crun.MxLFcG.mount: Deactivated successfully.
Dec 06 09:44:56 np0005548788.localdomain podman[224643]: 2025-12-06 09:44:56.773483059 +0000 UTC m=+0.089562236 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:44:56 np0005548788.localdomain podman[224643]: 2025-12-06 09:44:56.783814448 +0000 UTC m=+0.099893655 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 09:44:56 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:44:57 np0005548788.localdomain sudo[224769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-narcwgxjkalhkebdgfuyxplwirgcmsrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014296.867666-3289-55640503668581/AnsiballZ_group.py
Dec 06 09:44:57 np0005548788.localdomain sudo[224769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:57 np0005548788.localdomain python3.9[224771]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:44:57 np0005548788.localdomain groupadd[224772]: group added to /etc/group: name=nova, GID=42436
Dec 06 09:44:57 np0005548788.localdomain groupadd[224772]: group added to /etc/gshadow: name=nova
Dec 06 09:44:57 np0005548788.localdomain groupadd[224772]: new group: name=nova, GID=42436
Dec 06 09:44:57 np0005548788.localdomain sudo[224769]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:58 np0005548788.localdomain sudo[224885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itrwdfewppxhegetonnogazreaohiwte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014297.782026-3313-121543912577220/AnsiballZ_user.py
Dec 06 09:44:58 np0005548788.localdomain sudo[224885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:58 np0005548788.localdomain python3.9[224887]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548788.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:44:58 np0005548788.localdomain useradd[224889]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 06 09:44:58 np0005548788.localdomain useradd[224889]: add 'nova' to group 'libvirt'
Dec 06 09:44:58 np0005548788.localdomain useradd[224889]: add 'nova' to shadow group 'libvirt'
Dec 06 09:44:58 np0005548788.localdomain sudo[224885]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61708 DF PROTO=TCP SPT=57746 DPT=9100 SEQ=3498847918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EAC1F00000000001030307) 
Dec 06 09:44:59 np0005548788.localdomain sshd[224913]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:44:59 np0005548788.localdomain sshd[224913]: Accepted publickey for zuul from 192.168.122.30 port 46752 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:44:59 np0005548788.localdomain systemd-logind[765]: New session 55 of user zuul.
Dec 06 09:44:59 np0005548788.localdomain systemd[1]: Started Session 55 of User zuul.
Dec 06 09:44:59 np0005548788.localdomain sshd[224913]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:44:59 np0005548788.localdomain sshd[224916]: Received disconnect from 192.168.122.30 port 46752:11: disconnected by user
Dec 06 09:44:59 np0005548788.localdomain sshd[224916]: Disconnected from user zuul 192.168.122.30 port 46752
Dec 06 09:44:59 np0005548788.localdomain sshd[224913]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:44:59 np0005548788.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Dec 06 09:44:59 np0005548788.localdomain systemd-logind[765]: Session 55 logged out. Waiting for processes to exit.
Dec 06 09:44:59 np0005548788.localdomain systemd-logind[765]: Removed session 55.
Dec 06 09:45:00 np0005548788.localdomain sshd[225025]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:45:00 np0005548788.localdomain python3.9[225024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:02 np0005548788.localdomain python3.9[225112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014300.121064-3388-210709673532557/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:02 np0005548788.localdomain sudo[225137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:45:02 np0005548788.localdomain sudo[225137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:02 np0005548788.localdomain sudo[225137]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:02 np0005548788.localdomain sudo[225186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:45:02 np0005548788.localdomain sudo[225186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:02 np0005548788.localdomain python3.9[225256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:03 np0005548788.localdomain python3.9[225352]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:03 np0005548788.localdomain podman[225385]: 2025-12-06 09:45:03.423428126 +0000 UTC m=+0.112934007 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container)
Dec 06 09:45:03 np0005548788.localdomain podman[225385]: 2025-12-06 09:45:03.55863716 +0000 UTC m=+0.248143041 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, name=rhceph)
Dec 06 09:45:03 np0005548788.localdomain sudo[225186]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:03 np0005548788.localdomain sudo[225474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:45:03 np0005548788.localdomain sudo[225474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:03 np0005548788.localdomain sudo[225474]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:04 np0005548788.localdomain sudo[225522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:45:04 np0005548788.localdomain sudo[225522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:04 np0005548788.localdomain python3.9[225592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:04 np0005548788.localdomain sudo[225522]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17576 DF PROTO=TCP SPT=38236 DPT=9101 SEQ=4226545680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EAD7EE0000000001030307) 
Dec 06 09:45:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35005 DF PROTO=TCP SPT=59024 DPT=9882 SEQ=2161238661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EAD7F00000000001030307) 
Dec 06 09:45:05 np0005548788.localdomain python3.9[225709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014303.9398797-3388-54759249513778/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:05 np0005548788.localdomain sudo[225727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:45:05 np0005548788.localdomain sudo[225727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:05 np0005548788.localdomain sudo[225727]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:05 np0005548788.localdomain python3.9[225835]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:06 np0005548788.localdomain python3.9[225921]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014305.174637-3388-118959577552859/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=975829b10d65b228c43d1745a85328aeeb7df1e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:06 np0005548788.localdomain python3.9[226029]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:07 np0005548788.localdomain python3.9[226115]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014306.3702445-3388-216940536058681/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17578 DF PROTO=TCP SPT=38236 DPT=9101 SEQ=4226545680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EAE3F20000000001030307) 
Dec 06 09:45:08 np0005548788.localdomain python3.9[226223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:08 np0005548788.localdomain python3.9[226309]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014307.6203325-3388-46005530908686/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:09 np0005548788.localdomain sudo[226417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iywyrecbpmiozpgbyyupvpgogrsjzpus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014308.916416-3637-96101985993878/AnsiballZ_file.py
Dec 06 09:45:09 np0005548788.localdomain sudo[226417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:09 np0005548788.localdomain python3.9[226419]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:09 np0005548788.localdomain sudo[226417]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:09 np0005548788.localdomain sudo[226527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocbpqzvcslztntkzrmudydfxngjhvnkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014309.619378-3661-236308348749816/AnsiballZ_copy.py
Dec 06 09:45:09 np0005548788.localdomain sudo[226527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:10 np0005548788.localdomain python3.9[226529]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:10 np0005548788.localdomain sudo[226527]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4925 DF PROTO=TCP SPT=47114 DPT=9105 SEQ=3748518699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EAEDF00000000001030307) 
Dec 06 09:45:10 np0005548788.localdomain sudo[226637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpdanijgxrgzhdgzvztpljhbzjrwtazo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014310.2970889-3685-259544261985206/AnsiballZ_stat.py
Dec 06 09:45:10 np0005548788.localdomain sudo[226637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:10 np0005548788.localdomain python3.9[226639]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:10 np0005548788.localdomain sudo[226637]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:11 np0005548788.localdomain sudo[226749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygejltwcnnydlwprcsafrnmnxghmuqsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014311.0971844-3712-272313776143590/AnsiballZ_file.py
Dec 06 09:45:11 np0005548788.localdomain sudo[226749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:11 np0005548788.localdomain python3.9[226751]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:11 np0005548788.localdomain sudo[226749]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:12 np0005548788.localdomain python3.9[226859]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:12 np0005548788.localdomain python3.9[226969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:13 np0005548788.localdomain python3.9[227055]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014312.5543532-3763-45702825849002/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21669 DF PROTO=TCP SPT=37762 DPT=9100 SEQ=1452489690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EAFA100000000001030307) 
Dec 06 09:45:14 np0005548788.localdomain python3.9[227163]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:14 np0005548788.localdomain python3.9[227249]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014313.7106178-3808-131192151794715/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:45:15 np0005548788.localdomain podman[227267]: 2025-12-06 09:45:15.26830169 +0000 UTC m=+0.088022369 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:45:15 np0005548788.localdomain podman[227267]: 2025-12-06 09:45:15.308664036 +0000 UTC m=+0.128384655 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 09:45:15 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:45:15 np0005548788.localdomain sudo[227377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcjvirnnjpvhexvktmspeueheswqladm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014315.2297013-3859-190042852933697/AnsiballZ_container_config_data.py
Dec 06 09:45:15 np0005548788.localdomain sudo[227377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:15 np0005548788.localdomain python3.9[227379]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 06 09:45:15 np0005548788.localdomain sudo[227377]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:16 np0005548788.localdomain sudo[227487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agrdbwvkjdlqiedupnciidlklaazdeek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014316.1064339-3886-56113745342/AnsiballZ_container_config_hash.py
Dec 06 09:45:16 np0005548788.localdomain sudo[227487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:16 np0005548788.localdomain python3.9[227489]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:45:16 np0005548788.localdomain sudo[227487]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21671 DF PROTO=TCP SPT=37762 DPT=9100 SEQ=1452489690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB06300000000001030307) 
Dec 06 09:45:17 np0005548788.localdomain sudo[227597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stkzvrfmlekzzrzuyjxuoxbbtrpjjxaa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014317.0810342-3916-224315589051578/AnsiballZ_edpm_container_manage.py
Dec 06 09:45:17 np0005548788.localdomain sudo[227597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:17 np0005548788.localdomain python3[227599]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:45:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47903 DF PROTO=TCP SPT=55104 DPT=9102 SEQ=3180477417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB11150000000001030307) 
Dec 06 09:45:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47905 DF PROTO=TCP SPT=55104 DPT=9102 SEQ=3180477417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB1D300000000001030307) 
Dec 06 09:45:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47906 DF PROTO=TCP SPT=55104 DPT=9102 SEQ=3180477417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB2CF00000000001030307) 
Dec 06 09:45:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:45:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:45:28 np0005548788.localdomain podman[227653]: 2025-12-06 09:45:28.871342944 +0000 UTC m=+1.673115713 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 09:45:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21673 DF PROTO=TCP SPT=37762 DPT=9100 SEQ=1452489690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB35F10000000001030307) 
Dec 06 09:45:28 np0005548788.localdomain podman[227653]: 2025-12-06 09:45:28.912762083 +0000 UTC m=+1.714534842 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:45:28 np0005548788.localdomain systemd[1]: tmp-crun.5Q1LMC.mount: Deactivated successfully.
Dec 06 09:45:28 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:45:28 np0005548788.localdomain podman[227614]: 2025-12-06 09:45:17.737690156 +0000 UTC m=+0.034848837 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:45:28 np0005548788.localdomain podman[227652]: 2025-12-06 09:45:28.943540553 +0000 UTC m=+1.743107414 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:45:28 np0005548788.localdomain podman[227652]: 2025-12-06 09:45:28.955669197 +0000 UTC m=+1.755236048 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:45:28 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:45:29 np0005548788.localdomain podman[227720]: 
Dec 06 09:45:29 np0005548788.localdomain podman[227720]: 2025-12-06 09:45:29.184741099 +0000 UTC m=+0.093840647 container create 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 06 09:45:29 np0005548788.localdomain podman[227720]: 2025-12-06 09:45:29.141076861 +0000 UTC m=+0.050176479 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:45:29 np0005548788.localdomain python3[227599]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 06 09:45:29 np0005548788.localdomain sudo[227597]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:29 np0005548788.localdomain sudo[227865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixrjmohpfjnlcxjwgqktejcivhpeguty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014329.628661-3940-125494946028370/AnsiballZ_stat.py
Dec 06 09:45:29 np0005548788.localdomain sudo[227865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:30 np0005548788.localdomain python3.9[227867]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:30 np0005548788.localdomain sudo[227865]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:31 np0005548788.localdomain sudo[227977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmkgjpphledljxwojelypkfsavvlcxgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014330.9480584-3976-29447938482699/AnsiballZ_container_config_data.py
Dec 06 09:45:31 np0005548788.localdomain sudo[227977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:31 np0005548788.localdomain python3.9[227979]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 06 09:45:31 np0005548788.localdomain sudo[227977]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:32 np0005548788.localdomain sudo[228087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atamsrnvrkdsklzrbktqtmruukcowvii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014331.8078232-4003-241135749523974/AnsiballZ_container_config_hash.py
Dec 06 09:45:32 np0005548788.localdomain sudo[228087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:32 np0005548788.localdomain python3.9[228089]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:45:32 np0005548788.localdomain sudo[228087]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:33 np0005548788.localdomain sudo[228197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdhoytpcpyutpqojoadxqgisgkeuqnnt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014332.8053653-4033-140533226610096/AnsiballZ_edpm_container_manage.py
Dec 06 09:45:33 np0005548788.localdomain sudo[228197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:33 np0005548788.localdomain python3[228199]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:45:33 np0005548788.localdomain python3[228199]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:45:34 np0005548788.localdomain podman[228251]: 2025-12-06 09:45:34.751550498 +0000 UTC m=+0.978053436 container remove 56b87b162c9519f4efd3e391e66214936d37678975058f94b1dcb1ab31e9ae76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c6dd5c7aeba6260998a0bbde3ab20933-558ed7a6d0c1bb3d92c212dc57d9717b'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible)
Dec 06 09:45:34 np0005548788.localdomain python3[228199]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Dec 06 09:45:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2156 DF PROTO=TCP SPT=58530 DPT=9101 SEQ=4185925907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB4D1E0000000001030307) 
Dec 06 09:45:34 np0005548788.localdomain podman[228266]: 
Dec 06 09:45:34 np0005548788.localdomain podman[228266]: 2025-12-06 09:45:34.847967855 +0000 UTC m=+0.078599157 container create b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 06 09:45:34 np0005548788.localdomain podman[228266]: 2025-12-06 09:45:34.807747503 +0000 UTC m=+0.038378855 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:45:34 np0005548788.localdomain python3[228199]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 06 09:45:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5997 DF PROTO=TCP SPT=57480 DPT=9882 SEQ=4089470096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB4DF00000000001030307) 
Dec 06 09:45:35 np0005548788.localdomain sudo[228197]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:35 np0005548788.localdomain sudo[228412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvfazlzdybqcshtyfxelseidomxobqna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014335.268428-4057-73122016377368/AnsiballZ_stat.py
Dec 06 09:45:35 np0005548788.localdomain sudo[228412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:35 np0005548788.localdomain python3.9[228414]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:35 np0005548788.localdomain sudo[228412]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:36 np0005548788.localdomain sudo[228524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eodshszwpyrvxorgpcckakvanzvvcjie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014336.19021-4084-20202040114806/AnsiballZ_file.py
Dec 06 09:45:36 np0005548788.localdomain sudo[228524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:36 np0005548788.localdomain python3.9[228526]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:36 np0005548788.localdomain sudo[228524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:37 np0005548788.localdomain sudo[228633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkckqdhhxlrjsyjfanyaglixehoqkdcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014336.7947388-4084-57465066074218/AnsiballZ_copy.py
Dec 06 09:45:37 np0005548788.localdomain sudo[228633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:37 np0005548788.localdomain python3.9[228635]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014336.7947388-4084-57465066074218/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:37 np0005548788.localdomain sudo[228633]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:37 np0005548788.localdomain sudo[228688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmzzsxiapbivkmvkeauxgppcivnfziwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014336.7947388-4084-57465066074218/AnsiballZ_systemd.py
Dec 06 09:45:37 np0005548788.localdomain sudo[228688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2158 DF PROTO=TCP SPT=58530 DPT=9101 SEQ=4185925907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB59300000000001030307) 
Dec 06 09:45:38 np0005548788.localdomain python3.9[228690]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:45:38 np0005548788.localdomain systemd-sysv-generator[228720]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:45:38 np0005548788.localdomain systemd-rc-local-generator[228717]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548788.localdomain sudo[228688]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:38 np0005548788.localdomain sudo[228778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnemnxljjaxdzphqjtvaqclhmfucwfmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014336.7947388-4084-57465066074218/AnsiballZ_systemd.py
Dec 06 09:45:38 np0005548788.localdomain sudo[228778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:39 np0005548788.localdomain python3.9[228780]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:45:39 np0005548788.localdomain systemd-rc-local-generator[228810]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:45:39 np0005548788.localdomain systemd-sysv-generator[228813]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: Starting nova_compute container...
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:45:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548788.localdomain podman[228821]: 2025-12-06 09:45:39.571254363 +0000 UTC m=+0.142188771 container init b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:45:39 np0005548788.localdomain podman[228821]: 2025-12-06 09:45:39.580974422 +0000 UTC m=+0.151908830 container start b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 06 09:45:39 np0005548788.localdomain podman[228821]: nova_compute
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + sudo -E kolla_set_configs
Dec 06 09:45:39 np0005548788.localdomain systemd[1]: Started nova_compute container.
Dec 06 09:45:39 np0005548788.localdomain sudo[228778]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Validating config file
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying service configuration files
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Deleting /etc/ceph
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Creating directory /etc/ceph
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/ceph
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Writing out command to execute
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: ++ cat /run_command
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + CMD=nova-compute
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + ARGS=
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + sudo kolla_copy_cacerts
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + [[ ! -n '' ]]
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + . kolla_extend_start
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + echo 'Running command: '\''nova-compute'\'''
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: Running command: 'nova-compute'
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + umask 0022
Dec 06 09:45:39 np0005548788.localdomain nova_compute[228836]: + exec nova-compute
Dec 06 09:45:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64901 DF PROTO=TCP SPT=43720 DPT=9105 SEQ=1938518822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB63F00000000001030307) 
Dec 06 09:45:40 np0005548788.localdomain python3.9[228956]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:41 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:41.506 228840 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:41 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:41.507 228840 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:41 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:41.507 228840 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:41 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:41.507 228840 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 06 09:45:41 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:41.633 228840 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:41 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:41.656 228840 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:41 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:41.656 228840 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 06 09:45:41 np0005548788.localdomain python3.9[229066]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.064 228840 INFO nova.virt.driver [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.176 228840 INFO nova.compute.provider_config [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.183 228840 WARNING nova.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.183 228840 DEBUG oslo_concurrency.lockutils [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.183 228840 DEBUG oslo_concurrency.lockutils [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.184 228840 DEBUG oslo_concurrency.lockutils [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.184 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.184 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.184 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.184 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.184 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.185 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.185 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.185 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.185 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.185 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.185 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.185 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.186 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.186 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.186 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.186 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.186 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.186 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.186 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.186 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] console_host                   = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.187 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.187 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.187 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.187 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.187 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.187 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.187 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.188 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.188 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.188 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.188 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.188 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.188 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.188 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.189 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.189 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.189 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.189 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.189 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.189 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.189 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.190 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.190 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.190 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.190 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.190 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.190 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.190 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.191 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.191 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.191 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.191 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.191 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.191 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.191 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.192 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.192 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.192 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.192 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.192 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.192 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.192 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.192 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.193 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.193 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.193 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.193 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.193 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.193 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.193 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.194 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.194 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.194 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.194 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.194 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.194 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.194 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.195 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.195 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.195 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.195 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.195 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.195 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.195 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.195 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.196 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.196 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.196 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.196 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.196 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.196 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.196 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.197 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.197 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.197 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.197 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.197 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.197 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.197 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.197 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.198 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.198 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.198 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.198 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.198 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.198 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.198 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.199 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.199 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.199 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.199 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.199 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.199 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.199 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.199 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.200 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.200 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.200 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.200 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.200 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.200 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.200 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.201 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.201 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.201 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.201 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.201 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.201 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.201 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.201 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.202 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.202 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.202 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.202 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.202 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.202 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.202 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.202 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.203 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.203 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.203 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.203 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.203 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.203 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.203 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.203 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.204 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.204 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.204 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.204 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.204 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.204 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.205 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.205 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.205 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.205 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.205 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.205 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.205 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.206 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.206 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.206 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.206 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.206 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.206 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.206 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.207 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.207 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.207 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.207 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.207 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.207 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.207 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.207 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.208 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.208 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.208 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.208 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.208 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.208 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.208 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.209 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.209 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.209 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.209 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.209 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.209 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.209 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.210 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.210 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.210 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.210 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.210 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.210 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.210 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.210 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.211 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.211 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.211 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.211 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.211 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.211 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.211 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.212 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.212 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.212 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.212 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.212 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.212 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.212 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.212 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.213 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.213 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.213 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.213 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.213 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.213 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.213 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.214 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.214 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.214 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.214 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.214 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.214 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.214 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.214 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.215 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.215 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.215 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.215 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.215 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.215 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.215 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.216 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.216 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.216 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.216 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.216 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.216 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.216 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.217 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.217 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.217 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.217 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.217 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.217 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.217 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.217 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.218 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.218 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.218 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.218 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.218 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.218 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.218 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.219 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.219 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.219 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.219 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.219 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.219 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.219 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.220 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.220 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.220 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.220 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.220 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.220 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.220 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.220 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.221 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.221 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.221 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.221 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.221 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.221 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.222 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.222 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.222 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.222 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.222 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.222 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.222 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.222 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.223 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.223 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.223 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.223 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.223 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.223 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.223 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.224 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.224 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.224 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.224 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.224 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.224 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.224 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.224 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.225 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.225 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.225 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.225 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.225 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.225 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.225 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.226 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.226 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.226 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.226 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.226 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.226 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.226 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.227 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.227 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.227 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.227 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.227 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.227 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.227 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.227 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.228 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.228 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.228 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.228 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.228 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.228 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.228 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.229 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.229 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.229 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.229 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.229 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.229 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.229 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.229 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.230 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.230 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.230 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.230 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.230 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.230 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.230 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.231 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.231 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.231 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.231 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.231 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.231 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.231 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.231 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.232 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.232 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.232 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.232 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.232 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.233 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.233 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.233 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.233 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.233 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.233 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.233 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.234 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.234 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.234 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.234 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.234 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.234 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.234 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.234 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.235 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.235 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.235 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.235 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.235 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.235 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.235 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.236 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.236 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.236 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.236 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.236 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.236 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.236 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.236 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.237 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.237 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.237 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.237 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.237 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.237 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.237 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.238 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.238 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.238 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.238 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.238 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.238 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.238 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.239 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.239 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.239 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.239 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.239 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.239 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.239 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.239 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.240 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.240 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.240 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.240 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.240 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.240 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.240 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.241 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.241 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.241 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.241 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.241 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.241 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.241 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.241 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.242 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.242 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.242 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.242 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.242 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.242 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.242 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.243 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.243 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.243 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.243 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.243 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.243 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.243 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.243 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.244 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.244 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.244 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.244 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.244 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.244 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.244 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.245 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.245 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.245 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.245 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.245 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.245 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.245 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.245 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.246 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.246 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.246 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.246 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.246 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.246 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.246 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.247 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.247 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.247 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.247 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.247 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.247 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.247 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.248 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.248 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.248 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.248 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.248 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.248 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.248 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.248 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.249 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.249 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.249 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.249 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.249 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.249 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.249 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.250 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.250 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.250 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.250 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.250 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.250 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.250 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.251 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.251 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.251 228840 WARNING oslo_config.cfg [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: and ``live_migration_inbound_addr`` respectively.
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: ).  Its value may be silently ignored in the future.
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.251 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.251 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.251 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.252 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.252 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.252 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.252 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.252 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.252 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.252 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.253 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.253 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.253 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.253 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.253 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.253 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.253 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.253 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.254 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rbd_secret_uuid        = 1939e851-b10c-5c3b-9bb7-8e7f380233e8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.254 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.254 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.254 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.254 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.254 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.254 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.255 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.255 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.255 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.255 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.255 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.255 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.256 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.256 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.256 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.256 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.256 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.256 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.256 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.256 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.257 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.257 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.257 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.257 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.257 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.257 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.257 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.258 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.258 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.258 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.258 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.258 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.258 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.258 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.259 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.259 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.259 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.259 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.259 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.259 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.259 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.260 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.260 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.260 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.260 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.260 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.260 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.260 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.260 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.261 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.261 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.261 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.261 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.261 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.261 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.261 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.262 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.262 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.262 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.262 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.262 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.262 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.262 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.263 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.263 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.263 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.263 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.263 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.263 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.263 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.264 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.264 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.264 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.264 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.264 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.264 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.264 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.264 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.265 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.265 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.265 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.265 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.265 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.265 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.265 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.266 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.266 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.266 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.266 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.266 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.266 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.266 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.266 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.267 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.267 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.267 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.267 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.267 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.267 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.267 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.268 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.268 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.268 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.268 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.268 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.268 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.268 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.268 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.269 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.269 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.269 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.269 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.269 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.269 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.269 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.270 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.270 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.270 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.270 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.270 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.270 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.271 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.271 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.271 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.271 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.271 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.271 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.271 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.271 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.272 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.272 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.272 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.272 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.272 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.272 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.272 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.273 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.273 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.273 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.273 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.273 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.273 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.273 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.274 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.274 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.274 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.274 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.274 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.274 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.274 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.275 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.275 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.275 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.275 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.275 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.275 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.275 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.275 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.276 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.276 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.276 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.276 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.276 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.276 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.277 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.277 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.277 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.277 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.277 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.277 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.277 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.277 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.278 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.278 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.278 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.278 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.278 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.278 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.279 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.279 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.279 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.279 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.279 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.279 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.280 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.280 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.280 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.280 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.280 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.280 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.280 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.280 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.281 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.281 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.281 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.281 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.281 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.281 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.281 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.282 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.282 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.282 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.282 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.282 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.282 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.283 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.283 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.283 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.283 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.283 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.283 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.283 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.283 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.284 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.284 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.284 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.284 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.284 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.284 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.284 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.284 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.285 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.285 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.285 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.285 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.285 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.285 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.285 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.286 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.286 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.286 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.286 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.286 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.286 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.287 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.287 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.287 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.287 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.287 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.287 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.287 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.288 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.288 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.288 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.288 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.288 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.288 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.288 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.288 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.289 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.289 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.289 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.289 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.289 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.289 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.289 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.290 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.290 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.290 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.290 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.290 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.290 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.290 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.291 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.291 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.291 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.291 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.291 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.291 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.291 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.291 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.292 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.292 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.292 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.292 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.292 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.292 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.292 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.293 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.293 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.293 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.293 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.293 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.293 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.293 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.294 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.294 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.294 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.294 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.294 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.294 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.294 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.295 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.295 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.295 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.295 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.295 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.295 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.295 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.295 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.296 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.296 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.296 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.296 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.296 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.296 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.296 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.297 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.297 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.297 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.297 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.297 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.297 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.297 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.297 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.298 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.298 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.298 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.298 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.299 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.299 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.299 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.299 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.299 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.299 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.299 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.300 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.300 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.300 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.300 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.300 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.300 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.300 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.300 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.301 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.301 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.301 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.301 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.301 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.301 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.301 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.301 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.302 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.302 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.302 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.302 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.302 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.302 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.302 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.303 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.303 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.303 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.303 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.303 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.303 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.303 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.303 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.304 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.304 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.304 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.304 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.304 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.304 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.304 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.305 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.305 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.305 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.305 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.305 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.305 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.305 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.305 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.306 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.306 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.306 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.306 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.306 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.306 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.306 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.307 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.307 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.307 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.307 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.307 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.307 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.307 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.308 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.308 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.308 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.308 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.308 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.308 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.308 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.308 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.309 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.309 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.309 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.309 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.309 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.309 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.309 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.309 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.310 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.310 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.310 228840 DEBUG oslo_service.service [None req-eff93721-6911-4f25-bd41-43d4caa2dc77 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.311 228840 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.389 228840 INFO nova.virt.node [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Determined node identity 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from /var/lib/nova/compute_id
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.390 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.391 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.392 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.392 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 06 09:45:42 np0005548788.localdomain systemd[1]: Started libvirt QEMU daemon.
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.476 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7ff894148b80> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.479 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7ff894148b80> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.480 228840 INFO nova.virt.libvirt.driver [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Connection event '1' reason 'None'
Dec 06 09:45:42 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:42.491 228840 DEBUG nova.virt.libvirt.volume.mount [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.372 228840 INFO nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Libvirt host capabilities <capabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <host>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <uuid>74aa0f2e-bd78-406d-a4f0-2263c03ef4c3</uuid>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <arch>x86_64</arch>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model>EPYC-Rome-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <vendor>AMD</vendor>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <microcode version='16777317'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <signature family='23' model='49' stepping='0'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='x2apic'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='tsc-deadline'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='osxsave'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='hypervisor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='tsc_adjust'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='spec-ctrl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='stibp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='arch-capabilities'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='cmp_legacy'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='topoext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='virt-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='lbrv'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='tsc-scale'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='vmcb-clean'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='pause-filter'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='pfthreshold'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='svme-addr-chk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='rdctl-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='skip-l1dfl-vmentry'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='mds-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature name='pschange-mc-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <pages unit='KiB' size='4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <pages unit='KiB' size='2048'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <pages unit='KiB' size='1048576'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <power_management>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <suspend_mem/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <suspend_disk/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <suspend_hybrid/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </power_management>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <iommu support='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <migration_features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <live/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <uri_transports>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <uri_transport>tcp</uri_transport>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <uri_transport>rdma</uri_transport>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </uri_transports>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </migration_features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <topology>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <cells num='1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <cell id='0'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:           <memory unit='KiB'>16116604</memory>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:           <pages unit='KiB' size='4'>4029151</pages>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:           <pages unit='KiB' size='2048'>0</pages>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:           <distances>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <sibling id='0' value='10'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:           </distances>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:           <cpus num='8'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:           </cpus>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         </cell>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </cells>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </topology>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <cache>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </cache>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <secmodel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model>selinux</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <doi>0</doi>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </secmodel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <secmodel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model>dac</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <doi>0</doi>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </secmodel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </host>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <guest>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <os_type>hvm</os_type>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <arch name='i686'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <wordsize>32</wordsize>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <domain type='qemu'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <domain type='kvm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </arch>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <pae/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <nonpae/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <acpi default='on' toggle='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <apic default='on' toggle='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <cpuselection/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <deviceboot/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <externalSnapshot/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </guest>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <guest>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <os_type>hvm</os_type>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <arch name='x86_64'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <wordsize>64</wordsize>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <domain type='qemu'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <domain type='kvm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </arch>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <acpi default='on' toggle='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <apic default='on' toggle='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <cpuselection/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <deviceboot/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <externalSnapshot/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </guest>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: </capabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.382 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.404 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: <domainCapabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <domain>kvm</domain>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <arch>i686</arch>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <vcpu max='240'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <iothreads supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <os supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <enum name='firmware'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <loader supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>rom</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pflash</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='readonly'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>yes</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>no</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='secure'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>no</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </loader>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </os>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>on</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>off</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='maximumMigratable'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>on</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>off</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <vendor>AMD</vendor>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='succor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='custom' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='auto-ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='auto-ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-128'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-256'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-512'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='KnightsMill'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512er'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512pf'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512er'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512pf'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tbm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tbm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SierraForest'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cmpccxadd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cmpccxadd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='athlon'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='athlon-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='core2duo'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='core2duo-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='coreduo'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='coreduo-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='n270'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='n270-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='phenom'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='phenom-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <memoryBacking supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <enum name='sourceType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>file</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>anonymous</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>memfd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </memoryBacking>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <devices>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <disk supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='diskDevice'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>disk</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>cdrom</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>floppy</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>lun</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='bus'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ide</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>fdc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>scsi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>sata</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-non-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </disk>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <graphics supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vnc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>egl-headless</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dbus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </graphics>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <video supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='modelType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vga</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>cirrus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>none</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>bochs</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ramfb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </video>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <hostdev supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='mode'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>subsystem</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='startupPolicy'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>default</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>mandatory</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>requisite</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>optional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='subsysType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pci</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>scsi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='capsType'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='pciBackend'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </hostdev>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <rng supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-non-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>random</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>egd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>builtin</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </rng>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <filesystem supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='driverType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>path</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>handle</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtiofs</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </filesystem>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <tpm supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tpm-tis</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tpm-crb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>emulator</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>external</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendVersion'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>2.0</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </tpm>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <redirdev supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='bus'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </redirdev>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <channel supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pty</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>unix</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </channel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <crypto supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>qemu</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>builtin</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </crypto>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <interface supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>default</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>passt</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </interface>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <panic supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>isa</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>hyperv</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </panic>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <console supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>null</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pty</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dev</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>file</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pipe</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>stdio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>udp</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tcp</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>unix</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>qemu-vdagent</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dbus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </console>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </devices>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <gic supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <genid supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <backup supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <async-teardown supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <ps2 supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <sev supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <sgx supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <hyperv supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='features'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>relaxed</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vapic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>spinlocks</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vpindex</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>runtime</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>synic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>stimer</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>reset</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vendor_id</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>frequencies</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>reenlightenment</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tlbflush</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ipi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>avic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>emsr_bitmap</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>xmm_input</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <defaults>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </defaults>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </hyperv>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <launchSecurity supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='sectype'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tdx</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </launchSecurity>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: </domainCapabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.413 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: <domainCapabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <domain>kvm</domain>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <arch>i686</arch>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <vcpu max='1024'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <iothreads supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <os supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <enum name='firmware'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <loader supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>rom</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pflash</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='readonly'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>yes</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>no</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='secure'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>no</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </loader>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </os>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>on</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>off</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='maximumMigratable'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>on</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>off</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <vendor>AMD</vendor>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='succor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='custom' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='auto-ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='auto-ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-128'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-256'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-512'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='KnightsMill'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512er'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512pf'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512er'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512pf'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tbm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tbm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SierraForest'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cmpccxadd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cmpccxadd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='athlon'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='athlon-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='core2duo'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='core2duo-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='coreduo'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='coreduo-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='n270'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='n270-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='phenom'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='phenom-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <memoryBacking supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <enum name='sourceType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>file</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>anonymous</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>memfd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </memoryBacking>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <devices>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <disk supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='diskDevice'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>disk</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>cdrom</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>floppy</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>lun</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='bus'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>fdc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>scsi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>sata</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-non-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </disk>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <graphics supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vnc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>egl-headless</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dbus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </graphics>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <video supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='modelType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vga</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>cirrus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>none</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>bochs</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ramfb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </video>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <hostdev supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='mode'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>subsystem</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='startupPolicy'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>default</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>mandatory</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>requisite</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>optional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='subsysType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pci</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>scsi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='capsType'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='pciBackend'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </hostdev>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <rng supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-non-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>random</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>egd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>builtin</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </rng>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <filesystem supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='driverType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>path</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>handle</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtiofs</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </filesystem>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <tpm supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tpm-tis</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tpm-crb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>emulator</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>external</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendVersion'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>2.0</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </tpm>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <redirdev supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='bus'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </redirdev>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <channel supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pty</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>unix</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </channel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <crypto supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>qemu</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>builtin</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </crypto>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <interface supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>default</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>passt</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </interface>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <panic supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>isa</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>hyperv</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </panic>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <console supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>null</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pty</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dev</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>file</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pipe</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>stdio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>udp</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tcp</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>unix</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>qemu-vdagent</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dbus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </console>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </devices>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <gic supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <genid supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <backup supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <async-teardown supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <ps2 supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <sev supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <sgx supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <hyperv supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='features'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>relaxed</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vapic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>spinlocks</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vpindex</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>runtime</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>synic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>stimer</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>reset</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vendor_id</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>frequencies</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>reenlightenment</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tlbflush</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ipi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>avic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>emsr_bitmap</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>xmm_input</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <defaults>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </defaults>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </hyperv>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <launchSecurity supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='sectype'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tdx</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </launchSecurity>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: </domainCapabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.458 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.463 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: <domainCapabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <domain>kvm</domain>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <arch>x86_64</arch>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <vcpu max='240'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <iothreads supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <os supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <enum name='firmware'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <loader supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>rom</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pflash</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='readonly'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>yes</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>no</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='secure'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>no</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </loader>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </os>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>on</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>off</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='maximumMigratable'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>on</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>off</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <vendor>AMD</vendor>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='succor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='custom' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='auto-ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='auto-ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41045 DF PROTO=TCP SPT=46586 DPT=9100 SEQ=3696810987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB6F3F0000000001030307) 
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-128'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-256'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-512'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='KnightsMill'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512er'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512pf'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512er'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512pf'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tbm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tbm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SierraForest'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cmpccxadd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cmpccxadd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='athlon'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='athlon-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='core2duo'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='core2duo-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='coreduo'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='coreduo-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='n270'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='n270-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='phenom'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='phenom-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <memoryBacking supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <enum name='sourceType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>file</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>anonymous</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>memfd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </memoryBacking>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <devices>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <disk supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='diskDevice'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>disk</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>cdrom</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>floppy</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>lun</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='bus'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ide</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>fdc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>scsi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>sata</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-non-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </disk>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <graphics supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vnc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>egl-headless</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dbus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </graphics>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <video supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='modelType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vga</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>cirrus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>none</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>bochs</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ramfb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </video>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <hostdev supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='mode'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>subsystem</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='startupPolicy'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>default</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>mandatory</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>requisite</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>optional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='subsysType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pci</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>scsi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='capsType'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='pciBackend'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </hostdev>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <rng supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-non-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>random</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>egd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>builtin</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </rng>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <filesystem supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='driverType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>path</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>handle</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtiofs</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </filesystem>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <tpm supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tpm-tis</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tpm-crb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>emulator</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>external</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendVersion'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>2.0</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </tpm>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <redirdev supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='bus'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </redirdev>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <channel supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pty</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>unix</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </channel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <crypto supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>qemu</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>builtin</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </crypto>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <interface supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>default</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>passt</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </interface>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <panic supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>isa</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>hyperv</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </panic>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <console supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>null</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pty</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dev</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>file</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pipe</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>stdio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>udp</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tcp</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>unix</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>qemu-vdagent</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dbus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </console>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </devices>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <gic supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <genid supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <backup supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <async-teardown supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <ps2 supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <sev supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <sgx supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <hyperv supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='features'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>relaxed</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vapic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>spinlocks</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vpindex</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>runtime</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>synic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>stimer</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>reset</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vendor_id</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>frequencies</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>reenlightenment</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tlbflush</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ipi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>avic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>emsr_bitmap</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>xmm_input</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <defaults>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </defaults>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </hyperv>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <launchSecurity supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='sectype'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tdx</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </launchSecurity>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: </domainCapabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.520 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: <domainCapabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <domain>kvm</domain>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <arch>x86_64</arch>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <vcpu max='1024'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <iothreads supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <os supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <enum name='firmware'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>efi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <loader supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>rom</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pflash</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='readonly'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>yes</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>no</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='secure'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>yes</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>no</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </loader>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </os>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>on</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>off</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='maximumMigratable'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>on</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>off</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <vendor>AMD</vendor>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='succor'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <mode name='custom' supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Denverton-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='auto-ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='auto-ibrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amd-psfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='stibp-always-on'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='EPYC-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-128'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-256'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx10-512'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='prefetchiti'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Haswell-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='KnightsMill'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512er'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512pf'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512er'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512pf'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tbm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fma4'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tbm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xop'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='amx-tile'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-bf16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-fp16'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bitalg'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrc'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fzrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='la57'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='taa-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xfd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SierraForest'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cmpccxadd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ifma'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cmpccxadd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fbsdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='fsrs'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ibrs-all'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mcdt-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pbrsb-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='psdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='serialize'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vaes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='hle'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='rtm'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512bw'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512cd'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512dq'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512f'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='avx512vl'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='invpcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pcid'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='pku'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='mpx'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='core-capability'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='split-lock-detect'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='cldemote'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='erms'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='gfni'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdir64b'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='movdiri'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='xsaves'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='athlon'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='athlon-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='core2duo'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='core2duo-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='coreduo'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='coreduo-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='n270'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='n270-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='ss'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='phenom'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <blockers model='phenom-v1'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnow'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <feature name='3dnowext'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </blockers>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </mode>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </cpu>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <memoryBacking supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <enum name='sourceType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>file</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>anonymous</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <value>memfd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </memoryBacking>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <devices>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <disk supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='diskDevice'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>disk</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>cdrom</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>floppy</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>lun</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='bus'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>fdc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>scsi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>sata</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-non-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </disk>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <graphics supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vnc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>egl-headless</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dbus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </graphics>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <video supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='modelType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vga</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>cirrus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>none</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>bochs</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ramfb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </video>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <hostdev supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='mode'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>subsystem</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='startupPolicy'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>default</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>mandatory</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>requisite</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>optional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='subsysType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pci</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>scsi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='capsType'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='pciBackend'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </hostdev>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <rng supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtio-non-transitional</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>random</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>egd</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>builtin</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </rng>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <filesystem supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='driverType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>path</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>handle</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>virtiofs</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </filesystem>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <tpm supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tpm-tis</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tpm-crb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>emulator</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>external</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendVersion'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>2.0</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </tpm>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <redirdev supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='bus'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>usb</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </redirdev>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <channel supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pty</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>unix</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </channel>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <crypto supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>qemu</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendModel'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>builtin</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </crypto>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <interface supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='backendType'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>default</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>passt</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </interface>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <panic supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='model'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>isa</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>hyperv</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </panic>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <console supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='type'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>null</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vc</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pty</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dev</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>file</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>pipe</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>stdio</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>udp</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tcp</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>unix</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>qemu-vdagent</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>dbus</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </console>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </devices>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   <features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <gic supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <genid supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <backup supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <async-teardown supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <ps2 supported='yes'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <sev supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <sgx supported='no'/>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <hyperv supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='features'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>relaxed</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vapic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>spinlocks</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vpindex</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>runtime</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>synic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>stimer</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>reset</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>vendor_id</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>frequencies</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>reenlightenment</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tlbflush</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>ipi</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>avic</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>emsr_bitmap</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>xmm_input</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <defaults>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </defaults>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </hyperv>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     <launchSecurity supported='yes'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       <enum name='sectype'>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:         <value>tdx</value>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:       </enum>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:     </launchSecurity>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:   </features>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: </domainCapabilities>
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.570 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.571 228840 INFO nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Secure Boot support detected
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.573 228840 INFO nova.virt.libvirt.driver [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.587 228840 DEBUG nova.virt.libvirt.driver [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.607 228840 INFO nova.virt.node [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Determined node identity 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from /var/lib/nova/compute_id
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.624 228840 DEBUG nova.compute.manager [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Verified node 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 matches my host np0005548788.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 06 09:45:43 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:43.656 228840 INFO nova.compute.manager [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.083 228840 INFO nova.service [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Updating service version for nova-compute on np0005548788.localdomain from 57 to 66
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.127 228840 DEBUG oslo_concurrency.lockutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.127 228840 DEBUG oslo_concurrency.lockutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.127 228840 DEBUG oslo_concurrency.lockutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.128 228840 DEBUG nova.compute.resource_tracker [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.129 228840 DEBUG oslo_concurrency.processutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.600 228840 DEBUG oslo_concurrency.processutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:44 np0005548788.localdomain systemd[1]: Started libvirt nodedev daemon.
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.978 228840 WARNING nova.virt.libvirt.driver [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.981 228840 DEBUG nova.compute.resource_tracker [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13634MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.981 228840 DEBUG oslo_concurrency.lockutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:44 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:44.982 228840 DEBUG oslo_concurrency.lockutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.070 228840 DEBUG nova.compute.resource_tracker [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.070 228840 DEBUG nova.compute.resource_tracker [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.136 228840 DEBUG nova.scheduler.client.report [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.160 228840 DEBUG nova.scheduler.client.report [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.160 228840 DEBUG nova.compute.provider_tree [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.210 228840 DEBUG nova.scheduler.client.report [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.232 228840 DEBUG nova.scheduler.client.report [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_BMI2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AESNI,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,HW_CPU_X86_SHA,HW_CPU_X86_SSE4A,HW_CPU_X86_AVX,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_LAN9118,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.249 228840 DEBUG oslo_concurrency.processutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:45 np0005548788.localdomain sshd[229252]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:45:45 np0005548788.localdomain python3.9[229291]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.721 228840 DEBUG oslo_concurrency.processutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.727 228840 DEBUG nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.727 228840 INFO nova.virt.libvirt.host [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] kernel doesn't support AMD SEV
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.729 228840 DEBUG nova.compute.provider_tree [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.729 228840 DEBUG nova.virt.libvirt.driver [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.756 228840 DEBUG nova.scheduler.client.report [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.812 228840 DEBUG nova.compute.provider_tree [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Updating resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 generation from 2 to 3 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.838 228840 DEBUG nova.compute.resource_tracker [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.839 228840 DEBUG oslo_concurrency.lockutils [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.839 228840 DEBUG nova.service [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.881 228840 DEBUG nova.service [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 06 09:45:45 np0005548788.localdomain nova_compute[228836]: 2025-12-06 09:45:45.881 228840 DEBUG nova.servicegroup.drivers.db [None req-c99345d3-ad59-4839-a03a-ec2628768a5c - - - - - -] DB_Driver: join new ServiceGroup member np0005548788.localdomain to the compute group, service = <Service: host=np0005548788.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 06 09:45:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:45:46 np0005548788.localdomain podman[229367]: 2025-12-06 09:45:46.267377395 +0000 UTC m=+0.090543227 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:45:46 np0005548788.localdomain podman[229367]: 2025-12-06 09:45:46.278482737 +0000 UTC m=+0.101648529 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:45:46 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:45:46 np0005548788.localdomain sudo[229439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcxfdbaueuqpxisiufwwcjqyxhfqispp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014345.9267614-4264-98046042106507/AnsiballZ_podman_container.py
Dec 06 09:45:46 np0005548788.localdomain sudo[229439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41047 DF PROTO=TCP SPT=46586 DPT=9100 SEQ=3696810987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB7B300000000001030307) 
Dec 06 09:45:46 np0005548788.localdomain python3.9[229441]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 09:45:46 np0005548788.localdomain sudo[229439]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:46 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 114.1 (380 of 333 items), suggesting rotation.
Dec 06 09:45:46 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:45:46 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:45:46 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:45:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:45:47.404 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:45:47.405 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:45:47.405 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:48 np0005548788.localdomain sudo[229574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfcmyfcwfeqsaluuqckkpksdypllrfni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014347.7179902-4288-22819538437491/AnsiballZ_systemd.py
Dec 06 09:45:48 np0005548788.localdomain sudo[229574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:48 np0005548788.localdomain python3.9[229576]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:45:48 np0005548788.localdomain systemd[1]: Stopping nova_compute container...
Dec 06 09:45:48 np0005548788.localdomain systemd[1]: libpod-b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0.scope: Deactivated successfully.
Dec 06 09:45:48 np0005548788.localdomain systemd[1]: libpod-b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0.scope: Consumed 3.648s CPU time.
Dec 06 09:45:48 np0005548788.localdomain virtqemud[229107]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 06 09:45:48 np0005548788.localdomain virtqemud[229107]: hostname: np0005548788.localdomain
Dec 06 09:45:48 np0005548788.localdomain virtqemud[229107]: End of file while reading data: Input/output error
Dec 06 09:45:48 np0005548788.localdomain podman[229580]: 2025-12-06 09:45:48.615018012 +0000 UTC m=+0.138059483 container died b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:45:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43724 DF PROTO=TCP SPT=47234 DPT=9102 SEQ=2518988586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB86450000000001030307) 
Dec 06 09:45:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4-merged.mount: Deactivated successfully.
Dec 06 09:45:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0-userdata-shm.mount: Deactivated successfully.
Dec 06 09:45:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43726 DF PROTO=TCP SPT=47234 DPT=9102 SEQ=2518988586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EB92310000000001030307) 
Dec 06 09:45:52 np0005548788.localdomain sshd[229252]: Received disconnect from 45.78.194.186 port 52672:11: Bye Bye [preauth]
Dec 06 09:45:52 np0005548788.localdomain sshd[229252]: Disconnected from 45.78.194.186 port 52672 [preauth]
Dec 06 09:45:53 np0005548788.localdomain podman[229580]: 2025-12-06 09:45:53.993352782 +0000 UTC m=+5.516394263 container cleanup b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:45:53 np0005548788.localdomain podman[229580]: nova_compute
Dec 06 09:45:54 np0005548788.localdomain podman[229878]: error opening file `/run/crun/b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0/status`: No such file or directory
Dec 06 09:45:54 np0005548788.localdomain podman[229865]: 2025-12-06 09:45:54.096443464 +0000 UTC m=+0.068668061 container cleanup b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 09:45:54 np0005548788.localdomain podman[229865]: nova_compute
Dec 06 09:45:54 np0005548788.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 06 09:45:54 np0005548788.localdomain systemd[1]: Stopped nova_compute container.
Dec 06 09:45:54 np0005548788.localdomain systemd[1]: Starting nova_compute container...
Dec 06 09:45:54 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:45:54 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:54 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:54 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:54 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:54 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:54 np0005548788.localdomain podman[229880]: 2025-12-06 09:45:54.245463805 +0000 UTC m=+0.116059945 container init b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:45:54 np0005548788.localdomain podman[229880]: 2025-12-06 09:45:54.256013171 +0000 UTC m=+0.126609291 container start b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 06 09:45:54 np0005548788.localdomain podman[229880]: nova_compute
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + sudo -E kolla_set_configs
Dec 06 09:45:54 np0005548788.localdomain systemd[1]: Started nova_compute container.
Dec 06 09:45:54 np0005548788.localdomain sudo[229574]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Validating config file
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying service configuration files
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /etc/ceph
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Creating directory /etc/ceph
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/ceph
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Writing out command to execute
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: ++ cat /run_command
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + CMD=nova-compute
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + ARGS=
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + sudo kolla_copy_cacerts
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + [[ ! -n '' ]]
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + . kolla_extend_start
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: Running command: 'nova-compute'
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + echo 'Running command: '\''nova-compute'\'''
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + umask 0022
Dec 06 09:45:54 np0005548788.localdomain nova_compute[229894]: + exec nova-compute
Dec 06 09:45:55 np0005548788.localdomain sudo[230014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtjgzkozduxtswhlmyebfhdmeycuixmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014355.2753012-4315-77596229765430/AnsiballZ_podman_container.py
Dec 06 09:45:55 np0005548788.localdomain sudo[230014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:55 np0005548788.localdomain python3.9[230016]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.014 229898 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.015 229898 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.015 229898 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.015 229898 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 06 09:45:56 np0005548788.localdomain systemd[1]: Started libpod-conmon-05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157.scope.
Dec 06 09:45:56 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.154 229898 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:56 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:56 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:56 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.170 229898 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.170 229898 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 06 09:45:56 np0005548788.localdomain podman[230044]: 2025-12-06 09:45:56.17910946 +0000 UTC m=+0.169033679 container init 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 06 09:45:56 np0005548788.localdomain podman[230044]: 2025-12-06 09:45:56.188233621 +0000 UTC m=+0.178157840 container start 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:45:56 np0005548788.localdomain python3.9[230016]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Applying nova statedir ownership
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/20273498b7380904530133bcb3f720bd45f4f00b810dc4597d81d23acd8f9673
Dec 06 09:45:56 np0005548788.localdomain nova_compute_init[230067]: INFO:nova_statedir:Nova statedir ownership complete
Dec 06 09:45:56 np0005548788.localdomain systemd[1]: libpod-05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157.scope: Deactivated successfully.
Dec 06 09:45:56 np0005548788.localdomain podman[230083]: 2025-12-06 09:45:56.328796122 +0000 UTC m=+0.056660801 container died 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 09:45:56 np0005548788.localdomain sudo[230014]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:56 np0005548788.localdomain podman[230083]: 2025-12-06 09:45:56.384532232 +0000 UTC m=+0.112396911 container cleanup 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:45:56 np0005548788.localdomain systemd[1]: libpod-conmon-05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157.scope: Deactivated successfully.
Dec 06 09:45:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43727 DF PROTO=TCP SPT=47234 DPT=9102 SEQ=2518988586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBA1F00000000001030307) 
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.602 229898 INFO nova.virt.driver [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.720 229898 INFO nova.compute.provider_config [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.730 229898 WARNING nova.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.730 229898 DEBUG oslo_concurrency.lockutils [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.730 229898 DEBUG oslo_concurrency.lockutils [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.731 229898 DEBUG oslo_concurrency.lockutils [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.731 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.731 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.731 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.731 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.732 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.732 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.732 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.732 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.732 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.732 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.733 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.733 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.733 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.733 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.733 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.734 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.734 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.734 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.734 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.734 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] console_host                   = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.734 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.735 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.735 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.735 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.735 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.735 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.736 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.736 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.736 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.736 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.736 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.737 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.737 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.737 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.737 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.737 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.737 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.738 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.738 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.738 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.738 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.738 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.739 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.739 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.739 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.739 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.739 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.740 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.740 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.740 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.740 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.740 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.740 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.741 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.741 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.741 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.741 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.741 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.742 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.742 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.742 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.742 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.742 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.742 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.743 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.743 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.743 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.743 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.743 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.743 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.744 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.744 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.744 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.744 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.744 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.745 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.745 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.745 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.745 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.745 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.745 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.746 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.746 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.746 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.746 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.746 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.746 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.747 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.747 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.747 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.747 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.747 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.748 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.748 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.748 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.748 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.748 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.748 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.749 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.749 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.749 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.749 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.749 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.750 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.750 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.750 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.750 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.750 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.750 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.751 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.751 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.751 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.751 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.751 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.752 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.752 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.752 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.752 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.752 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.752 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.753 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.753 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.753 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.753 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.753 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.754 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.754 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.754 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.754 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.754 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.754 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.755 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.755 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.755 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.755 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.755 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.755 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.756 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.756 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.756 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.756 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.756 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.757 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.757 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.757 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.757 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.757 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.757 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.758 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.758 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.758 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.758 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.758 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.759 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.759 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.759 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.759 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.759 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.760 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.760 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.760 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.760 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.760 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.760 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.761 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.761 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.761 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.761 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.761 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.762 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.762 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.762 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.762 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.762 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.762 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.763 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.763 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.763 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.763 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.763 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.764 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.764 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.764 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.764 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.764 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.765 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.765 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.765 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.765 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.765 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.765 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.766 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.766 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.766 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.766 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.766 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.767 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.767 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.767 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.767 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.767 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.767 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.768 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.768 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.768 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.768 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.768 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.769 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.769 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.769 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.769 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.769 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.769 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.770 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.770 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.770 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.770 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.770 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.771 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.771 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.771 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.771 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.771 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.771 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.772 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.772 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.772 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.772 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.772 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.773 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.773 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.773 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.773 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.773 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.773 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.774 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.774 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.774 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.774 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.774 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.775 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.775 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.775 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.775 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.775 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.775 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.776 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.776 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.776 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.776 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.776 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.777 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.777 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.777 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.777 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.777 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.777 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.778 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.778 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.778 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.778 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.778 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.779 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.779 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.779 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.779 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.779 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.779 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.780 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.780 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.780 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.780 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.780 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.781 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.781 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.781 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.781 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.781 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.782 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.782 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.782 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.782 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.782 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.783 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.783 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.783 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.783 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.783 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.783 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.784 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.784 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.784 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.784 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.784 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.785 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.785 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.785 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.785 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.785 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.785 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.786 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.786 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.786 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.786 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.786 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.787 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.787 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.787 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.787 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.787 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.787 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.788 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.788 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.788 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.788 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.788 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.789 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.789 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.789 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.789 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.789 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.789 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.790 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.790 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.790 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.790 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.790 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.791 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.791 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.791 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.791 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.791 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.791 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.792 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.792 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.792 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.792 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.792 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.793 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.793 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.793 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.793 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.793 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.793 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.794 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.794 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.794 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.794 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.794 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.795 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.795 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.795 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.795 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.795 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.795 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.796 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.796 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.796 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.796 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.797 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.797 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.797 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.797 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.797 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.797 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.798 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.798 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.798 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.798 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.798 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.798 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.799 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.799 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.799 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.799 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.799 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.800 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.800 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.800 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.800 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.800 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.800 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.801 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.801 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.801 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.801 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.801 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.802 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.802 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.802 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.802 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.802 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.802 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.803 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.803 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.803 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.803 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.803 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.804 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.804 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.804 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.804 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.804 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.804 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.805 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.805 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.805 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.805 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.805 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.805 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.806 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.806 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.806 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.806 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.806 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.807 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.807 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.807 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.807 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.807 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.807 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.808 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.808 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.808 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.808 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.808 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.809 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.809 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.809 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.809 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.809 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.809 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.810 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.810 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.810 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.810 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.810 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.810 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.811 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.811 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.811 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.811 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.811 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.812 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.812 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.812 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.812 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.812 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.812 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.813 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.813 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.813 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.813 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.813 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.814 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.814 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.814 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.814 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.814 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.815 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.815 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.815 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.815 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.815 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.815 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.816 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.816 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.816 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.816 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.816 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.817 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.817 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.817 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.817 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.817 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.818 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.818 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.818 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.818 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.818 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.818 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.819 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.819 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.819 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.819 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.819 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.820 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.820 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.820 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.820 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.820 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.821 229898 WARNING oslo_config.cfg [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: and ``live_migration_inbound_addr`` respectively.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: ).  Its value may be silently ignored in the future.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.821 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.821 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.821 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.821 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.821 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.822 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.822 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.822 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.822 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.822 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.823 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.823 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.823 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.823 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.823 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.824 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.824 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.824 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.824 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rbd_secret_uuid        = 1939e851-b10c-5c3b-9bb7-8e7f380233e8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.824 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.825 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.825 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.825 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.825 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.825 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.825 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.826 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.826 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.826 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.826 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.826 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.827 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.827 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.827 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.827 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.828 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.828 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.828 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.828 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.828 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.828 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.829 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.829 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.829 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.829 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.829 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.830 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.830 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.830 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.830 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.830 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.831 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.831 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.831 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.831 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.831 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.831 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.832 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.832 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.832 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.832 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.832 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.833 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.833 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.833 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.833 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.833 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.833 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.834 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.834 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.834 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.834 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.834 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.835 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.835 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.835 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.835 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.835 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.835 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.836 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.836 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.836 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.836 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.836 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.837 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.837 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.837 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.837 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.837 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.838 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.838 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.838 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.838 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.838 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.838 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.839 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.839 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.839 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.839 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.839 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.839 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.840 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.840 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.840 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.840 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.840 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.841 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.841 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.841 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.841 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.841 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.841 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.842 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.842 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.842 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.842 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.842 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.843 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.843 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.843 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.843 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.843 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.843 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.844 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.844 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.844 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.844 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.844 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.845 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.845 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.845 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.845 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.845 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.846 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.846 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.846 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.846 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.846 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.846 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.847 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.847 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.847 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.847 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.847 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.848 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.848 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.848 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.848 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.848 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.849 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.849 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.849 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.849 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.849 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.849 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.849 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.850 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.850 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.850 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.850 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.850 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.850 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.850 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.850 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.851 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.851 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.851 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.851 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.851 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.851 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.851 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.852 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.852 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.852 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.852 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.852 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.852 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.852 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.853 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.853 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.853 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.853 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.853 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.853 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.854 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.854 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.854 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.854 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.854 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.854 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.854 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.854 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.855 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.855 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.855 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.855 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.855 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.855 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.856 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.856 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.856 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.856 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.856 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.856 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.856 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.857 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.857 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.857 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.857 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.857 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.857 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.857 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.858 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.858 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.858 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.858 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.858 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.858 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.858 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.858 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.859 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.859 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.859 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.859 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.859 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.859 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.859 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.860 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.860 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.860 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.860 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.860 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.860 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.860 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.861 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.861 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.861 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.861 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.861 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.861 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.861 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.862 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.862 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.862 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.862 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.862 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.862 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.862 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.863 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.863 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.863 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.863 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.863 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.863 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.864 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.864 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.864 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.864 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.864 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.864 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.864 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.865 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.865 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.865 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.865 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.865 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.865 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.865 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.865 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.866 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.866 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.866 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.866 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.866 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.866 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.866 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.867 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.867 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.867 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.867 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.867 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.867 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.867 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.868 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.868 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.868 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.868 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.868 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.868 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.868 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.869 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.869 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.869 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.869 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.869 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.869 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.869 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.870 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.870 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.870 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.870 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.870 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.870 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.870 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.871 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.871 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.871 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.871 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.871 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.871 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.871 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.872 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.872 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.872 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.872 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.872 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.872 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.872 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.873 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.873 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.873 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.873 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.873 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.873 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.873 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.874 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.874 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.874 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.874 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.874 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.874 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.874 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.875 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.875 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.875 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.875 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.875 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.875 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.875 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.876 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.876 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.876 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.876 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.876 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.876 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.876 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.877 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.877 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.877 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.877 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.877 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.877 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.877 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.878 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.878 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.878 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.878 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.878 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.878 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.878 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.878 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.879 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.879 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.879 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.879 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.879 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.879 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.879 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.880 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.880 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.880 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.880 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.880 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.880 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.880 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.881 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.881 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.881 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.881 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.881 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.881 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.881 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.882 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.882 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.882 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.882 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.882 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.882 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.882 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.883 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.883 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.883 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.883 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.883 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.883 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.884 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.884 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.884 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.884 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.884 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.884 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.884 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.885 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.885 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.885 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.885 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.885 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.885 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.885 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.886 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.886 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.886 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.886 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.886 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.886 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.886 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.886 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.887 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.887 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.887 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.887 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.887 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.887 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.887 229898 DEBUG oslo_service.service [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.889 229898 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.908 229898 INFO nova.virt.node [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Determined node identity 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from /var/lib/nova/compute_id
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.909 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.909 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.910 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.910 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.920 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fede4ecd460> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.923 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fede4ecd460> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.923 229898 INFO nova.virt.libvirt.driver [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Connection event '1' reason 'None'
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.927 229898 INFO nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Libvirt host capabilities <capabilities>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <host>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <uuid>74aa0f2e-bd78-406d-a4f0-2263c03ef4c3</uuid>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <cpu>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <arch>x86_64</arch>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model>EPYC-Rome-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <vendor>AMD</vendor>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <microcode version='16777317'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <signature family='23' model='49' stepping='0'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='x2apic'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='tsc-deadline'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='osxsave'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='hypervisor'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='tsc_adjust'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='spec-ctrl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='stibp'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='arch-capabilities'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='ssbd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='cmp_legacy'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='topoext'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='virt-ssbd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='lbrv'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='tsc-scale'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='vmcb-clean'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='pause-filter'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='pfthreshold'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='svme-addr-chk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='rdctl-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='skip-l1dfl-vmentry'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='mds-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature name='pschange-mc-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <pages unit='KiB' size='4'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <pages unit='KiB' size='2048'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <pages unit='KiB' size='1048576'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </cpu>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <power_management>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <suspend_mem/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <suspend_disk/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <suspend_hybrid/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </power_management>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <iommu support='no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <migration_features>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <live/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <uri_transports>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <uri_transport>tcp</uri_transport>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <uri_transport>rdma</uri_transport>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </uri_transports>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </migration_features>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <topology>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <cells num='1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <cell id='0'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:           <memory unit='KiB'>16116604</memory>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:           <pages unit='KiB' size='4'>4029151</pages>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:           <pages unit='KiB' size='2048'>0</pages>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:           <distances>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <sibling id='0' value='10'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:           </distances>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:           <cpus num='8'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:           </cpus>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         </cell>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </cells>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </topology>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <cache>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </cache>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <secmodel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model>selinux</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <doi>0</doi>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </secmodel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <secmodel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model>dac</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <doi>0</doi>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </secmodel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </host>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <guest>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <os_type>hvm</os_type>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <arch name='i686'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <wordsize>32</wordsize>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <domain type='qemu'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <domain type='kvm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </arch>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <features>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <pae/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <nonpae/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <acpi default='on' toggle='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <apic default='on' toggle='no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <cpuselection/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <deviceboot/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <externalSnapshot/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </features>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </guest>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <guest>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <os_type>hvm</os_type>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <arch name='x86_64'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <wordsize>64</wordsize>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <domain type='qemu'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <domain type='kvm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </arch>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <features>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <acpi default='on' toggle='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <apic default='on' toggle='no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <cpuselection/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <deviceboot/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <externalSnapshot/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </features>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </guest>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: </capabilities>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.933 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.936 229898 DEBUG nova.virt.libvirt.volume.mount [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.938 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: <domainCapabilities>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <domain>kvm</domain>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <arch>i686</arch>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <vcpu max='240'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <iothreads supported='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <os supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <enum name='firmware'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <loader supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:56 np0005548788.localdomain sshd[208012]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>rom</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>pflash</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='readonly'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>yes</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>no</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='secure'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>no</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </loader>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </os>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <cpu>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>on</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>off</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='maximumMigratable'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>on</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>off</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <vendor>AMD</vendor>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='succor'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <mode name='custom' supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='auto-ibrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='auto-ibrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-128'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-256'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-512'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:56 np0005548788.localdomain systemd-logind[765]: Session 54 logged out. Waiting for processes to exit.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain systemd[1]: session-54.scope: Consumed 2min 22.460s CPU time.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain systemd-logind[765]: Removed session 54.
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='KnightsMill'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512er'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512pf'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512er'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512pf'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G5'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tbm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tbm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='SierraForest'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cmpccxadd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cmpccxadd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='athlon'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='athlon-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='core2duo'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='core2duo-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='coreduo'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='coreduo-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='n270'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='n270-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='phenom'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='phenom-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </cpu>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <memoryBacking supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <enum name='sourceType'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <value>file</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <value>anonymous</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <value>memfd</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </memoryBacking>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <devices>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <disk supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='diskDevice'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>disk</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>cdrom</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>floppy</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>lun</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='bus'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>ide</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>fdc</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>scsi</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>sata</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtio-transitional</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtio-non-transitional</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </disk>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <graphics supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>vnc</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>egl-headless</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>dbus</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </graphics>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <video supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='modelType'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>vga</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>cirrus</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>none</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>bochs</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>ramfb</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </video>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <hostdev supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='mode'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>subsystem</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='startupPolicy'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>default</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>mandatory</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>requisite</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>optional</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='subsysType'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>pci</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>scsi</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='capsType'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='pciBackend'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </hostdev>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <rng supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtio-transitional</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtio-non-transitional</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>random</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>egd</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>builtin</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </rng>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <filesystem supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='driverType'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>path</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>handle</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>virtiofs</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </filesystem>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <tpm supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>tpm-tis</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>tpm-crb</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>emulator</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>external</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='backendVersion'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>2.0</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </tpm>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <redirdev supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='bus'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </redirdev>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <channel supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>pty</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>unix</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </channel>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <crypto supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='model'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>qemu</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>builtin</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </crypto>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <interface supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='backendType'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>default</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>passt</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </interface>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <panic supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>isa</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>hyperv</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </panic>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <console supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>null</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>vc</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>pty</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>dev</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>file</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>pipe</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>stdio</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>udp</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>tcp</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>unix</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>qemu-vdagent</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>dbus</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </console>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </devices>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <features>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <gic supported='no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <genid supported='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <backup supported='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <async-teardown supported='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <ps2 supported='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <sev supported='no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <sgx supported='no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <hyperv supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='features'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>relaxed</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>vapic</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>spinlocks</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>vpindex</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>runtime</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>synic</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>stimer</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>reset</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>vendor_id</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>frequencies</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>reenlightenment</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>tlbflush</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>ipi</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>avic</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>emsr_bitmap</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>xmm_input</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <defaults>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </defaults>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </hyperv>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <launchSecurity supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='sectype'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>tdx</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </launchSecurity>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </features>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: </domainCapabilities>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.946 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]: <domainCapabilities>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <domain>kvm</domain>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <arch>i686</arch>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <vcpu max='1024'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <iothreads supported='yes'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <os supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <enum name='firmware'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <loader supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>rom</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>pflash</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='readonly'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>yes</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>no</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='secure'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>no</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </loader>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   </os>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:   <cpu>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>on</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>off</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <enum name='maximumMigratable'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>on</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <value>off</value>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <vendor>AMD</vendor>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='succor'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:     <mode name='custom' supported='yes'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v1'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v3'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='auto-ibrs'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:56 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='auto-ibrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-128'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-256'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-512'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='KnightsMill'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512er'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512pf'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512er'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512pf'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tbm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tbm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SierraForest'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cmpccxadd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cmpccxadd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='athlon'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='athlon-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='core2duo'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='core2duo-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='coreduo'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='coreduo-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='n270'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='n270-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='phenom'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='phenom-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </cpu>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <memoryBacking supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <enum name='sourceType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>file</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>anonymous</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>memfd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </memoryBacking>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <devices>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <disk supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='diskDevice'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>disk</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>cdrom</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>floppy</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>lun</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='bus'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>fdc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>scsi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>sata</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-non-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </disk>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <graphics supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vnc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>egl-headless</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dbus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </graphics>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <video supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='modelType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vga</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>cirrus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>none</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>bochs</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>ramfb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </video>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <hostdev supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='mode'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>subsystem</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='startupPolicy'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>default</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>mandatory</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>requisite</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>optional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='subsysType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pci</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>scsi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='capsType'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='pciBackend'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </hostdev>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <rng supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-non-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>random</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>egd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>builtin</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </rng>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <filesystem supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='driverType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>path</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>handle</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtiofs</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </filesystem>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <tpm supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tpm-tis</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tpm-crb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>emulator</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>external</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendVersion'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>2.0</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </tpm>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <redirdev supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='bus'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </redirdev>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <channel supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pty</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>unix</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </channel>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <crypto supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>qemu</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>builtin</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </crypto>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <interface supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>default</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>passt</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </interface>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <panic supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>isa</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>hyperv</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </panic>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <console supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>null</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pty</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dev</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>file</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pipe</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>stdio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>udp</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tcp</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>unix</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>qemu-vdagent</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dbus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </console>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </devices>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <features>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <gic supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <genid supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <backup supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <async-teardown supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <ps2 supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <sev supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <sgx supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <hyperv supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='features'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>relaxed</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vapic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>spinlocks</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vpindex</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>runtime</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>synic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>stimer</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>reset</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vendor_id</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>frequencies</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>reenlightenment</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tlbflush</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>ipi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>avic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>emsr_bitmap</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>xmm_input</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <defaults>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </defaults>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </hyperv>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <launchSecurity supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='sectype'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tdx</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </launchSecurity>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </features>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: </domainCapabilities>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.982 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:56.993 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: <domainCapabilities>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <domain>kvm</domain>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <arch>x86_64</arch>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <vcpu max='240'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <iothreads supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <os supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <enum name='firmware'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <loader supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>rom</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pflash</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='readonly'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>yes</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>no</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='secure'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>no</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </loader>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </os>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <cpu>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>on</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>off</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='maximumMigratable'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>on</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>off</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <vendor>AMD</vendor>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='succor'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <mode name='custom' supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='auto-ibrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='auto-ibrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-128'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-256'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-512'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='KnightsMill'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512er'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512pf'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512er'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512pf'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tbm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tbm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SierraForest'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cmpccxadd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cmpccxadd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='athlon'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='athlon-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='core2duo'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='core2duo-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='coreduo'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='coreduo-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='n270'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='n270-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='phenom'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='phenom-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </cpu>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <memoryBacking supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <enum name='sourceType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>file</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>anonymous</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>memfd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </memoryBacking>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <devices>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <disk supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='diskDevice'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>disk</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>cdrom</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>floppy</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>lun</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='bus'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>ide</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>fdc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>scsi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>sata</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-non-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </disk>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <graphics supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vnc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>egl-headless</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dbus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </graphics>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <video supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='modelType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vga</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>cirrus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>none</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>bochs</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>ramfb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </video>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <hostdev supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='mode'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>subsystem</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='startupPolicy'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>default</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>mandatory</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>requisite</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>optional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='subsysType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pci</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>scsi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='capsType'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='pciBackend'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </hostdev>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <rng supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-non-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>random</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>egd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>builtin</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </rng>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <filesystem supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='driverType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>path</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>handle</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtiofs</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </filesystem>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <tpm supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tpm-tis</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tpm-crb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>emulator</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>external</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendVersion'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>2.0</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </tpm>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <redirdev supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='bus'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </redirdev>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <channel supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pty</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>unix</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </channel>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <crypto supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>qemu</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>builtin</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </crypto>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <interface supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>default</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>passt</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </interface>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <panic supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>isa</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>hyperv</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </panic>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <console supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>null</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pty</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dev</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>file</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pipe</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>stdio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>udp</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tcp</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>unix</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>qemu-vdagent</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dbus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </console>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </devices>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <features>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <gic supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <genid supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <backup supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <async-teardown supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <ps2 supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <sev supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <sgx supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <hyperv supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='features'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>relaxed</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vapic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>spinlocks</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vpindex</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>runtime</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>synic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>stimer</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>reset</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vendor_id</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>frequencies</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>reenlightenment</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tlbflush</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>ipi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>avic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>emsr_bitmap</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>xmm_input</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <defaults>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </defaults>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </hyperv>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <launchSecurity supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='sectype'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tdx</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </launchSecurity>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </features>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: </domainCapabilities>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.040 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: <domainCapabilities>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <domain>kvm</domain>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <arch>x86_64</arch>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <vcpu max='1024'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <iothreads supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <os supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <enum name='firmware'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>efi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <loader supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>rom</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pflash</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='readonly'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>yes</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>no</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='secure'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>yes</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>no</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </loader>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </os>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <cpu>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>on</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>off</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='maximumMigratable'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>on</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>off</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <vendor>AMD</vendor>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='succor'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <mode name='custom' supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe-merged.mount: Deactivated successfully.
Dec 06 09:45:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157-userdata-shm.mount: Deactivated successfully.
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Denverton-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='auto-ibrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='auto-ibrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amd-psfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='stibp-always-on'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='EPYC-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-128'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-256'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx10-512'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='prefetchiti'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Haswell-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='KnightsMill'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512er'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512pf'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512er'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512pf'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tbm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fma4'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tbm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xop'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='amx-tile'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-bf16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-fp16'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bitalg'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrc'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fzrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='la57'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='taa-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xfd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SierraForest'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cmpccxadd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ifma'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cmpccxadd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fbsdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='fsrs'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ibrs-all'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mcdt-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pbrsb-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='psdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='serialize'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vaes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='hle'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='rtm'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512bw'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512cd'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512dq'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512f'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='avx512vl'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='invpcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pcid'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='pku'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='mpx'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='core-capability'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='split-lock-detect'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='cldemote'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='erms'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='gfni'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdir64b'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='movdiri'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='xsaves'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='athlon'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='athlon-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='core2duo'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='core2duo-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='coreduo'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='coreduo-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='n270'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='n270-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='ss'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='phenom'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <blockers model='phenom-v1'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnow'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <feature name='3dnowext'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </blockers>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </mode>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </cpu>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <memoryBacking supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <enum name='sourceType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>file</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>anonymous</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <value>memfd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </memoryBacking>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <devices>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <disk supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='diskDevice'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>disk</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>cdrom</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>floppy</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>lun</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='bus'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>fdc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>scsi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>sata</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-non-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </disk>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <graphics supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vnc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>egl-headless</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dbus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </graphics>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <video supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='modelType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vga</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>cirrus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>none</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>bochs</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>ramfb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </video>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <hostdev supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='mode'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>subsystem</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='startupPolicy'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>default</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>mandatory</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>requisite</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>optional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='subsysType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pci</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>scsi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='capsType'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='pciBackend'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </hostdev>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <rng supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtio-non-transitional</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>random</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>egd</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>builtin</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </rng>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <filesystem supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='driverType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>path</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>handle</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>virtiofs</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </filesystem>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <tpm supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tpm-tis</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tpm-crb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>emulator</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>external</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendVersion'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>2.0</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </tpm>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <redirdev supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='bus'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>usb</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </redirdev>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <channel supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pty</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>unix</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </channel>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <crypto supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>qemu</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendModel'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>builtin</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </crypto>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <interface supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='backendType'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>default</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>passt</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </interface>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <panic supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='model'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>isa</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>hyperv</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </panic>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <console supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='type'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>null</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vc</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pty</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dev</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>file</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>pipe</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>stdio</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>udp</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tcp</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>unix</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>qemu-vdagent</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>dbus</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </console>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </devices>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   <features>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <gic supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <genid supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <backup supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <async-teardown supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <ps2 supported='yes'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <sev supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <sgx supported='no'/>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <hyperv supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='features'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>relaxed</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vapic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>spinlocks</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vpindex</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>runtime</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>synic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>stimer</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>reset</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>vendor_id</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>frequencies</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>reenlightenment</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tlbflush</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>ipi</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>avic</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>emsr_bitmap</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>xmm_input</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <defaults>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </defaults>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </hyperv>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     <launchSecurity supported='yes'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       <enum name='sectype'>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:         <value>tdx</value>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:       </enum>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:     </launchSecurity>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:   </features>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: </domainCapabilities>
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.111 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.111 229898 INFO nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Secure Boot support detected
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.113 229898 INFO nova.virt.libvirt.driver [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.120 229898 DEBUG nova.virt.libvirt.driver [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.146 229898 INFO nova.virt.node [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Determined node identity 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from /var/lib/nova/compute_id
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.168 229898 DEBUG nova.compute.manager [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Verified node 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 matches my host np0005548788.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.199 229898 INFO nova.compute.manager [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.305 229898 DEBUG oslo_concurrency.lockutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.305 229898 DEBUG oslo_concurrency.lockutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.305 229898 DEBUG oslo_concurrency.lockutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.306 229898 DEBUG nova.compute.resource_tracker [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.306 229898 DEBUG oslo_concurrency.processutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.761 229898 DEBUG oslo_concurrency.processutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.905 229898 WARNING nova.virt.libvirt.driver [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.906 229898 DEBUG nova.compute.resource_tracker [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13625MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.907 229898 DEBUG oslo_concurrency.lockutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:57.907 229898 DEBUG oslo_concurrency.lockutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.052 229898 DEBUG nova.compute.resource_tracker [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.052 229898 DEBUG nova.compute.resource_tracker [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.085 229898 DEBUG nova.scheduler.client.report [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.164 229898 DEBUG nova.scheduler.client.report [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.165 229898 DEBUG nova.compute.provider_tree [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.178 229898 DEBUG nova.scheduler.client.report [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.204 229898 DEBUG nova.scheduler.client.report [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,HW_CPU_X86_SHA,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NODE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.221 229898 DEBUG oslo_concurrency.processutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.734 229898 DEBUG oslo_concurrency.processutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.741 229898 DEBUG nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.741 229898 INFO nova.virt.libvirt.host [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] kernel doesn't support AMD SEV
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.743 229898 DEBUG nova.compute.provider_tree [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.743 229898 DEBUG nova.virt.libvirt.driver [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.770 229898 DEBUG nova.scheduler.client.report [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.793 229898 DEBUG nova.compute.resource_tracker [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.794 229898 DEBUG oslo_concurrency.lockutils [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.794 229898 DEBUG nova.service [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.824 229898 DEBUG nova.service [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 06 09:45:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:45:58.825 229898 DEBUG nova.servicegroup.drivers.db [None req-3a714e08-848e-4706-aa5d-f4462045ca2a - - - - - -] DB_Driver: join new ServiceGroup member np0005548788.localdomain to the compute group, service = <Service: host=np0005548788.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 06 09:45:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41049 DF PROTO=TCP SPT=46586 DPT=9100 SEQ=3696810987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBABF00000000001030307) 
Dec 06 09:45:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:45:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:45:59 np0005548788.localdomain podman[230190]: 2025-12-06 09:45:59.276322758 +0000 UTC m=+0.093783257 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Dec 06 09:45:59 np0005548788.localdomain podman[230191]: 2025-12-06 09:45:59.38588267 +0000 UTC m=+0.203470643 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:45:59 np0005548788.localdomain podman[230190]: 2025-12-06 09:45:59.392636518 +0000 UTC m=+0.210096997 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 09:45:59 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:45:59 np0005548788.localdomain podman[230191]: 2025-12-06 09:45:59.454619441 +0000 UTC m=+0.272207414 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 09:45:59 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:46:02 np0005548788.localdomain sshd[230233]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:46:03 np0005548788.localdomain sshd[230233]: Accepted publickey for zuul from 192.168.122.30 port 56244 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:46:03 np0005548788.localdomain systemd-logind[765]: New session 56 of user zuul.
Dec 06 09:46:03 np0005548788.localdomain systemd[1]: Started Session 56 of User zuul.
Dec 06 09:46:03 np0005548788.localdomain sshd[230233]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:46:04 np0005548788.localdomain python3.9[230344]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:46:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43728 DF PROTO=TCP SPT=47234 DPT=9102 SEQ=2518988586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBC1F10000000001030307) 
Dec 06 09:46:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27818 DF PROTO=TCP SPT=51302 DPT=9101 SEQ=1629585311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBC24E0000000001030307) 
Dec 06 09:46:05 np0005548788.localdomain sudo[230420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:46:05 np0005548788.localdomain sudo[230420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:46:05 np0005548788.localdomain sudo[230420]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:05 np0005548788.localdomain sudo[230455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:46:05 np0005548788.localdomain sudo[230455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:46:05 np0005548788.localdomain sudo[230490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-islrxtgdniukktdtahrhmlwrzjlaoihb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014364.94631-68-171827271092980/AnsiballZ_systemd_service.py
Dec 06 09:46:05 np0005548788.localdomain sudo[230490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:05 np0005548788.localdomain python3.9[230494]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:46:05 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:46:05 np0005548788.localdomain systemd-rc-local-generator[230532]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:46:05 np0005548788.localdomain systemd-sysv-generator[230535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548788.localdomain sudo[230490]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:06 np0005548788.localdomain sudo[230455]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:06 np0005548788.localdomain sudo[230617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:46:06 np0005548788.localdomain sudo[230617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:46:06 np0005548788.localdomain sudo[230617]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:07 np0005548788.localdomain python3.9[230687]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:46:07 np0005548788.localdomain network[230704]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:46:07 np0005548788.localdomain network[230705]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:46:07 np0005548788.localdomain network[230706]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:46:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27820 DF PROTO=TCP SPT=51302 DPT=9101 SEQ=1629585311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBCE710000000001030307) 
Dec 06 09:46:10 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:46:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19687 DF PROTO=TCP SPT=47460 DPT=9105 SEQ=185405877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBD9F00000000001030307) 
Dec 06 09:46:13 np0005548788.localdomain sudo[230939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsgwcaoueyamsdcphzwzzdsrqbajhvrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014372.7439508-125-106056088932468/AnsiballZ_systemd_service.py
Dec 06 09:46:13 np0005548788.localdomain sudo[230939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:13 np0005548788.localdomain python3.9[230941]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:46:13 np0005548788.localdomain sudo[230939]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8507 DF PROTO=TCP SPT=33398 DPT=9100 SEQ=2308320423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBE46F0000000001030307) 
Dec 06 09:46:14 np0005548788.localdomain sudo[231050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzjwglevpscwonvcluosjllwtohvhgdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014373.9269266-155-155032970197736/AnsiballZ_file.py
Dec 06 09:46:14 np0005548788.localdomain sudo[231050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:14 np0005548788.localdomain python3.9[231052]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:14 np0005548788.localdomain sudo[231050]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:14 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 76.6 (255 of 333 items), suggesting rotation.
Dec 06 09:46:14 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:46:14 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:46:14 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:46:15 np0005548788.localdomain sudo[231161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lczlkkpknbwdefwsvecuxfthzzdydtem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014374.786847-179-238844511149265/AnsiballZ_file.py
Dec 06 09:46:15 np0005548788.localdomain sudo[231161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:15 np0005548788.localdomain python3.9[231163]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:15 np0005548788.localdomain sudo[231161]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:16 np0005548788.localdomain sudo[231271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlvzunxvxsmdqkugfjupekbysudewgsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014375.6600442-206-27930367658679/AnsiballZ_command.py
Dec 06 09:46:16 np0005548788.localdomain sudo[231271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:16 np0005548788.localdomain python3.9[231273]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:46:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:46:16 np0005548788.localdomain sudo[231271]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:16 np0005548788.localdomain podman[231276]: 2025-12-06 09:46:16.466268655 +0000 UTC m=+0.101366858 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 09:46:16 np0005548788.localdomain podman[231276]: 2025-12-06 09:46:16.511715831 +0000 UTC m=+0.146813994 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125)
Dec 06 09:46:16 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:46:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8509 DF PROTO=TCP SPT=33398 DPT=9100 SEQ=2308320423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBF0700000000001030307) 
Dec 06 09:46:17 np0005548788.localdomain python3.9[231404]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:46:17 np0005548788.localdomain sudo[231512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eozhkizideilfcroadagkxqtigmmbmse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014377.515067-260-18716940560190/AnsiballZ_systemd_service.py
Dec 06 09:46:17 np0005548788.localdomain sudo[231512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:18 np0005548788.localdomain python3.9[231514]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:46:18 np0005548788.localdomain systemd-rc-local-generator[231539]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:46:18 np0005548788.localdomain systemd-sysv-generator[231544]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548788.localdomain sudo[231512]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48898 DF PROTO=TCP SPT=41754 DPT=9102 SEQ=487762163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EBFB750000000001030307) 
Dec 06 09:46:19 np0005548788.localdomain sudo[231658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cplhfsmmyctzmoxidckscgsuhhnmiqkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014379.2538667-284-4639114178001/AnsiballZ_command.py
Dec 06 09:46:19 np0005548788.localdomain sudo[231658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:19 np0005548788.localdomain python3.9[231660]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:46:19 np0005548788.localdomain sudo[231658]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:20 np0005548788.localdomain sudo[231769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pevgtpqynemnnfcvizoijwxnqfvtcicb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014380.0846946-311-147767764777763/AnsiballZ_file.py
Dec 06 09:46:20 np0005548788.localdomain sudo[231769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:20 np0005548788.localdomain python3.9[231771]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:20 np0005548788.localdomain sudo[231769]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:22 np0005548788.localdomain python3.9[231879]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:46:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48900 DF PROTO=TCP SPT=41754 DPT=9102 SEQ=487762163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC07700000000001030307) 
Dec 06 09:46:22 np0005548788.localdomain python3.9[231989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:23 np0005548788.localdomain python3.9[232075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014382.2776313-359-22348761338802/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a5de62331599286ef72129f42fa9f9c2e1e20f46 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:24 np0005548788.localdomain sudo[232183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-saulaoubrrwkhmugttekjgsbqntqaejd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014383.5884008-404-99300227172823/AnsiballZ_group.py
Dec 06 09:46:24 np0005548788.localdomain sudo[232183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:24 np0005548788.localdomain python3.9[232185]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 06 09:46:24 np0005548788.localdomain sudo[232183]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:25 np0005548788.localdomain sudo[232293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrptfbvyxbajxmtvfdhiolfneyeqhuzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014384.962635-437-238367481255005/AnsiballZ_getent.py
Dec 06 09:46:25 np0005548788.localdomain sudo[232293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:25 np0005548788.localdomain python3.9[232295]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 06 09:46:25 np0005548788.localdomain sudo[232293]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:26 np0005548788.localdomain sudo[232404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtxsvjciwxkknhiozqxodvomzzuyfrmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014385.7966144-461-155217911122933/AnsiballZ_group.py
Dec 06 09:46:26 np0005548788.localdomain sudo[232404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:26 np0005548788.localdomain python3.9[232406]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:46:26 np0005548788.localdomain groupadd[232407]: group added to /etc/group: name=ceilometer, GID=42405
Dec 06 09:46:26 np0005548788.localdomain groupadd[232407]: group added to /etc/gshadow: name=ceilometer
Dec 06 09:46:26 np0005548788.localdomain groupadd[232407]: new group: name=ceilometer, GID=42405
Dec 06 09:46:26 np0005548788.localdomain sudo[232404]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48901 DF PROTO=TCP SPT=41754 DPT=9102 SEQ=487762163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC17300000000001030307) 
Dec 06 09:46:27 np0005548788.localdomain sudo[232520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqrazywhqugtuvdozdstmowcmdriqtzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014386.663326-485-254993830992249/AnsiballZ_user.py
Dec 06 09:46:27 np0005548788.localdomain sudo[232520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:27 np0005548788.localdomain python3.9[232522]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548788.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:46:27 np0005548788.localdomain useradd[232524]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Dec 06 09:46:27 np0005548788.localdomain useradd[232524]: add 'ceilometer' to group 'libvirt'
Dec 06 09:46:27 np0005548788.localdomain useradd[232524]: add 'ceilometer' to shadow group 'libvirt'
Dec 06 09:46:27 np0005548788.localdomain sudo[232520]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8511 DF PROTO=TCP SPT=33398 DPT=9100 SEQ=2308320423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC1FF10000000001030307) 
Dec 06 09:46:28 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:28.827 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:28 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:28.855 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:28 np0005548788.localdomain python3.9[232638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:29 np0005548788.localdomain python3.9[232724]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765014388.439805-563-13137908103703/.source.conf _original_basename=ceilometer.conf follow=False checksum=e90760659247c177dccfbe1ef7de974794985ce9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:29 np0005548788.localdomain python3.9[232832]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:46:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:46:30 np0005548788.localdomain systemd[1]: tmp-crun.bVz7BG.mount: Deactivated successfully.
Dec 06 09:46:30 np0005548788.localdomain podman[232849]: 2025-12-06 09:46:30.27634795 +0000 UTC m=+0.100885023 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 09:46:30 np0005548788.localdomain podman[232849]: 2025-12-06 09:46:30.315435955 +0000 UTC m=+0.139972988 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:46:30 np0005548788.localdomain podman[232850]: 2025-12-06 09:46:30.3265605 +0000 UTC m=+0.149265959 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 09:46:30 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:46:30 np0005548788.localdomain podman[232850]: 2025-12-06 09:46:30.369675095 +0000 UTC m=+0.192380594 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:46:30 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:46:31 np0005548788.localdomain python3.9[232961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765014389.573656-563-197004408378944/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:31 np0005548788.localdomain python3.9[233069]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:32 np0005548788.localdomain python3.9[233155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765014391.4511633-563-278795383876195/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:33 np0005548788.localdomain python3.9[233263]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:46:34 np0005548788.localdomain python3.9[233371]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:46:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37521 DF PROTO=TCP SPT=38830 DPT=9101 SEQ=2215425954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC377E0000000001030307) 
Dec 06 09:46:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48902 DF PROTO=TCP SPT=41754 DPT=9102 SEQ=487762163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC37F00000000001030307) 
Dec 06 09:46:35 np0005548788.localdomain python3.9[233479]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:35 np0005548788.localdomain python3.9[233565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014394.649921-740-105770409028714/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:36 np0005548788.localdomain python3.9[233673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:36 np0005548788.localdomain python3.9[233728]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:37 np0005548788.localdomain python3.9[233836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37523 DF PROTO=TCP SPT=38830 DPT=9101 SEQ=2215425954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC43710000000001030307) 
Dec 06 09:46:37 np0005548788.localdomain python3.9[233922]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014396.885727-740-134834584306087/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:38 np0005548788.localdomain python3.9[234030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:39 np0005548788.localdomain python3.9[234117]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014398.0607197-740-57568942258355/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:39 np0005548788.localdomain python3.9[234225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:40 np0005548788.localdomain sshd[234291]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:46:40 np0005548788.localdomain python3.9[234313]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014399.2294433-740-92538135792634/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44319 DF PROTO=TCP SPT=42144 DPT=9105 SEQ=3831065611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC4DF00000000001030307) 
Dec 06 09:46:40 np0005548788.localdomain sshd[234291]: Received disconnect from 148.227.3.232 port 58526:11: Bye Bye [preauth]
Dec 06 09:46:40 np0005548788.localdomain sshd[234291]: Disconnected from authenticating user root 148.227.3.232 port 58526 [preauth]
Dec 06 09:46:40 np0005548788.localdomain python3.9[234421]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:41 np0005548788.localdomain python3.9[234507]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014400.511874-740-51877308898411/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:42 np0005548788.localdomain python3.9[234615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:42 np0005548788.localdomain python3.9[234701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014401.7018673-740-4175135868797/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:43 np0005548788.localdomain python3.9[234809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49642 DF PROTO=TCP SPT=58228 DPT=9100 SEQ=3930220666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC59A00000000001030307) 
Dec 06 09:46:44 np0005548788.localdomain python3.9[234895]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014402.8793123-740-201654426714861/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:45 np0005548788.localdomain python3.9[235003]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:46 np0005548788.localdomain python3.9[235089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014405.0037327-740-23662083623035/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49644 DF PROTO=TCP SPT=58228 DPT=9100 SEQ=3930220666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC65B00000000001030307) 
Dec 06 09:46:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:46:47 np0005548788.localdomain systemd[1]: tmp-crun.x6ABBL.mount: Deactivated successfully.
Dec 06 09:46:47 np0005548788.localdomain podman[235153]: 2025-12-06 09:46:47.271140314 +0000 UTC m=+0.092495611 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:46:47 np0005548788.localdomain podman[235153]: 2025-12-06 09:46:47.310233489 +0000 UTC m=+0.131588806 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:46:47 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:46:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:46:47.404 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:46:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:46:47.405 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:46:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:46:47.405 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:46:47 np0005548788.localdomain python3.9[235216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:48 np0005548788.localdomain python3.9[235302]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014407.0533478-740-248426863319982/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:48 np0005548788.localdomain python3.9[235410]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:49 np0005548788.localdomain python3.9[235496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014408.2437255-740-213658845269189/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37323 DF PROTO=TCP SPT=48288 DPT=9102 SEQ=4004561141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC70A50000000001030307) 
Dec 06 09:46:50 np0005548788.localdomain sudo[235604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlkpqmpjocemeqhzalnioukcdbwxkwuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014409.7243538-1205-18306942531164/AnsiballZ_file.py
Dec 06 09:46:50 np0005548788.localdomain sudo[235604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:50 np0005548788.localdomain python3.9[235606]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:50 np0005548788.localdomain sudo[235604]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:50 np0005548788.localdomain sudo[235714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-todkemykiudhyrgijsbibsscfywonwbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014410.4464-1229-220676904923052/AnsiballZ_systemd_service.py
Dec 06 09:46:50 np0005548788.localdomain sudo[235714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:51 np0005548788.localdomain python3.9[235716]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:46:51 np0005548788.localdomain systemd-rc-local-generator[235739]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:46:51 np0005548788.localdomain systemd-sysv-generator[235746]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548788.localdomain systemd[1]: Listening on Podman API Socket.
Dec 06 09:46:51 np0005548788.localdomain sudo[235714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:52 np0005548788.localdomain sudo[235864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmbsxlcrzrllayeiepsoqyeuuoqzegha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014411.8948312-1256-46829266305734/AnsiballZ_stat.py
Dec 06 09:46:52 np0005548788.localdomain sudo[235864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:52 np0005548788.localdomain python3.9[235866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:52 np0005548788.localdomain sudo[235864]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37325 DF PROTO=TCP SPT=48288 DPT=9102 SEQ=4004561141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC7CB00000000001030307) 
Dec 06 09:46:52 np0005548788.localdomain sudo[235952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogzimdbxohtxkqtmgozoiksgtynppcnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014411.8948312-1256-46829266305734/AnsiballZ_copy.py
Dec 06 09:46:52 np0005548788.localdomain sudo[235952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:53 np0005548788.localdomain python3.9[235954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014411.8948312-1256-46829266305734/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:53 np0005548788.localdomain sudo[235952]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:53 np0005548788.localdomain sudo[236007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqdmpfbsvaxddamhtoezjpmuqoqijglz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014411.8948312-1256-46829266305734/AnsiballZ_stat.py
Dec 06 09:46:53 np0005548788.localdomain sudo[236007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:53 np0005548788.localdomain python3.9[236009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:53 np0005548788.localdomain sudo[236007]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:53 np0005548788.localdomain sudo[236095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwpldawcnoxfubqmlndmdwkjwzxdynjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014411.8948312-1256-46829266305734/AnsiballZ_copy.py
Dec 06 09:46:53 np0005548788.localdomain sudo[236095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:54 np0005548788.localdomain python3.9[236097]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014411.8948312-1256-46829266305734/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:54 np0005548788.localdomain sudo[236095]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:55 np0005548788.localdomain sudo[236205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxvlylpomscnjkvnammxcbjyyuuarqvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014414.5892596-1340-268142578016867/AnsiballZ_container_config_data.py
Dec 06 09:46:55 np0005548788.localdomain sudo[236205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:55 np0005548788.localdomain python3.9[236207]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec 06 09:46:55 np0005548788.localdomain sudo[236205]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:55 np0005548788.localdomain sudo[236315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpcbkpfktohhdplzgxzizvdvqdquuapv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014415.5157444-1367-253359695502372/AnsiballZ_container_config_hash.py
Dec 06 09:46:55 np0005548788.localdomain sudo[236315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.185 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.187 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.188 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.188 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.213 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.214 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.214 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.215 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.215 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.215 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain python3.9[236317]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.216 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.216 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.217 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:56 np0005548788.localdomain sudo[236315]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.233 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.234 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.234 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.234 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.235 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:46:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37326 DF PROTO=TCP SPT=48288 DPT=9102 SEQ=4004561141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC8C710000000001030307) 
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.665 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.863 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.865 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13618MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.866 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.866 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.946 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.947 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:46:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:56.964 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:46:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:57.419 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:46:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:57.427 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:46:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:57.457 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:46:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:57.459 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:46:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:46:57.460 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:46:57 np0005548788.localdomain sudo[236469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mowoscspttdipqhjfwanoplgrqjnkkid ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014416.6258426-1397-67025674580086/AnsiballZ_edpm_container_manage.py
Dec 06 09:46:57 np0005548788.localdomain sudo[236469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:58 np0005548788.localdomain python3[236471]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:46:58 np0005548788.localdomain python3[236471]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2",
                                                                    "Digest": "sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:21:53.58682213Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 505175293,
                                                                    "VirtualSize": 505175293,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:a47016624274f5ebad76019f5a2e465c1737f96caa539b36f90ab8e33592f415",
                                                                              "sha256:38a03f5e96658211fb28e2f87c11ffad531281d1797368f48e6cd4af7ac97c0e"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:14:56.244673147Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:14:56.960273159Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:37.588899909Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:41.197123864Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:19.680010224Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:53.584924649Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:56.278821402Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 06 09:46:58 np0005548788.localdomain podman[236520]: 2025-12-06 09:46:58.469295759 +0000 UTC m=+0.101286827 container remove 57005a73ee78f48d6f224b64fe08c15ca564d4266c926189d4ce90e4e745e669 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d18e9db1b81af61c21222485fd9085f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public)
Dec 06 09:46:58 np0005548788.localdomain python3[236471]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Dec 06 09:46:58 np0005548788.localdomain podman[236534]: 
Dec 06 09:46:58 np0005548788.localdomain podman[236534]: 2025-12-06 09:46:58.583372847 +0000 UTC m=+0.093551073 container create 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible)
Dec 06 09:46:58 np0005548788.localdomain podman[236534]: 2025-12-06 09:46:58.537104927 +0000 UTC m=+0.047283213 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 06 09:46:58 np0005548788.localdomain python3[236471]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Dec 06 09:46:58 np0005548788.localdomain sudo[236469]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49646 DF PROTO=TCP SPT=58228 DPT=9100 SEQ=3930220666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EC95F00000000001030307) 
Dec 06 09:47:00 np0005548788.localdomain sudo[236679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfrfjqskegcvgzdggfqozgcbhmwskqnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014418.9928856-1421-131034264452526/AnsiballZ_stat.py
Dec 06 09:47:00 np0005548788.localdomain sudo[236679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:47:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:47:00 np0005548788.localdomain sshd[225025]: fatal: Timeout before authentication for 45.78.219.195 port 42878
Dec 06 09:47:00 np0005548788.localdomain podman[236683]: 2025-12-06 09:47:00.564494677 +0000 UTC m=+0.084736378 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true)
Dec 06 09:47:00 np0005548788.localdomain podman[236683]: 2025-12-06 09:47:00.614659625 +0000 UTC m=+0.134901336 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:47:00 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:47:00 np0005548788.localdomain python3.9[236681]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:47:00 np0005548788.localdomain podman[236682]: 2025-12-06 09:47:00.618350516 +0000 UTC m=+0.138243967 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:47:00 np0005548788.localdomain sudo[236679]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:00 np0005548788.localdomain podman[236682]: 2025-12-06 09:47:00.70264322 +0000 UTC m=+0.222536661 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 06 09:47:00 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:47:01 np0005548788.localdomain sudo[236833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amrpjuamjrcytowryijtvtxfmifqlylb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014421.0209017-1448-30583770288365/AnsiballZ_file.py
Dec 06 09:47:01 np0005548788.localdomain sudo[236833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:01 np0005548788.localdomain python3.9[236835]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:01 np0005548788.localdomain sudo[236833]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:02 np0005548788.localdomain sudo[236942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfbfgpjajpobscowozanqpadsildhqpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014421.5596437-1448-143097417953819/AnsiballZ_copy.py
Dec 06 09:47:02 np0005548788.localdomain sudo[236942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:02 np0005548788.localdomain python3.9[236944]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014421.5596437-1448-143097417953819/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:02 np0005548788.localdomain sudo[236942]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:02 np0005548788.localdomain sudo[236997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abfozwrvccbmtynnfcphylmkqkqudida ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014421.5596437-1448-143097417953819/AnsiballZ_systemd.py
Dec 06 09:47:02 np0005548788.localdomain sudo[236997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:03 np0005548788.localdomain python3.9[236999]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:47:03 np0005548788.localdomain systemd-rc-local-generator[237027]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:03 np0005548788.localdomain systemd-sysv-generator[237030]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548788.localdomain sudo[236997]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:03 np0005548788.localdomain sudo[237088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrftsczrfzybyewthqqivyfpagltvltn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014421.5596437-1448-143097417953819/AnsiballZ_systemd.py
Dec 06 09:47:03 np0005548788.localdomain sudo[237088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:04 np0005548788.localdomain python3.9[237090]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:47:04 np0005548788.localdomain systemd-sysv-generator[237123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:04 np0005548788.localdomain systemd-rc-local-generator[237119]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:04 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2217e3a2e0e5350d36d55195d434dd196f8d12238f1bdf02ecd05f37bd436475/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:04 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2217e3a2e0e5350d36d55195d434dd196f8d12238f1bdf02ecd05f37bd436475/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37327 DF PROTO=TCP SPT=48288 DPT=9102 SEQ=4004561141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ECABF10000000001030307) 
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:47:04 np0005548788.localdomain podman[237131]: 2025-12-06 09:47:04.630466931 +0000 UTC m=+0.155159711 container init 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: tmp-crun.rLSRUw.mount: Deactivated successfully.
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + sudo -E kolla_set_configs
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: sudo: unable to send audit message: Operation not permitted
Dec 06 09:47:04 np0005548788.localdomain sudo[237151]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:47:04 np0005548788.localdomain sudo[237151]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:47:04 np0005548788.localdomain sudo[237151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 09:47:04 np0005548788.localdomain podman[237131]: 2025-12-06 09:47:04.666963475 +0000 UTC m=+0.191656235 container start 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 09:47:04 np0005548788.localdomain podman[237131]: ceilometer_agent_compute
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 06 09:47:04 np0005548788.localdomain sudo[237088]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Validating config file
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Copying service configuration files
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: INFO:__main__:Writing out command to execute
Dec 06 09:47:04 np0005548788.localdomain sudo[237151]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: ++ cat /run_command
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + ARGS=
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + sudo kolla_copy_cacerts
Dec 06 09:47:04 np0005548788.localdomain sudo[237166]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: sudo: unable to send audit message: Operation not permitted
Dec 06 09:47:04 np0005548788.localdomain sudo[237166]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:47:04 np0005548788.localdomain sudo[237166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 09:47:04 np0005548788.localdomain sudo[237166]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + [[ ! -n '' ]]
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + . kolla_extend_start
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + umask 0022
Dec 06 09:47:04 np0005548788.localdomain ceilometer_agent_compute[237145]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 06 09:47:04 np0005548788.localdomain podman[237152]: 2025-12-06 09:47:04.765505848 +0000 UTC m=+0.091943609 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 06 09:47:04 np0005548788.localdomain podman[237152]: 2025-12-06 09:47:04.802592221 +0000 UTC m=+0.129029972 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:47:04 np0005548788.localdomain podman[237152]: unhealthy
Dec 06 09:47:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12318 DF PROTO=TCP SPT=58656 DPT=9101 SEQ=532657671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ECACAE0000000001030307) 
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:04 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:47:05 np0005548788.localdomain sudo[237281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuaynljhjprqfebnhnhtogsxvypavtsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014424.9465878-1520-18641892530965/AnsiballZ_systemd.py
Dec 06 09:47:05 np0005548788.localdomain sudo[237281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:05 np0005548788.localdomain python3.9[237283]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.574 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.574 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.574 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.574 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.574 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.574 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.575 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.576 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.576 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.576 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.576 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.576 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.576 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.576 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.576 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.577 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.578 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.578 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.578 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.578 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.578 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.578 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.578 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.578 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.579 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.579 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.579 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.579 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.579 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.579 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.580 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.581 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.582 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.583 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.584 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.585 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.585 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.585 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.585 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.585 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.585 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.585 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.585 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.586 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.587 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.588 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.589 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.590 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.590 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.609 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.610 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.611 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 06 09:47:05 np0005548788.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Dec 06 09:47:05 np0005548788.localdomain systemd[1]: tmp-crun.MdTK9O.mount: Deactivated successfully.
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.698 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.722 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.784 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.784 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.784 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.785 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.786 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.787 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.788 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.789 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.790 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.791 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.792 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.793 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.794 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.795 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.796 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.797 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.798 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.799 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.800 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.801 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.802 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.802 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.802 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.803 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Dec 06 09:47:05 np0005548788.localdomain ceilometer_agent_compute[237145]: 2025-12-06 09:47:05.816 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Dec 06 09:47:05 np0005548788.localdomain virtqemud[229107]: End of file while reading data: Input/output error
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: libpod-2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.scope: Deactivated successfully.
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: libpod-2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.scope: Consumed 1.363s CPU time.
Dec 06 09:47:06 np0005548788.localdomain podman[237290]: 2025-12-06 09:47:06.016530829 +0000 UTC m=+0.383966487 container died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.timer: Deactivated successfully.
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:47:06 np0005548788.localdomain podman[237290]: 2025-12-06 09:47:06.074982885 +0000 UTC m=+0.442418473 container cleanup 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:47:06 np0005548788.localdomain podman[237290]: ceilometer_agent_compute
Dec 06 09:47:06 np0005548788.localdomain podman[237318]: 2025-12-06 09:47:06.187037268 +0000 UTC m=+0.078107998 container cleanup 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:47:06 np0005548788.localdomain podman[237318]: ceilometer_agent_compute
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:06 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2217e3a2e0e5350d36d55195d434dd196f8d12238f1bdf02ecd05f37bd436475/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:06 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2217e3a2e0e5350d36d55195d434dd196f8d12238f1bdf02ecd05f37bd436475/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:47:06 np0005548788.localdomain podman[237330]: 2025-12-06 09:47:06.373102803 +0000 UTC m=+0.149838689 container init 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + sudo -E kolla_set_configs
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: sudo: unable to send audit message: Operation not permitted
Dec 06 09:47:06 np0005548788.localdomain sudo[237350]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:47:06 np0005548788.localdomain sudo[237350]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:47:06 np0005548788.localdomain sudo[237350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:47:06 np0005548788.localdomain podman[237330]: 2025-12-06 09:47:06.422645994 +0000 UTC m=+0.199381860 container start 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 06 09:47:06 np0005548788.localdomain podman[237330]: ceilometer_agent_compute
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 06 09:47:06 np0005548788.localdomain sudo[237281]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Validating config file
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Copying service configuration files
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: INFO:__main__:Writing out command to execute
Dec 06 09:47:06 np0005548788.localdomain sudo[237350]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: ++ cat /run_command
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + ARGS=
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + sudo kolla_copy_cacerts
Dec 06 09:47:06 np0005548788.localdomain sudo[237369]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: sudo: unable to send audit message: Operation not permitted
Dec 06 09:47:06 np0005548788.localdomain sudo[237369]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:47:06 np0005548788.localdomain sudo[237369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 09:47:06 np0005548788.localdomain sudo[237369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + [[ ! -n '' ]]
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + . kolla_extend_start
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + umask 0022
Dec 06 09:47:06 np0005548788.localdomain ceilometer_agent_compute[237344]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 06 09:47:06 np0005548788.localdomain podman[237353]: 2025-12-06 09:47:06.52484851 +0000 UTC m=+0.093737204 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 09:47:06 np0005548788.localdomain podman[237353]: 2025-12-06 09:47:06.557817595 +0000 UTC m=+0.126706299 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 09:47:06 np0005548788.localdomain podman[237353]: unhealthy
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:06 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:47:07 np0005548788.localdomain sudo[237390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:47:07 np0005548788.localdomain sudo[237390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:47:07 np0005548788.localdomain sudo[237390]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:07 np0005548788.localdomain sudo[237408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:47:07 np0005548788.localdomain sudo[237408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.239 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.239 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.239 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.239 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.240 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.241 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.242 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.243 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.244 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.245 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.246 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.247 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.248 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.249 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.250 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.251 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.252 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.270 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.272 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.273 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.316 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 06 09:47:07 np0005548788.localdomain sudo[237519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxduydjkepxtvxhvaegjyvqjeauggkqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014427.104873-1544-179729765744996/AnsiballZ_stat.py
Dec 06 09:47:07 np0005548788.localdomain sudo[237519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.461 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.461 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.462 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.463 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.464 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.465 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.466 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.467 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.468 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.469 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.470 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.471 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.472 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.473 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.474 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.475 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.476 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.477 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.478 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.479 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.480 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.480 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.480 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.480 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.480 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.480 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.480 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.480 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.482 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.488 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:47:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:47:07 np0005548788.localdomain python3.9[237521]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:07 np0005548788.localdomain sudo[237519]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:07 np0005548788.localdomain sudo[237408]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12320 DF PROTO=TCP SPT=58656 DPT=9101 SEQ=532657671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ECB8B10000000001030307) 
Dec 06 09:47:07 np0005548788.localdomain sudo[237641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-canrimohhzljahstrbgmtecdlfaazqxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014427.104873-1544-179729765744996/AnsiballZ_copy.py
Dec 06 09:47:07 np0005548788.localdomain sudo[237641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:08 np0005548788.localdomain python3.9[237643]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014427.104873-1544-179729765744996/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:08 np0005548788.localdomain sudo[237641]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:08 np0005548788.localdomain sudo[237661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:47:08 np0005548788.localdomain sudo[237661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:47:08 np0005548788.localdomain sudo[237661]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:08 np0005548788.localdomain sudo[237769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljczryuhrxhzmwsfllvdzxmuspktixag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014428.5770257-1595-29122394766139/AnsiballZ_container_config_data.py
Dec 06 09:47:08 np0005548788.localdomain sudo[237769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:09 np0005548788.localdomain python3.9[237771]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec 06 09:47:09 np0005548788.localdomain sudo[237769]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7714 DF PROTO=TCP SPT=36332 DPT=9105 SEQ=691667587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ECC3F00000000001030307) 
Dec 06 09:47:10 np0005548788.localdomain sudo[237879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwktwlofmvbasihvikkfhbiiyrhyqfno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014430.429032-1622-157206425997030/AnsiballZ_container_config_hash.py
Dec 06 09:47:10 np0005548788.localdomain sudo[237879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:11 np0005548788.localdomain python3.9[237881]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:47:11 np0005548788.localdomain sudo[237879]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:12 np0005548788.localdomain sudo[237989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xypxpzydwcatvuxfqhqqfptudizgwabo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014431.9643393-1652-72199785583793/AnsiballZ_edpm_container_manage.py
Dec 06 09:47:12 np0005548788.localdomain sudo[237989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:12 np0005548788.localdomain python3[237991]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:47:12 np0005548788.localdomain podman[238028]: 
Dec 06 09:47:12 np0005548788.localdomain podman[238028]: 2025-12-06 09:47:12.824377052 +0000 UTC m=+0.085199210 container create 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm)
Dec 06 09:47:12 np0005548788.localdomain podman[238028]: 2025-12-06 09:47:12.777690321 +0000 UTC m=+0.038512509 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 06 09:47:12 np0005548788.localdomain python3[237991]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy 
--no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 06 09:47:13 np0005548788.localdomain sudo[237989]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:13 np0005548788.localdomain sudo[238171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skdtbujxgtlavfpqiulbviyfhdispofk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014433.169543-1676-198133580264321/AnsiballZ_stat.py
Dec 06 09:47:13 np0005548788.localdomain sudo[238171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31048 DF PROTO=TCP SPT=43342 DPT=9100 SEQ=2905028273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ECCED00000000001030307) 
Dec 06 09:47:13 np0005548788.localdomain python3.9[238173]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:47:13 np0005548788.localdomain sudo[238171]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:14 np0005548788.localdomain sudo[238283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwqjgjfsisbzbecmvuqhgmkdhitcceuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.2249696-1703-240897630697257/AnsiballZ_file.py
Dec 06 09:47:14 np0005548788.localdomain sudo[238283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:14 np0005548788.localdomain python3.9[238285]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:14 np0005548788.localdomain sudo[238283]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:15 np0005548788.localdomain sudo[238392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqlinqzuuhtgcusryewjhskfqqmotqgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.8912728-1703-129298388924160/AnsiballZ_copy.py
Dec 06 09:47:15 np0005548788.localdomain sudo[238392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:15 np0005548788.localdomain python3.9[238394]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014434.8912728-1703-129298388924160/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:15 np0005548788.localdomain sudo[238392]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:15 np0005548788.localdomain sudo[238447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkraevymylkexwvhngqnxngtjobazkax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.8912728-1703-129298388924160/AnsiballZ_systemd.py
Dec 06 09:47:15 np0005548788.localdomain sudo[238447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:16 np0005548788.localdomain python3.9[238449]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:47:16 np0005548788.localdomain systemd-rc-local-generator[238470]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:16 np0005548788.localdomain systemd-sysv-generator[238475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548788.localdomain sudo[238447]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31050 DF PROTO=TCP SPT=43342 DPT=9100 SEQ=2905028273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ECDAF00000000001030307) 
Dec 06 09:47:16 np0005548788.localdomain sudo[238538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldwifahwbvvwlwtpiyongbrwemjonfsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.8912728-1703-129298388924160/AnsiballZ_systemd.py
Dec 06 09:47:16 np0005548788.localdomain sudo[238538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:17 np0005548788.localdomain python3.9[238540]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:47:18 np0005548788.localdomain podman[238543]: 2025-12-06 09:47:18.26103446 +0000 UTC m=+0.087422359 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 06 09:47:18 np0005548788.localdomain podman[238543]: 2025-12-06 09:47:18.274674084 +0000 UTC m=+0.101062023 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:47:18 np0005548788.localdomain systemd-rc-local-generator[238581]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:18 np0005548788.localdomain systemd-sysv-generator[238586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: Starting node_exporter container...
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:47:18 np0005548788.localdomain podman[238597]: 2025-12-06 09:47:18.74355553 +0000 UTC m=+0.148381354 container init 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.758Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.758Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.758Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.758Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.759Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.759Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.759Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=arp
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=bcache
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=bonding
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=cpu
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=edac
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=filefd
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=netclass
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=netdev
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=netstat
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.760Z caller=node_exporter.go:117 level=info collector=nfs
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=nvme
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=softnet
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=systemd
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=xfs
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.761Z caller=node_exporter.go:117 level=info collector=zfs
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.762Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 06 09:47:18 np0005548788.localdomain node_exporter[238612]: ts=2025-12-06T09:47:18.762Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:47:18 np0005548788.localdomain podman[238597]: 2025-12-06 09:47:18.779761886 +0000 UTC m=+0.184587680 container start 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:47:18 np0005548788.localdomain podman[238597]: node_exporter
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: Started node_exporter container.
Dec 06 09:47:18 np0005548788.localdomain sudo[238538]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:18 np0005548788.localdomain podman[238621]: 2025-12-06 09:47:18.8436102 +0000 UTC m=+0.057978853 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:47:18 np0005548788.localdomain podman[238621]: 2025-12-06 09:47:18.874729648 +0000 UTC m=+0.089098331 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:47:18 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:47:19 np0005548788.localdomain sudo[238751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptnheeztjtlcsvxdiuhbqirhoyehueqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014439.001283-1775-107368199507248/AnsiballZ_systemd.py
Dec 06 09:47:19 np0005548788.localdomain sudo[238751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63397 DF PROTO=TCP SPT=49380 DPT=9102 SEQ=152520205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ECE5D50000000001030307) 
Dec 06 09:47:19 np0005548788.localdomain python3.9[238753]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: Stopping node_exporter container...
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: libpod-315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.scope: Deactivated successfully.
Dec 06 09:47:19 np0005548788.localdomain podman[238757]: 2025-12-06 09:47:19.753477194 +0000 UTC m=+0.079843662 container died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.timer: Deactivated successfully.
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7-userdata-shm.mount: Deactivated successfully.
Dec 06 09:47:19 np0005548788.localdomain podman[238757]: 2025-12-06 09:47:19.809584119 +0000 UTC m=+0.135950567 container cleanup 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:47:19 np0005548788.localdomain podman[238757]: node_exporter
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 06 09:47:19 np0005548788.localdomain podman[238783]: 2025-12-06 09:47:19.918087122 +0000 UTC m=+0.074364233 container cleanup 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:47:19 np0005548788.localdomain podman[238783]: node_exporter
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: Stopped node_exporter container.
Dec 06 09:47:19 np0005548788.localdomain systemd[1]: Starting node_exporter container...
Dec 06 09:47:20 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:47:20 np0005548788.localdomain podman[238796]: 2025-12-06 09:47:20.095172087 +0000 UTC m=+0.145704621 container init 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.112Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.112Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.112Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.112Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.112Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.112Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.112Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.114Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.115Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.115Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.115Z caller=node_exporter.go:117 level=info collector=arp
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.115Z caller=node_exporter.go:117 level=info collector=bcache
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.115Z caller=node_exporter.go:117 level=info collector=bonding
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.115Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.115Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.115Z caller=node_exporter.go:117 level=info collector=cpu
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=edac
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=filefd
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.116Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.117Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.117Z caller=node_exporter.go:117 level=info collector=netclass
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.117Z caller=node_exporter.go:117 level=info collector=netdev
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.117Z caller=node_exporter.go:117 level=info collector=netstat
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.117Z caller=node_exporter.go:117 level=info collector=nfs
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.117Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.117Z caller=node_exporter.go:117 level=info collector=nvme
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.117Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.118Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.118Z caller=node_exporter.go:117 level=info collector=softnet
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.118Z caller=node_exporter.go:117 level=info collector=systemd
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.118Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.118Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.118Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.118Z caller=node_exporter.go:117 level=info collector=xfs
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.118Z caller=node_exporter.go:117 level=info collector=zfs
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.119Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 06 09:47:20 np0005548788.localdomain node_exporter[238810]: ts=2025-12-06T09:47:20.119Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec 06 09:47:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:47:20 np0005548788.localdomain podman[238796]: 2025-12-06 09:47:20.133301842 +0000 UTC m=+0.183834386 container start 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:47:20 np0005548788.localdomain podman[238796]: node_exporter
Dec 06 09:47:20 np0005548788.localdomain systemd[1]: Started node_exporter container.
Dec 06 09:47:20 np0005548788.localdomain sudo[238751]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:20 np0005548788.localdomain podman[238819]: 2025-12-06 09:47:20.240423742 +0000 UTC m=+0.099411171 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:47:20 np0005548788.localdomain podman[238819]: 2025-12-06 09:47:20.278524777 +0000 UTC m=+0.137512186 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:47:20 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:47:20 np0005548788.localdomain sudo[238949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quzqrwwaurzycypsmtxrmuhifwdwwqlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014440.3510685-1799-272129469411421/AnsiballZ_stat.py
Dec 06 09:47:20 np0005548788.localdomain sudo[238949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:20 np0005548788.localdomain python3.9[238951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:20 np0005548788.localdomain sudo[238949]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:22 np0005548788.localdomain sudo[239037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awuxyzvvfeavhdcezjaolbcaxtzvyfwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014440.3510685-1799-272129469411421/AnsiballZ_copy.py
Dec 06 09:47:22 np0005548788.localdomain sudo[239037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63399 DF PROTO=TCP SPT=49380 DPT=9102 SEQ=152520205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ECF1F00000000001030307) 
Dec 06 09:47:22 np0005548788.localdomain python3.9[239039]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014440.3510685-1799-272129469411421/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:22 np0005548788.localdomain sudo[239037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:23 np0005548788.localdomain sudo[239147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utciwixspaqbmzoaixvggqywyzfpxiez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014443.0042067-1850-131163998608236/AnsiballZ_container_config_data.py
Dec 06 09:47:23 np0005548788.localdomain sudo[239147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:23 np0005548788.localdomain python3.9[239149]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 06 09:47:23 np0005548788.localdomain sudo[239147]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:24 np0005548788.localdomain sudo[239257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxetrqbeqcfctvokfaoewzwidrnycque ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014444.2557452-1877-8909185507191/AnsiballZ_container_config_hash.py
Dec 06 09:47:24 np0005548788.localdomain sudo[239257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:24 np0005548788.localdomain python3.9[239259]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:47:24 np0005548788.localdomain sudo[239257]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:25 np0005548788.localdomain sudo[239367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmwswzgzggjcttwgufurjobkxexicrst ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014445.2232223-1907-3342747410715/AnsiballZ_edpm_container_manage.py
Dec 06 09:47:25 np0005548788.localdomain sudo[239367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:25 np0005548788.localdomain python3[239369]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:47:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63400 DF PROTO=TCP SPT=49380 DPT=9102 SEQ=152520205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED01B00000000001030307) 
Dec 06 09:47:27 np0005548788.localdomain podman[239383]: 2025-12-06 09:47:25.922682814 +0000 UTC m=+0.046239678 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 06 09:47:27 np0005548788.localdomain podman[239456]: 
Dec 06 09:47:27 np0005548788.localdomain podman[239456]: 2025-12-06 09:47:27.783351277 +0000 UTC m=+0.093795187 container create b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Dec 06 09:47:27 np0005548788.localdomain podman[239456]: 2025-12-06 09:47:27.739237885 +0000 UTC m=+0.049681825 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 06 09:47:27 np0005548788.localdomain python3[239369]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 06 09:47:27 np0005548788.localdomain sudo[239367]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:28 np0005548788.localdomain sudo[239602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exskxviytfzvbyxwefiqlgmxazkfifxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014448.3614461-1931-267159822609919/AnsiballZ_stat.py
Dec 06 09:47:28 np0005548788.localdomain sudo[239602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:28 np0005548788.localdomain python3.9[239604]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:47:28 np0005548788.localdomain sudo[239602]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31052 DF PROTO=TCP SPT=43342 DPT=9100 SEQ=2905028273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED0BF10000000001030307) 
Dec 06 09:47:29 np0005548788.localdomain sudo[239714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whwlmsfrekxsmqscuhpcoumxhxwdhted ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014449.2133672-1958-235983425581952/AnsiballZ_file.py
Dec 06 09:47:29 np0005548788.localdomain sudo[239714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:29 np0005548788.localdomain python3.9[239716]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:29 np0005548788.localdomain sudo[239714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:30 np0005548788.localdomain sudo[239823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jelbyyrbkbxvfsngfyxwppdceooeattw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014449.7988846-1958-72851941456056/AnsiballZ_copy.py
Dec 06 09:47:30 np0005548788.localdomain sudo[239823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:30 np0005548788.localdomain python3.9[239825]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014449.7988846-1958-72851941456056/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:30 np0005548788.localdomain sudo[239823]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:30 np0005548788.localdomain sudo[239878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thmzmqtpvqoqlpkvrynxasgcpbbeqvmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014449.7988846-1958-72851941456056/AnsiballZ_systemd.py
Dec 06 09:47:30 np0005548788.localdomain sudo[239878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:47:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:47:30 np0005548788.localdomain podman[239881]: 2025-12-06 09:47:30.885876333 +0000 UTC m=+0.097461591 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 09:47:30 np0005548788.localdomain podman[239881]: 2025-12-06 09:47:30.921904943 +0000 UTC m=+0.133490201 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:47:30 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:47:30 np0005548788.localdomain systemd[1]: tmp-crun.tMS3IB.mount: Deactivated successfully.
Dec 06 09:47:30 np0005548788.localdomain podman[239882]: 2025-12-06 09:47:30.996100529 +0000 UTC m=+0.204324431 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:47:31 np0005548788.localdomain podman[239882]: 2025-12-06 09:47:31.033470512 +0000 UTC m=+0.241694394 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:47:31 np0005548788.localdomain python3.9[239880]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:47:31 np0005548788.localdomain systemd-rc-local-generator[239944]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:31 np0005548788.localdomain systemd-sysv-generator[239949]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548788.localdomain sudo[239878]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:31 np0005548788.localdomain sudo[240009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyexyseuafervttqswntdugnphhqeike ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014449.7988846-1958-72851941456056/AnsiballZ_systemd.py
Dec 06 09:47:31 np0005548788.localdomain sudo[240009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:32 np0005548788.localdomain python3.9[240011]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:47:32 np0005548788.localdomain sshd[240014]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:47:33 np0005548788.localdomain systemd-sysv-generator[240042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:33 np0005548788.localdomain systemd-rc-local-generator[240037]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: Starting podman_exporter container...
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: tmp-crun.jOAW2T.mount: Deactivated successfully.
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:47:33 np0005548788.localdomain podman[240052]: 2025-12-06 09:47:33.717171969 +0000 UTC m=+0.174611900 container init b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:33 np0005548788.localdomain podman_exporter[240067]: ts=2025-12-06T09:47:33.735Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 06 09:47:33 np0005548788.localdomain podman_exporter[240067]: ts=2025-12-06T09:47:33.735Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 06 09:47:33 np0005548788.localdomain podman_exporter[240067]: ts=2025-12-06T09:47:33.735Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 06 09:47:33 np0005548788.localdomain podman_exporter[240067]: ts=2025-12-06T09:47:33.735Z caller=handler.go:105 level=info collector=container
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: Starting Podman API Service...
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: Started Podman API Service.
Dec 06 09:47:33 np0005548788.localdomain podman[240052]: 2025-12-06 09:47:33.764050797 +0000 UTC m=+0.221490738 container start b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:47:33 np0005548788.localdomain podman[240052]: podman_exporter
Dec 06 09:47:33 np0005548788.localdomain systemd[1]: Started podman_exporter container.
Dec 06 09:47:33 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:33Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 06 09:47:33 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:33Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 06 09:47:33 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:33Z" level=info msg="Setting parallel job count to 25"
Dec 06 09:47:33 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:33Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 06 09:47:33 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:33Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Dec 06 09:47:33 np0005548788.localdomain sudo[240009]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:33 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:47:33 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 06 09:47:33 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:33Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:47:33 np0005548788.localdomain podman[240077]: 2025-12-06 09:47:33.851407362 +0000 UTC m=+0.090519075 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:33 np0005548788.localdomain podman[240077]: 2025-12-06 09:47:33.862070293 +0000 UTC m=+0.101181996 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:47:33 np0005548788.localdomain podman[240077]: unhealthy
Dec 06 09:47:34 np0005548788.localdomain sudo[240222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epaaxmnbtpmgmznddrtimqnbmjixcyeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014454.48363-2030-192920804838076/AnsiballZ_systemd.py
Dec 06 09:47:34 np0005548788.localdomain sudo[240222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14484 DF PROTO=TCP SPT=51696 DPT=9101 SEQ=2070217156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED21DE0000000001030307) 
Dec 06 09:47:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44639 DF PROTO=TCP SPT=54560 DPT=9882 SEQ=3760188689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED21F00000000001030307) 
Dec 06 09:47:35 np0005548788.localdomain python3.9[240224]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: Stopping podman_exporter container...
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:47:36 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:47:33 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 2790 "" "Go-http-client/1.1"
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: libpod-b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.scope: Deactivated successfully.
Dec 06 09:47:36 np0005548788.localdomain podman[240228]: 2025-12-06 09:47:36.74838704 +0000 UTC m=+0.569732883 container died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.timer: Deactivated successfully.
Dec 06 09:47:36 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:47:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426-userdata-shm.mount: Deactivated successfully.
Dec 06 09:47:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14486 DF PROTO=TCP SPT=51696 DPT=9101 SEQ=2070217156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED2DF00000000001030307) 
Dec 06 09:47:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9c6c1df39f2c3a7fe7e9265cc44ba2cb5326e276cd5a9962034b780b5cde9728-merged.mount: Deactivated successfully.
Dec 06 09:47:38 np0005548788.localdomain podman[240239]: 2025-12-06 09:47:38.823481318 +0000 UTC m=+2.115525367 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:47:38 np0005548788.localdomain podman[240239]: 2025-12-06 09:47:38.855632797 +0000 UTC m=+2.147676836 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 09:47:38 np0005548788.localdomain podman[240239]: unhealthy
Dec 06 09:47:38 np0005548788.localdomain podman[240228]: 2025-12-06 09:47:38.890139319 +0000 UTC m=+2.711485122 container cleanup b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:47:38 np0005548788.localdomain podman[240228]: podman_exporter
Dec 06 09:47:38 np0005548788.localdomain podman[240251]: 2025-12-06 09:47:38.900703368 +0000 UTC m=+2.144029252 container cleanup b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:47:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:39 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:39 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:47:39 np0005548788.localdomain systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 06 09:47:40 np0005548788.localdomain podman[240271]: 2025-12-06 09:47:40.040635074 +0000 UTC m=+0.066153427 container cleanup b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:47:40 np0005548788.localdomain podman[240271]: podman_exporter
Dec 06 09:47:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22199 DF PROTO=TCP SPT=47038 DPT=9105 SEQ=2405907586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED37F10000000001030307) 
Dec 06 09:47:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:40 np0005548788.localdomain systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 06 09:47:40 np0005548788.localdomain systemd[1]: Stopped podman_exporter container.
Dec 06 09:47:40 np0005548788.localdomain systemd[1]: Starting podman_exporter container...
Dec 06 09:47:41 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:47:41 np0005548788.localdomain podman[240284]: 2025-12-06 09:47:41.176677601 +0000 UTC m=+0.448628479 container init b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:47:41 np0005548788.localdomain podman_exporter[240298]: ts=2025-12-06T09:47:41.196Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 06 09:47:41 np0005548788.localdomain podman_exporter[240298]: ts=2025-12-06T09:47:41.196Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 06 09:47:41 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:47:41 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 06 09:47:41 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:47:41 np0005548788.localdomain podman_exporter[240298]: ts=2025-12-06T09:47:41.196Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 06 09:47:41 np0005548788.localdomain podman_exporter[240298]: ts=2025-12-06T09:47:41.196Z caller=handler.go:105 level=info collector=container
Dec 06 09:47:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:47:41 np0005548788.localdomain podman[240284]: 2025-12-06 09:47:41.258388781 +0000 UTC m=+0.530339619 container start b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:47:41 np0005548788.localdomain podman[240284]: podman_exporter
Dec 06 09:47:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:47:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14967 DF PROTO=TCP SPT=48584 DPT=9100 SEQ=2210926505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED44000000000001030307) 
Dec 06 09:47:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-432c853227b9a47c28cbd9f8638abd2f4ba478bfd57b8f9c2584b83011a05ecd-merged.mount: Deactivated successfully.
Dec 06 09:47:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-432c853227b9a47c28cbd9f8638abd2f4ba478bfd57b8f9c2584b83011a05ecd-merged.mount: Deactivated successfully.
Dec 06 09:47:43 np0005548788.localdomain systemd[1]: Started podman_exporter container.
Dec 06 09:47:43 np0005548788.localdomain podman[240308]: 2025-12-06 09:47:43.846325901 +0000 UTC m=+2.632438905 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:43 np0005548788.localdomain podman[240308]: 2025-12-06 09:47:43.860329967 +0000 UTC m=+2.646443011 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:43 np0005548788.localdomain sudo[240222]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:43 np0005548788.localdomain podman[240308]: unhealthy
Dec 06 09:47:44 np0005548788.localdomain sudo[240437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikzioobgevglisdbdjhwgmyzhdtskznf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014464.0583363-2054-170736940876612/AnsiballZ_stat.py
Dec 06 09:47:44 np0005548788.localdomain sudo[240437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:44 np0005548788.localdomain python3.9[240439]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:44 np0005548788.localdomain sudo[240437]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:44 np0005548788.localdomain sudo[240525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzajbdnxzlwjsmgsfjjjqzptlutiwily ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014464.0583363-2054-170736940876612/AnsiballZ_copy.py
Dec 06 09:47:44 np0005548788.localdomain sudo[240525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:45 np0005548788.localdomain python3.9[240527]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014464.0583363-2054-170736940876612/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:45 np0005548788.localdomain sudo[240525]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:47:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:47:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14969 DF PROTO=TCP SPT=48584 DPT=9100 SEQ=2210926505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED4FF00000000001030307) 
Dec 06 09:47:46 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:46 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:47:46 np0005548788.localdomain sudo[240635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvrqtcktklcsukyuethxrwcbnztxoixq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014466.646083-2105-134345019973918/AnsiballZ_container_config_data.py
Dec 06 09:47:46 np0005548788.localdomain sudo[240635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:47 np0005548788.localdomain python3.9[240637]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 06 09:47:47 np0005548788.localdomain sudo[240635]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:47:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:47:47.406 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:47:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:47:47.406 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:47:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:47:47.406 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:47:47 np0005548788.localdomain sudo[240745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfhzimjroalrlkvootxeeidppatghhbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014467.4160857-2132-22289874980047/AnsiballZ_container_config_hash.py
Dec 06 09:47:47 np0005548788.localdomain sudo[240745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:47 np0005548788.localdomain python3.9[240747]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:47:47 np0005548788.localdomain sudo[240745]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:47:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:47:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:48 np0005548788.localdomain podman[240819]: 2025-12-06 09:47:48.903934265 +0000 UTC m=+0.095462779 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 09:47:48 np0005548788.localdomain podman[240819]: 2025-12-06 09:47:48.914643238 +0000 UTC m=+0.106171742 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 09:47:49 np0005548788.localdomain sudo[240874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aekoqoljexgyiewuhnnerrsilqyocbkx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014468.377445-2162-40472617080301/AnsiballZ_edpm_container_manage.py
Dec 06 09:47:49 np0005548788.localdomain sudo[240874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:49 np0005548788.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:49 np0005548788.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:49 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:47:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2133 DF PROTO=TCP SPT=39618 DPT=9102 SEQ=225540985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED5B040000000001030307) 
Dec 06 09:47:49 np0005548788.localdomain python3[240876]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:47:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:47:49 np0005548788.localdomain systemd[1]: tmp-crun.w0WAeq.mount: Deactivated successfully.
Dec 06 09:47:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:47:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:50 np0005548788.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:50 np0005548788.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_yaml/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/netifaces.cpython-39-x86_64-linux-gnu.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/netifaces.cpython-39-x86_64-linux-gnu.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/events.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/events.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/reader.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/reader.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/resolver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/resolver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/scanner.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/scanner.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/_yaml.cpython-39-x86_64-linux-gnu.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/_yaml.cpython-39-x86_64-linux-gnu.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/loader.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/loader.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/serializer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/serializer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/emitter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/emitter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/composer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/composer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/parser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/parser.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/reader.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/reader.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/scanner.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/scanner.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/scanner.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/scanner.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/tokens.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/tokens.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/constructor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/constructor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/constructor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/constructor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/error.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/error.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/events.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/events.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/loader.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/loader.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/serializer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/serializer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/serializer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/serializer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/emitter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/emitter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/events.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/events.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/tokens.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/tokens.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/composer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/composer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/cyaml.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/cyaml.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/cyaml.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/cyaml.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/reader.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/reader.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/resolver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/resolver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/dumper.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/dumper.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/error.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/error.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/nodes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/nodes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/parser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/parser.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/representer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/representer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/dumper.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/dumper.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/loader.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/loader.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/nodes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/nodes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/representer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/representer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/resolver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/__pycache__/resolver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/composer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/composer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/constructor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/constructor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/dumper.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/dumper.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/emitter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/emitter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/error.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/error.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/representer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/representer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/cyaml.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/cyaml.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/nodes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/nodes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/parser.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/parser.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/tokens.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/yaml/tokens.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/PyYAML-5.4.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/fallback.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/fallback.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/_version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/_version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/_version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/_version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/ext.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/ext.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/ext.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/ext.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/fallback.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/fallback.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Getting root fs size for \"03368b47654a69a8e06e0f65fb208281bce865870adba2cf83de8016116842a3\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": creating overlay mount to /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/H7YOTL3SFJRONSRX22VQHLOVZ2:/var/lib/containers/storage/overlay/l/J3YUH3ZFFKJKK3VHQQB2HTHNU7,upperdir=/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/diff,workdir=/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/work,nodev,metacopy=on\": no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/fallback.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/__pycache__/fallback.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/_cmsgpack.cpython-39-x86_64-linux-gnu.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/_cmsgpack.cpython-39-x86_64-linux-gnu.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/_version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/_version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/ext.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack/ext.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/MarkupSafe-1.1.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson.cpython-39-x86_64-linux-gnu.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson.cpython-39-x86_64-linux-gnu.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/pyrsistent-0.17.3-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/ujson-4.0.2-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_pyrsistent_version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_pyrsistent_version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi-1.14.5-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/fernet.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/fernet.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/fernet.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/fernet.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/__about__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/__about__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/__about__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/__about__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/binding.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/binding.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/_conditional.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/_conditional.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/_conditional.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/_conditional.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/binding.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/__pycache__/binding.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/_conditional.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/_conditional.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/binding.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/openssl/binding.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_openssl.abi3.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_openssl.abi3.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust/asn1.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust/asn1.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust/ocsp.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust/ocsp.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust/x509.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust/x509.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust/__init__.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust/__init__.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust.abi3.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/bindings/_rust.abi3.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/_asymmetric.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/_asymmetric.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/hashes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/hashes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_serialization.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_serialization.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/cmac.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/cmac.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/hashes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/hashes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/keywrap.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/keywrap.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_serialization.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_serialization.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/cmac.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/cmac.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/constant_time.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/constant_time.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/keywrap.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/keywrap.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/padding.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/padding.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_asymmetric.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_asymmetric.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_cipheralgorithm.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_cipheralgorithm.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/constant_time.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/constant_time.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/hashes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/hashes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/hmac.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/hmac.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/poly1305.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/poly1305.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_asymmetric.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_asymmetric.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_cipheralgorithm.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/_cipheralgorithm.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/hmac.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/hmac.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/padding.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/padding.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/poly1305.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__pycache__/poly1305.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/cmac.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/cmac.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/constant_time.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/constant_time.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/poly1305.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/poly1305.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/hmac.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/hmac.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/kbkdf.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/kbkdf.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/pbkdf2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/pbkdf2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/scrypt.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/scrypt.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/x963kdf.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/x963kdf.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/hkdf.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/hkdf.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/hkdf.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/hkdf.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/pbkdf2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/pbkdf2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/scrypt.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/scrypt.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/x963kdf.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/x963kdf.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/concatkdf.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/concatkdf.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/concatkdf.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/concatkdf.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/kbkdf.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/kbkdf.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/scrypt.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/scrypt.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/x963kdf.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/x963kdf.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/pbkdf2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/pbkdf2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/kbkdf.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/kbkdf.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/concatkdf.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/concatkdf.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/hkdf.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/kdf/hkdf.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/keywrap.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/keywrap.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/padding.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/padding.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/_serialization.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/_serialization.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/x448.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/x448.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/dh.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/dh.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/dsa.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/dsa.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/ed25519.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/ed25519.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/x25519.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/x25519.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/dsa.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/dsa.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/x448.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/x448.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ed448.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ed448.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/x25519.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/x25519.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/x448.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/x448.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/dh.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/dh.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ec.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ec.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ed25519.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ed25519.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ed448.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ed448.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/x25519.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/x25519.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/dsa.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/dsa.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/padding.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/padding.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/padding.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/padding.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/rsa.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/rsa.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/rsa.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/rsa.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/dh.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/dh.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ec.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ec.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ed25519.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/__pycache__/ed25519.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/ec.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/ec.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/ed448.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/ed448.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/padding.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/padding.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/rsa.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/asymmetric/rsa.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/modes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/modes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/algorithms.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/algorithms.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/aead.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/aead.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/aead.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/aead.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/algorithms.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/algorithms.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/modes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/modes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/modes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/__pycache__/modes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/aead.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/aead.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/algorithms.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/algorithms.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/ciphers/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/pkcs12.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/pkcs12.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/pkcs12.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/pkcs12.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/pkcs7.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/pkcs7.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/pkcs7.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/pkcs7.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/ssh.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/ssh.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/ssh.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__pycache__/ssh.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/pkcs12.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/pkcs12.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/pkcs7.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/pkcs7.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/ssh.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/ssh.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/serialization/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/_cipheralgorithm.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/_cipheralgorithm.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/hotp.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/hotp.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/hotp.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/hotp.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/totp.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/totp.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/totp.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/totp.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/hotp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/hotp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/totp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/primitives/twofactor/totp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__/_oid.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__/_oid.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__/_oid.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/__pycache__/_oid.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/_oid.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/_oid.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__/interfaces.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__/interfaces.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__/interfaces.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/__pycache__/interfaces.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/interfaces.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/interfaces.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/ec.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/ec.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/ed25519.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/ed25519.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/hmac.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/hmac.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/aead.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/aead.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/backend.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/backend.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/hashes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/hashes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/x448.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/x448.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/x509.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/x509.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/dsa.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/dsa.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ec.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ec.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/poly1305.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/poly1305.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/dh.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/dh.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ed25519.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ed25519.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/decode_asn1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/decode_asn1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/backend.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/backend.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/cmac.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/cmac.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/dh.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/dh.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/hashes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/hashes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/hashes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/hashes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x509.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x509.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/aead.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/aead.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/dsa.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/dsa.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ed25519.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ed25519.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x25519.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x25519.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/cmac.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/cmac.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ed448.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ed448.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/hmac.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/hmac.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x448.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x448.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/aead.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/aead.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/poly1305.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/poly1305.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/rsa.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/rsa.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/decode_asn1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/decode_asn1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/backend.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/backend.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ciphers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ciphers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ec.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ec.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ed448.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ed448.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x509.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x509.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ciphers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/ciphers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/encode_asn1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/encode_asn1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/encode_asn1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/encode_asn1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/hmac.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/hmac.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/rsa.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/rsa.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x25519.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x25519.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x448.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/x448.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/decode_asn1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/decode_asn1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/dsa.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/dsa.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/encode_asn1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/encode_asn1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/rsa.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/rsa.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/poly1305.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/poly1305.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/x25519.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/x25519.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/ciphers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/ciphers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/cmac.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/cmac.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/dh.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/dh.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/ed448.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/hazmat/backends/openssl/ed448.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__about__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__about__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/certificate_transparency.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/certificate_transparency.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/general_name.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/general_name.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/name.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/name.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/oid.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/oid.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/extensions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/extensions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/ocsp.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/ocsp.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/certificate_transparency.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/certificate_transparency.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/certificate_transparency.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/certificate_transparency.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/oid.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/oid.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/name.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/name.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/ocsp.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/ocsp.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/oid.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/oid.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/general_name.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/general_name.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/name.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/name.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/extensions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/extensions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/general_name.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/__pycache__/general_name.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/extensions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/extensions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/ocsp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/x509/ocsp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/fernet.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/fernet.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/py.typed\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cryptography/py.typed: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/decoder.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/decoder.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/scanner.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/scanner.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/tool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/tool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/errors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/errors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/raw_json.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/raw_json.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/raw_json.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/raw_json.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/encoder.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/encoder.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/encoder.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/encoder.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/ordered_dict.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/ordered_dict.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/ordered_dict.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/ordered_dict.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/scanner.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/scanner.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/decoder.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/decoder.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/errors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/errors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/tool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__pycache__/tool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/_speedups.cpython-39-x86_64-linux-gnu.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/_speedups.cpython-39-x86_64-linux-gnu.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/decoder.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/decoder.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/ordered_dict.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/ordered_dict.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/scanner.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/scanner.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/encoder.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/encoder.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/errors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/errors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/raw_json.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/raw_json.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_dump.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_dump.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_encode_for_html.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_encode_for_html.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_namedtuple.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_namedtuple.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_pass1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_pass1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_pass2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_pass2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_pass3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_pass3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_raw_json.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_raw_json.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_decode.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_decode.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_separators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_separators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_iterable.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_iterable.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/_cibw_runner.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/_cibw_runner.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_indent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_indent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_item_sort_key.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_item_sort_key.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_separators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_separators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_encode_for_html.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_encode_for_html.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_fail.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_fail.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_for_json.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_for_json.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_tool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_tool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_decode.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_decode.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_iterable.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_iterable.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_item_sort_key.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_item_sort_key.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_dump.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_dump.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_tool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_tool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/_cibw_runner.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/_cibw_runner.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_encode_basestring_ascii.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_encode_basestring_ascii.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_bigint_as_string.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_bigint_as_string.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_default.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_default.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_namedtuple.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_namedtuple.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_tuple.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_tuple.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_decimal.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_decimal.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_encode_basestring_ascii.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_encode_basestring_ascii.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_raw_json.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_raw_json.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_recursion.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_recursion.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_check_circular.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_check_circular.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_errors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_errors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_for_json.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_for_json.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_bitsize_int_as_string.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_bitsize_int_as_string.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_indent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_indent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_namedtuple.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_namedtuple.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_str_subclass.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_str_subclass.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_raw_json.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_raw_json.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_scanstring.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_scanstring.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_separators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_separators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_unicode.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_unicode.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_bigint_as_string.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_bigint_as_string.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_dump.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_dump.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_speedups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_speedups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_default.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_default.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_errors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_errors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_scanstring.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_scanstring.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_speedups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_speedups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_decimal.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_decimal.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_fail.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_fail.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_float.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_float.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_float.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_float.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_pass1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_str_subclass.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_str_subclass.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_subclass.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_subclass.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_tuple.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_tuple.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_encode_for_html.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_encode_for_html.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_bitsize_int_as_string.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_bitsize_int_as_string.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_check_circular.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_check_circular.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_decode.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_decode.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_iterable.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_iterable.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_recursion.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_recursion.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_subclass.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_subclass.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_unicode.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/test_unicode.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/_cibw_runner.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__pycache__/_cibw_runner.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_bitsize_int_as_string.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_bitsize_int_as_string.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_encode_basestring_ascii.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_encode_basestring_ascii.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_item_sort_key.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_item_sort_key.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_recursion.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_recursion.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_str_subclass.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_str_subclass.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_scanstring.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_scanstring.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_tuple.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_tuple.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_for_json.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_for_json.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_errors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_errors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_float.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_float.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_default.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_default.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_indent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_indent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_speedups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_speedups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_subclass.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_subclass.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_tool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_tool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_decimal.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_decimal.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_check_circular.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_check_circular.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_fail.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_fail.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_unicode.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_unicode.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_bigint_as_string.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tests/test_bigint_as_string.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/simplejson/tool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/_wrappers.cpython-39-x86_64-linux-gnu.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/_wrappers.cpython-39-x86_64-linux-gnu.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/decorators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/decorators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/importer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/importer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/wrappers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/wrappers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/wrappers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/wrappers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/wrappers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/wrappers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/decorators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/decorators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/decorators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/decorators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/importer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/importer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/importer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/wrapt/__pycache__/importer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/__pycache__/_pyrsistent_version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/__pycache__/_pyrsistent_version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/__pycache__/_pyrsistent_version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/__pycache__/_pyrsistent_version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_cffi_backend.cpython-39-x86_64-linux-gnu.so\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/_cffi_backend.cpython-39-x86_64-linux-gnu.so: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/ffiplatform.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/ffiplatform.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/backend_ctypes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/backend_ctypes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/commontypes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/commontypes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/api.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/api.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/cffi_opcode.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/cffi_opcode.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/cparser.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/cparser.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/pkgconfig.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/pkgconfig.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/_cffi_errors.h\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/_cffi_errors.h: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/_cffi_include.h\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/_cffi_include.h: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/_embedding.h\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/_embedding.h: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/recompiler.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/recompiler.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/vengine_cpy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/vengine_cpy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/vengine_gen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/vengine_gen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/verifier.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/verifier.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/error.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/error.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/lock.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/lock.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/parse_c_type.h\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/parse_c_type.h: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/model.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/model.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/vengine_cpy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/vengine_cpy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/cffi_opcode.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/cffi_opcode.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/commontypes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/commontypes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/cparser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/cparser.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/error.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/error.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/api.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/api.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/backend_ctypes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/backend_ctypes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/error.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/error.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/ffiplatform.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/ffiplatform.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/vengine_gen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/vengine_gen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/verifier.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/verifier.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/api.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/api.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/cffi_opcode.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/cffi_opcode.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/commontypes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/commontypes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/recompiler.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/recompiler.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/pkgconfig.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/pkgconfig.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/ffiplatform.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/ffiplatform.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/recompiler.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/recompiler.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/verifier.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/verifier.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/cparser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/cparser.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/lock.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/lock.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/setuptools_ext.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/setuptools_ext.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/vengine_gen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/vengine_gen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/pkgconfig.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/pkgconfig.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/setuptools_ext.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/setuptools_ext.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/backend_ctypes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/backend_ctypes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/lock.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/lock.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/model.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/model.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/vengine_cpy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/__pycache__/vengine_cpy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/model.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/model.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/setuptools_ext.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/cffi/setuptools_ext.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/python3.9/site-packages/msgpack-1.0.2-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/libcbor.so.0.7.0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/libcbor.so.0.7.0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/libdouble-conversion.so.3.1.5\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib64/libdouble-conversion.so.3.1.5: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-multi-pack-index\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-multi-pack-index: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-unpack-file\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-unpack-file: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-check-ref-format\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-check-ref-format: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rerere\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rerere: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rev-list\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rev-list: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-ext\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-ext: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-bisect--helper\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-bisect--helper: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-check-ignore\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-check-ignore: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-count-objects\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-count-objects: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-sh-i18n\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-sh-i18n: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-tag\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-tag: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-checkout\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-checkout: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-gc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-gc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mktag\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mktag: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-credential-cache\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-credential-cache: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-name-rev\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-name-rev: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-prune\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-prune: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-http\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-http: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-config\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-config: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-credential-cache--daemon\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-credential-cache--daemon: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-get-tar-commit-id\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-get-tar-commit-id: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-file\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-file: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-verify-pack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-verify-pack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-branch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-branch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-check-mailmap\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-check-mailmap: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diff-tree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diff-tree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-difftool--helper\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-difftool--helper: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fsck-objects\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fsck-objects: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-am\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-am: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-cherry\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-cherry: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diff-files\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diff-files: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-describe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-describe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-update-ref\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-update-ref: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-upload-archive\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-upload-archive: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-pack-objects\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-pack-objects: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-pull\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-pull: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-revert\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-revert: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-sparse-checkout\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-sparse-checkout: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-index-pack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-index-pack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-base\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-base: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mv\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mv: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-verify-commit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-verify-commit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diagnose\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diagnose: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-ls-tree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-ls-tree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-show\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-show: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-clean\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-clean: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-status\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-status: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-maintenance\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-maintenance: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-pack-redundant\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-pack-redundant: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-stripspace\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-stripspace: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-var\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-var: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-verify-tag\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-verify-tag: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-check-attr\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-check-attr: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-checkout-index\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-checkout-index: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-commit-tree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-commit-tree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-version\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-version: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-env--helper\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-env--helper: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-http-push\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-http-push: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-replace\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-replace: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-subtree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-subtree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-repack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-repack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-blame\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-blame: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-commit-graph\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-commit-graph: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-init\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-init: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rev-parse\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rev-parse: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rm\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rm: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-for-each-repo\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-for-each-repo: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-hash-object\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-hash-object: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-notes\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-notes: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-switch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-switch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-commit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-commit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-pack-refs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-pack-refs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-send-pack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-send-pack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-credential\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-credential: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-ftp\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-ftp: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-upload-pack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-upload-pack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-checkout--worker\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-checkout--worker: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-cherry-pick\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-cherry-pick: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mergetool--lib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mergetool--lib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rebase\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-rebase: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-stash\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-stash: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mailinfo\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mailinfo: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mailsplit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mailsplit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-ours\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-ours: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-imap-send\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-imap-send: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-range-diff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-range-diff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-one-file\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-one-file: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mergetool\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mergetool: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-receive-pack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-receive-pack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-https\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-https: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-sh-i18n--envsubst\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-sh-i18n--envsubst: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diff-index\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-diff-index: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fetch-pack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fetch-pack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-index\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-index: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-shortlog\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-shortlog: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-fd\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-fd: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-update-index\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-update-index: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-clone\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-clone: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-http-backend\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-http-backend: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-bundle\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-bundle: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fsmonitor--daemon\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fsmonitor--daemon: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/scalar\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/scalar: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-ls-files\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-ls-files: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-show-ref\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-show-ref: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-symbolic-ref\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-symbolic-ref: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-write-tree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-write-tree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-column\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-column: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fetch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fetch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-help\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-help: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-archive\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-archive: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-cat-file\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-cat-file: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-for-each-ref\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-for-each-ref: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-patch-id\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-patch-id: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-prune-packed\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-prune-packed: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-add\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-add: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-annotate\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-annotate: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-sh-setup\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-sh-setup: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-hook\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-hook: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-interpret-trailers\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-interpret-trailers: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-ls-remote\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-ls-remote: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-shell\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-shell: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-update-server-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-update-server-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-apply\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-apply: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-difftool\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-difftool: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fast-export\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fast-export: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/diffuse\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/diffuse: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/guiffy\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/guiffy: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/kdiff3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/kdiff3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/tkdiff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/tkdiff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/winmerge\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/winmerge: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/xxdiff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/xxdiff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/codecompare\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/codecompare: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/examdiff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/examdiff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/meld\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/meld: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/tortoisemerge\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/tortoisemerge: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/vimdiff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/vimdiff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/bc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/bc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/deltawalker\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/deltawalker: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/gvimdiff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/gvimdiff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/nvimdiff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/nvimdiff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/opendiff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/opendiff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/smerge\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/smerge: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/araxis\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/araxis: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/diffmerge\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/diffmerge: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/ecmerge\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/ecmerge: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/emerge\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/emerge: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/kompare\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/mergetools/kompare: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-recursive\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-recursive: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-push\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-push: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-ftps\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-remote-ftps: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-submodule--helper\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-submodule--helper: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-worktree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-worktree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fmt-merge-msg\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fmt-merge-msg: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-format-patch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-format-patch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-octopus\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-octopus: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-init-db\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-init-db: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-read-tree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-read-tree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-reset\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-reset: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-show-index\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-show-index: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-whatchanged\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-whatchanged: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-bisect\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-bisect: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fast-import\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fast-import: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-http-fetch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-http-fetch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mktree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-mktree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-submodule\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-submodule: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-tree\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-tree: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-quiltimport\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-quiltimport: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-restore\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-restore: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-show-branch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-show-branch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-stage\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-stage: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-bugreport\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-bugreport: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fsck\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-fsck: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-resolve\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-merge-resolve: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-unpack-objects\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-unpack-objects: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-web--browse\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-web--browse: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-credential-store\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-credential-store: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-grep\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-grep: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-reflog\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/git-core/git-reflog: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/openssh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/openssh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/openssh/ssh-keysign\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/openssh/ssh-keysign: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/openssh/ssh-pkcs11-helper\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/openssh/ssh-pkcs11-helper: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/openssh/ssh-sk-helper\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/libexec/openssh/ssh-sk-helper: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/d3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/d3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/d3/ceaff56d36eb840073c1f3a4876860eb9269a9\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/d3/ceaff56d36eb840073c1f3a4876860eb9269a9: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/1b\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/1b: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/1b/f221ab089a3e7aac596dd1a780199df639c775\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/1b/f221ab089a3e7aac596dd1a780199df639c775: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/24\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/24: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/24/eda0300f5c6edfebdb5c1708c4f2469ca52c9a\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/24/eda0300f5c6edfebdb5c1708c4f2469ca52c9a: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bd\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bd: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bd/f978fc49ebdbbb2babe4f707493f5bc087a295\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bd/f978fc49ebdbbb2babe4f707493f5bc087a295: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bd/e409eab5f7ae08107a94f57dd876edaf4b4178\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bd/e409eab5f7ae08107a94f57dd876edaf4b4178: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/be\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/be: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/be/49f9764b6ea8ebfda38f95561b80cedc1c0c1f\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/be/49f9764b6ea8ebfda38f95561b80cedc1c0c1f: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/fe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/fe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/fe/50f5c96b5921b04d132450fad9703bca0bea42\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/fe/50f5c96b5921b04d132450fad9703bca0bea42: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2d/7c94ea1208b02054a1e96d71740704167c1398\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2d/7c94ea1208b02054a1e96d71740704167c1398: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/6e\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/6e: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/6e/7306c74c779736e02f1826c46021bb2ba34cdb\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/6e/7306c74c779736e02f1826c46021bb2ba34cdb: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/f4\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/f4: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/f4/afb3a8cd1894522c7400f25d4b76f5e189a63d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/f4/afb3a8cd1894522c7400f25d4b76f5e189a63d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a0/ef6b3d916a19bd95a96992bc855c51a3d5af97\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a0/ef6b3d916a19bd95a96992bc855c51a3d5af97: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a0/92a4567d565c49bc95e901b235fbb79a29f273\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a0/92a4567d565c49bc95e901b235fbb79a29f273: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/ce\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/ce: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/ce/4b3325880240abd5738a15ae1e700818e832de\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/ce/4b3325880240abd5738a15ae1e700818e832de: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2a\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2a: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2a/1647e50e566249b5ca6d91132700f821b50ffa\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2a/1647e50e566249b5ca6d91132700f821b50ffa: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2a/a534138afabeb2cbaa3e1b179e07934be08658\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/2a/a534138afabeb2cbaa3e1b179e07934be08658: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/eb\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/eb: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/eb/5779bdd11f1d10b777d9d9a7825fb2d27e8a05\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/eb/5779bdd11f1d10b777d9d9a7825fb2d27e8a05: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/36\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/36: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/36/29e3052b8e5a951df048c8dfd5fa29c0f17079\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/36/29e3052b8e5a951df048c8dfd5fa29c0f17079: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c0/24e4ab78f41bad1b684e1647cf555ffbf0632d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c0/24e4ab78f41bad1b684e1647cf555ffbf0632d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/07\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/07: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/07/92be822d01814e381671821973faecf18cfc3f\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/07/92be822d01814e381671821973faecf18cfc3f: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/47\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/47: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/47/56039ba6afac53ad61d55693367385c0eb2c95\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/47/56039ba6afac53ad61d55693367385c0eb2c95: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/49\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/49: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/49/8d2d6c8ba0f4217afc0c7a24226e845ff0fc39\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/49/8d2d6c8ba0f4217afc0c7a24226e845ff0fc39: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/49/99b821b4382a9c3397c2fd1466f268ed9c655a\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/49/99b821b4382a9c3397c2fd1466f268ed9c655a: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/4b\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/4b: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/4b/0082ec1ced1495999689f0243c6be7f7e2d297\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/4b/0082ec1ced1495999689f0243c6be7f7e2d297: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/20\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/20: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/20/a19d990fbf275ad0f60ab1b07c45d2a58d8f72\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/20/a19d990fbf275ad0f60ab1b07c45d2a58d8f72: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a7\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a7: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a7/44286e5a7be3979b9a99982a378ead65d07ef9\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a7/44286e5a7be3979b9a99982a378ead65d07ef9: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c6\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c6: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c6/049ad7e043abf833c4f280b0144c3086249266\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c6/049ad7e043abf833c4f280b0144c3086249266: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c3/e86e487c26e16955633222d31603721f44a9b6\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c3/e86e487c26e16955633222d31603721f44a9b6: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/e6\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/e6: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/e6/4c739347a080f4621482e6e61cd93302d1ebaa\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/e6/4c739347a080f4621482e6e61cd93302d1ebaa: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a6\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a6: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a6/db2592783c8cd37f53e46b811dd6aaba74a40d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/a6/db2592783c8cd37f53e46b811dd6aaba74a40d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/32\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/32: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/32/68416e39eff72bbcb1f52a5d4c182fb2aab2f3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/32/68416e39eff72bbcb1f52a5d4c182fb2aab2f3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c4\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c4: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c4/b33f6f7086dad417330eef573d96c08b16fa32\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/c4/b33f6f7086dad417330eef573d96c08b16fa32: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/f0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/f0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/f0/6482219147a7ffc98f61a6b7416f65456e04aa\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/f0/6482219147a7ffc98f61a6b7416f65456e04aa: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/5f\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/5f: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/5f/ee255cff4fea3e86361f25b4de20a016ed0c7c\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/5f/ee255cff4fea3e86361f25b4de20a016ed0c7c: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/df\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/df: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/df/3120102ae87b31e59bd28720b0c5b6f44e999f\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/df/3120102ae87b31e59bd28720b0c5b6f44e999f: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/15\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/15: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/15/6155eb5a967e1d0f188750a70065bbd68aa1b1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/15/6155eb5a967e1d0f188750a70065bbd68aa1b1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/15/5f51d5337c231c03fa04cbec78dc29c392ad0b\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/15/5f51d5337c231c03fa04cbec78dc29c392ad0b: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/fd\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/fd: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/fd/3278891ff6c541e0354406ee763dfa32b08a86\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/fd/3278891ff6c541e0354406ee763dfa32b08a86: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/96\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/96: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/96/92595646b7d0e6ba05aef2fa588bb386d02ce0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/96/92595646b7d0e6ba05aef2fa588bb386d02ce0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/01\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/01: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/01/4764b3eef2dbd986a00eefc1ccd33f6ee7ab4f\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/01/4764b3eef2dbd986a00eefc1ccd33f6ee7ab4f: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/4f\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/4f: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/4f/a7ebeadcd13a3356a270312ef4ac36c279e13c\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/4f/a7ebeadcd13a3356a270312ef4ac36c279e13c: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/25\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/25: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/25/b0e842d0be2a76dc9ac21afaf49db48b95ce95\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/25/b0e842d0be2a76dc9ac21afaf49db48b95ce95: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bc/b006cb5db04d2a2a848d231d9ab56433e89b3f\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/bc/b006cb5db04d2a2a848d231d9ab56433e89b3f: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/44\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/44: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/44/e886f83106e45b1ff9b5fc84a281102d3c8c0c\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/44/e886f83106e45b1ff9b5fc84a281102d3c8c0c: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/77\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/77: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/77/bc92f85237993ffeaf5fa9317e6245ed458567\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/77/bc92f85237993ffeaf5fa9317e6245ed458567: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/10\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/10: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/10/0e0b30b2bb85ac550b0be644c95bcf916463e0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/.build-id/10/0e0b30b2bb85ac550b0be644c95bcf916463e0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstacksdk-0.55.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache-3.5.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch-1.21-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch-2.5.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.context-3.2.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.i18n-5.0.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/context.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/context.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/context.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/context.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/fixture.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/fixture.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/fixture.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/fixture.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/context.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/context.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/fixture.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/fixture.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_context/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/_argparse.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/_argparse.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/app.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/app.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/command.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/command.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/help.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/help.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/hooks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/hooks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/hooks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/hooks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/_argparse.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/_argparse.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/help.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/help.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/lister.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/lister.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/sphinxext.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/sphinxext.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/app.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/app.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/command.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/command.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/command.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/command.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/commandmanager.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/commandmanager.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/columns.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/columns.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/interactive.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/interactive.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/sphinxext.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/sphinxext.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/display.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/display.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/interactive.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/interactive.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/lister.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/lister.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/_argparse.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/_argparse.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/app.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/app.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/commandmanager.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/commandmanager.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/show.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/show.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/show.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/show.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/columns.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/columns.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/complete.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/complete.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/complete.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/complete.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/display.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__pycache__/display.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/columns.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/columns.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/lister.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/lister.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/sphinxext.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/sphinxext.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/commandmanager.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/commandmanager.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/json_format.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/json_format.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/table.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/table.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/commaseparated.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/commaseparated.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/value.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/value.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/yaml_format.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/yaml_format.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/commaseparated.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/commaseparated.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/json_format.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/json_format.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/table.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/table.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/value.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/value.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/yaml_format.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__pycache__/yaml_format.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/json_format.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/json_format.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/table.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/table.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/value.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/value.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/yaml_format.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/yaml_format.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/commaseparated.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/commaseparated.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/formatters/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/help.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/help.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/interactive.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/interactive.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/complete.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/complete.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/display.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/display.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/hooks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/hooks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/show.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff/show.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser-2.20-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types-1.7.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_swiftclient-3.11.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/win32.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/win32.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/winterm.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/winterm.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/ansi.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/ansi.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/initialise.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/initialise.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/win32.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/win32.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/win32.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/win32.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/winterm.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/winterm.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/winterm.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/winterm.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/ansitowin32.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/ansitowin32.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/ansitowin32.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/ansitowin32.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/initialise.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/initialise.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/ansi.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/__pycache__/ansi.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/ansi.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/ansi.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/ansitowin32.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/ansitowin32.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/initialise.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama/initialise.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient-7.0.7-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/exc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/exc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/exc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/exc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/exc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/exc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__/plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__pycache__/plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_driver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_driver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_node.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_node.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_portgroup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_portgroup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_volume_target.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_volume_target.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_node.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_node.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_node.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_node.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_chassis.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_chassis.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_create.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_create.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_driver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_driver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_driver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_driver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_portgroup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_portgroup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_volume_connector.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_volume_connector.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_volume_target.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_volume_target.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_conductor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_conductor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_deploy_template.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_deploy_template.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_volume_target.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_volume_target.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_allocation.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_allocation.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_chassis.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_chassis.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_conductor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_conductor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_portgroup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_portgroup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_allocation.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_allocation.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_create.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_create.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_deploy_template.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_deploy_template.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_volume_connector.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__pycache__/baremetal_volume_connector.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_chassis.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_chassis.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_conductor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_conductor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_deploy_template.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_deploy_template.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_volume_connector.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_volume_connector.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_allocation.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_allocation.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_create.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/v1/baremetal_create.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/chassis.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/chassis.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/volume_target.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/volume_target.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/allocation.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/allocation.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/conductor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/conductor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/create_resources.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/create_resources.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/events.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/events.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/node.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/node.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/volume_connector.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/volume_connector.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/portgroup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/portgroup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/resource_fields.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/resource_fields.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/create_resources.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/create_resources.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/deploy_template.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/deploy_template.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/driver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/driver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/volume_target.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/volume_target.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/allocation.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/allocation.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/chassis.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/chassis.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/driver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/driver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/resource_fields.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/resource_fields.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/volume_connector.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/volume_connector.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/portgroup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/portgroup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/conductor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/conductor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/node.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/node.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/conductor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/conductor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/events.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/events.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/resource_fields.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/resource_fields.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/volume_target.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/volume_target.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/allocation.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/allocation.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/events.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/events.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/node.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/node.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/deploy_template.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/deploy_template.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/portgroup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/portgroup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/volume_connector.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/volume_connector.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/create_resources.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/create_resources.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/chassis.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/chassis.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/deploy_template.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/deploy_template.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/driver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/v1/driver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/filecache.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/filecache.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/http.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/http.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/filecache.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/filecache.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/http.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/http.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/apiclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/filecache.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/filecache.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/http.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/http.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/common/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/test_client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/test_client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/test_import.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/test_import.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_volume_connector.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_volume_connector.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_conductor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_conductor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_create_resources.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_create_resources.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_events.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_events.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_node.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_node.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_portgroup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_portgroup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_allocation.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_allocation.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_volume_target.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_volume_target.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_deploy_template.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_deploy_template.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_driver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_driver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_resource_fields.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_resource_fields.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_portgroup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_portgroup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_volume_target.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_volume_target.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_volume_target.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_volume_target.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_resource_fields.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_resource_fields.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_volume_connector.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_volume_connector.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_volume_connector.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_volume_connector.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_resource_fields.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_resource_fields.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_allocation.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_allocation.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_create_resources.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_create_resources.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_deploy_template.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_deploy_template.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_events.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_events.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_conductor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_conductor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_conductor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_conductor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_node.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_node.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_allocation.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_allocation.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_create_resources.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_create_resources.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_driver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_driver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_deploy_template.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_deploy_template.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_driver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_driver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_events.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_events.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_chassis.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_chassis.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_chassis.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_chassis.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_node.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_node.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_portgroup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/test_portgroup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_chassis.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/v1/test_chassis.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_exc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_exc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_exc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_exc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_import.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_import.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_import.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/test_import.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/test_plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/test_plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/test_plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/test_plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/test_plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/test_plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_create.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_create.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_driver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_driver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_portgroup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_portgroup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_portgroup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_portgroup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_volume_connector.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_volume_connector.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_conductor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_conductor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_conductor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_conductor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_node.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_node.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_volume_target.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_volume_target.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_allocation.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_allocation.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_allocation.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_allocation.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_chassis.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_chassis.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_deploy_template.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_deploy_template.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_deploy_template.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_deploy_template.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_create.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_create.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_driver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_driver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_node.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_node.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_volume_connector.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_volume_connector.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_chassis.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_chassis.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_volume_target.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__pycache__/test_baremetal_volume_target.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_driver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_driver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_conductor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_conductor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_create.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_create.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_allocation.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_allocation.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_deploy_template.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_deploy_template.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_node.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_node.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_portgroup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_portgroup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_volume_connector.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_volume_connector.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_volume_target.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_volume_target.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_chassis.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/osc/v1/test_baremetal_chassis.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/test_exc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/test_exc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_filecache.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_filecache.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_http.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_http.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_filecache.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_filecache.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_http.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_http.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/__pycache__/test_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/test_exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/test_exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/test_exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/test_exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/test_base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/test_base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/test_base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/__pycache__/test_base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/test_base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/test_base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/test_exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/apiclient/test_exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/test_base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/test_base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/test_filecache.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/test_filecache.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/test_http.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/test_http.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/test_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/unit/common/test_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_portgroup_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_portgroup_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_allocation.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_allocation.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_chassis_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_chassis_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_create_negative.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_create_negative.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_power_states.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_power_states.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_port_create.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_port_create.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_driver_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_driver_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_create_negative.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_create_negative.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_fields.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_fields.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_power_states.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_power_states.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_port_create.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_port_create.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_chassis_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_chassis_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_negative.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_negative.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_allocation.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_allocation.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_driver_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_driver_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_fields.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_fields.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_negative.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_negative.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_port_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_port_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_portgroup_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_portgroup_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_chassis_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_chassis_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_deploy_template_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_deploy_template_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_power_states.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_power_states.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_conductor_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_conductor_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_deploy_template_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_deploy_template_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_create_negative.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_create_negative.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_conductor_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_conductor_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_provision_states.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_provision_states.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_provision_states.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_node_provision_states.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_port_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_port_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_port_create.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_port_create.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_portgroup_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_portgroup_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_allocation.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__pycache__/test_baremetal_allocation.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_conductor_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_conductor_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_deploy_template_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_deploy_template_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_negative.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_negative.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_provision_states.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_provision_states.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_port_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_port_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_driver_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_driver_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_fields.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/osc/v1/test_baremetal_node_fields.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ironicclient/tests/functional/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/opts.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/opts.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/opts.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/opts.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/session.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/session.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/conf.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/conf.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/identity.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/identity.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/identity.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/identity.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/session.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/session.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/conf.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/conf.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/adapter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/adapter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/adapter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/adapter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/adapter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/adapter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/conf.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/conf.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/opts.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/opts.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/admin_token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/admin_token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/http_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/http_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/v3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/v3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/v3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/v3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/generic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/generic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/generic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/generic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/__pycache__/v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/generic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/generic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/v3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/identity/v3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/noauth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/noauth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/noauth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/noauth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/admin_token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/admin_token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/admin_token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/admin_token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/http_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/http_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/http_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/http_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/noauth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_plugins/__pycache__/noauth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/identity.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/identity.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/session.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/loading/session.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/noauth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/noauth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/service_token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/service_token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/test_auth_saml2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/test_auth_saml2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/test_auth_adfs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/test_auth_adfs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/test_auth_adfs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/test_auth_adfs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/test_auth_saml2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/test_auth_saml2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/test_auth_saml2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/test_auth_saml2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/examples\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/examples: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/examples/xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/examples/xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/examples/xml/ADFS_RequestSecurityTokenResponse.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/examples/xml/ADFS_RequestSecurityTokenResponse.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/examples/xml/ADFS_fault.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/examples/xml/ADFS_fault.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/templates\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/templates: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/templates/authn_request.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/templates/authn_request.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/templates/saml_assertion.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/templates/saml_assertion.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/templates/soap_response.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/fixtures/templates/soap_response.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/test_auth_adfs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/saml2/test_auth_adfs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/test_v3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/test_v3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_fedkerb_loading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_fedkerb_loading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_kerberos_loading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_kerberos_loading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_v3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_v3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_v3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_v3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_kerberos_loading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_kerberos_loading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_mapped.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_mapped.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_fedkerb_loading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_fedkerb_loading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_mapped.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/test_mapped.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/test_fedkerb_loading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/test_fedkerb_loading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/test_kerberos_loading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/test_kerberos_loading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/test_mapped.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/kerberos/test_mapped.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/test_oauth1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/test_oauth1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/test_oauth1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/test_oauth1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/test_oauth1_loading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/test_oauth1_loading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/test_oauth1_loading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/test_oauth1_loading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/test_oauth1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/test_oauth1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/test_oauth1_loading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/test_oauth1_loading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/extras/oauth1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__/test_exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__/test_exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__/test_exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__/test_exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/test_exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/exceptions/test_exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_betamax_serializer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_betamax_serializer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_matchers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_matchers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_token_endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_token_endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/ksa_betamax_test_cassette.yaml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/ksa_betamax_test_cassette.yaml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/ksa_serializer_data.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/ksa_serializer_data.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/test_pre_record_hook.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/test_pre_record_hook.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/README\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/README: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/keystone_v2_sample_request.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/keystone_v2_sample_request.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/keystone_v2_sample_response.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/keystone_v2_sample_response.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/keystone_v3_sample_request.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/keystone_v3_sample_request.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/keystone_v3_sample_response.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/data/keystone_v3_sample_response.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/k2k_fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/k2k_fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/matchers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/matchers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_betamax_hooks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_betamax_hooks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/client_fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/client_fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/oidc_fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/oidc_fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_discovery.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_discovery.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_service_token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_service_token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_token_endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_token_endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/k2k_fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/k2k_fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_fair_sempahore.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_fair_sempahore.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/k2k_fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/k2k_fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_noauth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_noauth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_session.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_session.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_session.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_session.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/keystoneauth_fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/keystoneauth_fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/keystoneauth_fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/keystoneauth_fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_fixture.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_fixture.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_fixture.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_fixture.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_serializer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_serializer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_http_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_http_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_matchers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_matchers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_hooks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_hooks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_http_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_http_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_matchers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_matchers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_discovery.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_discovery.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_fair_sempahore.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_fair_sempahore.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/client_fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/client_fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/matchers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/matchers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_hooks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_hooks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_serializer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_betamax_serializer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_noauth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_noauth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_service_token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_service_token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/matchers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/matchers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/oidc_fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/oidc_fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_token_endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__pycache__/test_token_endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/keystoneauth_fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/keystoneauth_fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_entry_points.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_entry_points.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_loading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_loading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_v3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_v3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_loading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_loading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_v3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_v3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_adapter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_adapter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_conf.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_conf.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_entry_points.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_entry_points.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_generic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_generic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_adapter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_adapter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_entry_points.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_entry_points.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_session.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_session.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_generic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_generic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_loading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_loading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_session.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_session.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_v3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_v3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_conf.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_conf.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/__pycache__/test_fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_adapter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_adapter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_conf.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_conf.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_generic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_generic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_session.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/test_session.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/loading/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_betamax_fixture.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_betamax_fixture.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_session.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_session.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_fair_sempahore.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_fair_sempahore.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_http_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_http_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/oidc_fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/oidc_fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_discovery.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_discovery.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v3_service_catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v3_service_catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v3_service_catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v3_service_catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v2_service_catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v2_service_catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v2_service_catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v2_service_catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v3_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v3_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v2_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v2_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v3_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v3_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v2_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__pycache__/test_v2_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/test_v2_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/test_v2_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/test_v2_service_catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/test_v2_service_catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/test_v3_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/test_v3_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/test_v3_service_catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/test_v3_service_catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/access/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/client_fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/client_fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_password.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_password.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_tokenless_auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_tokenless_auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_v3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_v3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_v3_federation.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_v3_federation.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_v3_oidc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/test_identity_v3_oidc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3_federation.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3_federation.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_tokenless_auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_tokenless_auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3_oidc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3_oidc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_tokenless_auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_tokenless_auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_password.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_password.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_password.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_password.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3_federation.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3_federation.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3_oidc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v3_oidc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/identity/__pycache__/test_identity_v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_noauth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_noauth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_service_token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/unit/test_service_token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/discover.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/discover.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/password.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/password.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/password.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/password.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/__pycache__/token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/password.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/generic/password.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/application_credential.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/application_credential.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/federation.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/federation.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/oidc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/oidc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/tokenless_auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/tokenless_auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/receipt.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/receipt.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/k2k.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/k2k.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/receipt.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/receipt.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/totp.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/totp.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/federation.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/federation.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/multi_factor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/multi_factor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/totp.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/totp.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/application_credential.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/application_credential.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/receipt.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/receipt.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/tokenless_auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/tokenless_auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/federation.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/federation.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/k2k.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/k2k.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/multi_factor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/multi_factor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/password.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/password.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/tokenless_auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/tokenless_auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/application_credential.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/application_credential.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/oidc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/oidc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/oidc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/oidc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/password.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/__pycache__/password.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/k2k.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/k2k.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/multi_factor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/multi_factor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/password.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/password.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/totp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/identity/v3/totp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/session.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/session.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/_loading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/_loading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/v3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/v3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/v3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/v3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/_loading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/__pycache__/_loading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/_loading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/_loading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/v3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/oauth1/v3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__/_loading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__/_loading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__/_loading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__/_loading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/_loading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/_loading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/adfs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/adfs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/adfs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/adfs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/saml2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/saml2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/saml2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/saml2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/adfs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/adfs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/saml2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/_saml2/v3/saml2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__/_loading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__/_loading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__/_loading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/__pycache__/_loading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/_loading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/extras/kerberos/_loading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/discovery.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/discovery.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/exception.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/exception.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/hooks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/hooks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/keystoneauth_betamax.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/keystoneauth_betamax.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/serializer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/serializer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/keystoneauth_betamax.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/keystoneauth_betamax.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/serializer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/serializer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/v3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/v3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/discovery.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/discovery.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/exception.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/exception.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/hooks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/hooks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/v3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__pycache__/v3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/discovery.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/discovery.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/serializer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/serializer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/exception.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/exception.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/hooks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/hooks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/keystoneauth_betamax.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/keystoneauth_betamax.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/v3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/fixture/v3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/discovery.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/discovery.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/http.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/http.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/connection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/connection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/auth_plugins.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/auth_plugins.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/oidc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/oidc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/response.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/response.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/service_providers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/service_providers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/auth_plugins.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/auth_plugins.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/http.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/http.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/service_providers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/service_providers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/service_providers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/service_providers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/connection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/connection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/discovery.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/discovery.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/http.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/http.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/response.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/response.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/connection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/connection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/discovery.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/discovery.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/oidc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/oidc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/auth_plugins.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/auth_plugins.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/oidc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/oidc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/response.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/exceptions/__pycache__/response.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/token_endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/token_endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/_fair_semaphore.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/_fair_semaphore.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/service_catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/service_catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/service_catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/service_catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/service_providers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/service_providers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/service_providers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/__pycache__/service_providers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/service_catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/service_catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/service_providers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/access/service_providers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__/checks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__/checks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__/checks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/__pycache__/checks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/checks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/hacking/checks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/http_basic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/http_basic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/discover.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/discover.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/http_basic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/http_basic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/noauth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/noauth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/token_endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/token_endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/token_endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/token_endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/adapter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/adapter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/http_basic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/http_basic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/_fair_semaphore.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/_fair_semaphore.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/discover.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/discover.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/adapter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/adapter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/noauth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/noauth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/service_token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/service_token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/service_token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/service_token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/_fair_semaphore.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/_fair_semaphore.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/session.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/session.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/session.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1/__pycache__/session.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_glanceclient-3.3.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_openstackclient-5.5.2.dev16-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp-3.4.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/debug.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/debug.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/rand.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/rand.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/SSL.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/SSL.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/debug.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/debug.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/_util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/_util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/SSL.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/SSL.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/SSL.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/SSL.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/debug.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/debug.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/crypto.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/crypto.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/rand.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/rand.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/rand.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/rand.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/_util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/_util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/crypto.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/__pycache__/crypto.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/_util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/_util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/crypto.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/OpenSSL/crypto.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/crypto.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/crypto.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/crypto.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/crypto.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/api_versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/api_versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/api_versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/api_versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/crypto.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/crypto.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/api_versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/api_versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/availability_zones.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/availability_zones.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/instance_usage_audit_log.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/instance_usage_audit_log.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/agents.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/agents.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/aggregates.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/aggregates.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/keypairs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/keypairs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/quota_classes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/quota_classes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/assisted_volume_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/assisted_volume_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/flavors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/flavors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/servers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/servers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_groups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_groups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/instance_usage_audit_log.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/instance_usage_audit_log.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/keypairs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/keypairs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/images.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/images.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/instance_action.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/instance_action.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/networks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/networks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/aggregates.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/aggregates.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/keypairs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/keypairs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/flavors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/flavors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/usage.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/usage.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/networks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/networks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/flavors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/flavors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_external_events.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_external_events.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/agents.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/agents.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/availability_zones.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/availability_zones.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/availability_zones.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/availability_zones.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/quota_classes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/quota_classes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_groups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_groups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_migrations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_migrations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/usage.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/usage.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/images.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/images.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/hypervisors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/hypervisors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/hypervisors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/hypervisors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_external_events.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_external_events.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/volumes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/volumes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/agents.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/agents.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/assisted_volume_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/assisted_volume_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/volumes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/volumes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/instance_usage_audit_log.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/instance_usage_audit_log.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/assisted_volume_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/assisted_volume_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/flavor_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/flavor_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/migrations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/migrations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/servers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/servers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/aggregates.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/aggregates.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/instance_action.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/instance_action.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/flavor_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/flavor_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/quota_classes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/quota_classes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/migrations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/migrations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_migrations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/server_migrations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/networks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/networks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/server_migrations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/server_migrations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/volumes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/volumes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/hypervisors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/hypervisors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/server_groups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/server_groups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/usage.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/usage.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/flavor_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/flavor_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/images.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/images.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/instance_action.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/instance_action.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/migrations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/migrations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/server_external_events.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/server_external_events.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/servers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/v2/servers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/novaclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/driver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/driver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/setup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/setup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/simple.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/simple.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/load_as_driver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/load_as_driver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/load_as_driver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/load_as_driver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/setup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/setup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/setup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/setup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/simple.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/simple.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/load_as_extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/load_as_extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/load_as_extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/load_as_extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/simple.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/__pycache__/simple.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/load_as_driver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/load_as_driver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/load_as_extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example/load_as_extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/fields.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/fields.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/fields.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/fields.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/setup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/setup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/setup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/setup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/fields.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/fields.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/setup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/example2/setup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/sphinxext.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/sphinxext.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/dispatch.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/dispatch.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/exception.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/exception.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/named.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/named.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/hook.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/hook.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/sphinxext.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/sphinxext.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/_cache.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/_cache.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/driver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/driver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/enabled.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/enabled.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/exception.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/exception.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/named.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/named.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/_cache.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/_cache.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/dispatch.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/dispatch.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/driver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/driver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/enabled.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/enabled.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/hook.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/__pycache__/hook.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/sphinxext.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/sphinxext.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/enabled.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/enabled.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/named.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/named.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_driver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_driver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_enabled.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_enabled.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_named.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_named.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_example_simple.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_example_simple.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_sphinxext.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_sphinxext.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_test_manager.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_test_manager.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_sphinxext.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_sphinxext.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/extension_unimportable.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/extension_unimportable.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/manager.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/manager.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_cache.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_cache.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_example_simple.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_example_simple.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/extension_unimportable.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/extension_unimportable.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_dispatch.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_dispatch.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_dispatch.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_dispatch.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_named.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_named.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_test_manager.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_test_manager.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_callback.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_callback.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_example_fields.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_example_fields.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_hook.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_hook.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_hook.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_hook.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/manager.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/manager.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_cache.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_cache.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_callback.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_callback.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_driver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_driver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_enabled.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_enabled.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_example_fields.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/test_example_fields.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_cache.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_cache.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_dispatch.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_dispatch.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_named.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_named.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_sphinxext.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_sphinxext.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_test_manager.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_test_manager.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_example_simple.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_example_simple.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_hook.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_hook.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/extension_unimportable.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/extension_unimportable.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/manager.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/manager.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_enabled.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_enabled.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_example_fields.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_example_fields.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_callback.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_callback.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_driver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/tests/test_driver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/_cache.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/_cache.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/dispatch.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/dispatch.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/exception.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/exception.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/hook.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore/hook.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/test_iso8601.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/test_iso8601.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/iso8601.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/iso8601.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/iso8601.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/iso8601.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/test_iso8601.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/__pycache__/test_iso8601.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/iso8601.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/iso8601.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/test_iso8601.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601/test_iso8601.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib-2.3.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/generator.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/generator.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_fixture.conf\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_fixture.conf: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_fixture.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_fixture.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_generator.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_generator.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/blaa_opt.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/blaa_opt.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/fbaar_baa_opt.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/fbaar_baa_opt.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/fbar_foo_opt.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/fbar_foo_opt.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/fblaa_opt.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/fblaa_opt.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/bar_foo_opt.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/bar_foo_opt.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fbar_foo_opt.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fbar_foo_opt.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fbar_foo_opt.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fbar_foo_opt.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fblaa_opt.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fblaa_opt.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/baz_qux_opt.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/baz_qux_opt.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/blaa_opt.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/blaa_opt.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fbaar_baa_opt.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fbaar_baa_opt.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fbaar_baa_opt.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fbaar_baa_opt.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/blaa_opt.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/blaa_opt.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/bar_foo_opt.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/bar_foo_opt.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/baz_qux_opt.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/baz_qux_opt.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fblaa_opt.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/__pycache__/fblaa_opt.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/bar_foo_opt.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/bar_foo_opt.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/baz_qux_opt.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/testmods/baz_qux_opt.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_validator.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_validator.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_cfg.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_cfg.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_generator.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_generator.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_get_location.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_get_location.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_iniparser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_iniparser.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_iniparser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_iniparser.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_sources.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_sources.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_cfg.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_cfg.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_validator.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_validator.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_fixture.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_fixture.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_fixture.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_fixture.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_validator.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_validator.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_generator.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_generator.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_sources.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_sources.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_get_location.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/__pycache__/test_get_location.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_cfg.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_cfg.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_get_location.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_get_location.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_iniparser.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_iniparser.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_sources.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_sources.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/tests/test_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sphinxconfiggen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sphinxconfiggen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sphinxext.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sphinxext.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/validator.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/validator.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/_list_opts.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/_list_opts.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/cfg.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/cfg.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/fixture.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/fixture.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/_environment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/_environment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/_uri.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/_uri.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/_uri.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/_uri.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/_environment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/__pycache__/_environment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/_environment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/_environment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/_uri.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/sources/_uri.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/_list_opts.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/_list_opts.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/fixture.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/fixture.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/generator.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/generator.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/sphinxconfiggen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/sphinxconfiggen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/cfg.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/cfg.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/fixture.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/fixture.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/iniparser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/iniparser.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/iniparser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/iniparser.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/sphinxconfiggen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/sphinxconfiggen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/_list_opts.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/_list_opts.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/cfg.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/cfg.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/generator.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/generator.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/sphinxext.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/sphinxext.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/validator.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/validator.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/sphinxext.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/sphinxext.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/validator.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/validator.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/iniparser.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_config/iniparser.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/dates.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/dates.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/dates.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/dates.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/languages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/languages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/numbers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/numbers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/units.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/units.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/_compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/_compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/_compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/_compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/dates.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/dates.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/languages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/languages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/lists.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/lists.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/localedata.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/localedata.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/plural.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/plural.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/numbers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/numbers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/support.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/support.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/localedata.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/localedata.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/core.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/core.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/core.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/core.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/lists.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/lists.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/plural.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/plural.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/support.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/support.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/units.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/units.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/_compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/_compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_VC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_VC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lg.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lg.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_MO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_MO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sv_AX.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sv_AX.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sq_MK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sq_MK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cs.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cs.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dsb_DE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dsb_DE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gu.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gu.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ko_KR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ko_KR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lg_UG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lg_UG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kab.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kab.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/root.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/root.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_CH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_CH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MP.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MP.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ja.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ja.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bem.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bem.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tk_TM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tk_TM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ne.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ne.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/prg_001.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/prg_001.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/wo_SN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/wo_SN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ckb_IQ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ckb_IQ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GP.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GP.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/km_KH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/km_KH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/my.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/my.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_PF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_PF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ki.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ki.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mgh_MZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mgh_MZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ia.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ia.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd_Arab_PK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd_Arab_PK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uk.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uk.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_QA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_QA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_ES.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_ES.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/chr.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/chr.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/et_EE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/et_EE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_GM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_GM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_ML.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_ML.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kde_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kde_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_SX.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_SX.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rof.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rof.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/as_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/as_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_GT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_GT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_NG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_NG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gsw_CH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gsw_CH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kab_DZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kab_DZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_TD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_TD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gl_ES.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gl_ES.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lu_CD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lu_CD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_001.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_001.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_MX.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_MX.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_MR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_MR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ses.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ses.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tk.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tk.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ur_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ur_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn_BA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn_BA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uk_UA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uk_UA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ur.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ur.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/asa.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/asa.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_NE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_NE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksh_DE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksh_DE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ne_NP.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ne_NP.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/th.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/th.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs_Cyrl_BA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs_Cyrl_BA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_IT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_IT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ebu.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ebu.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_VI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_VI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/khq_ML.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/khq_ML.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/is_IS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/is_IS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jmc.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jmc.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/se_NO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/se_NO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dua_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dua_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_DM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_DM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GQ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GQ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn_RS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn_RS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vo_001.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vo_001.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mt_MT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mt_MT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta_SG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta_SG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_ER.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_ER.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it_SM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it_SM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln_AO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln_AO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mni_Beng.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mni_Beng.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_GH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_GH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hr.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hr.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_RU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_RU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans_SG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans_SG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_DZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_DZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/teo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/teo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ak_GH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ak_GH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so_DJ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so_DJ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tg.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tg.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cu_RU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cu_RU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_AT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_AT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mt.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mt.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mas.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mas.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nn_NO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nn_NO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd_Deva.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd_Deva.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ga_IE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ga_IE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kam_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kam_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ko_KP.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ko_KP.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_KM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_KM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_RW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_RW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_GH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_GH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_NC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_NC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mk.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mk.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mn_MN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mn_MN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kw.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kw.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mr.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mr.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mr_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mr_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_PS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_PS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ga_GB.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ga_GB.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kln.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kln.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_CH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_CH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rw.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rw.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/si.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/si.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/su.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/su.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_RE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_RE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gu_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gu_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ka_GE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ka_GE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa_Guru_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa_Guru_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_DJ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_DJ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gsw_LI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gsw_LI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sv_FI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sv_FI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_TN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_TN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dav.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dav.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_FK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_FK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mer.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mer.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tzm.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tzm.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nyn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nyn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/smn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/smn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/as.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/as.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_DJ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_DJ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/haw.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/haw.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ko.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ko.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ml_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ml_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_VE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_VE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sq_AL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sq_AL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ml.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ml.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/se_SE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/se_SE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw_UG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw_UG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/be_BY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/be_BY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bg.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bg.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ce_RU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ce_RU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kk_KZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kk_KZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksb_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksb_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pcm_NG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pcm_NG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn_ME.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn_ME.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/brx_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/brx_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ccp_BD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ccp_BD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ee_TG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ee_TG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_BF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_BF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/id_ID.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/id_ID.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_BE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_BE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/os_GE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/os_GE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sah_RU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sah_RU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_BH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_BH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_LU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_LU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_VG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_VG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mai_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mai_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_PT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_PT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ha.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ha.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/id.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/id.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pl_PL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pl_PL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ka.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ka.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln_CD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln_CD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lrc_IR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lrc_IR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dje.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dje.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/eo_001.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/eo_001.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gv.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gv.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it_CH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it_CH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mzn_IR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mzn_IR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans_CN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans_CN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_IT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_IT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ig.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ig.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lb.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lb.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nds_DE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nds_DE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_TD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_TD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_SY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_SY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln_CG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln_CG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mg.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mg.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_BQ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_BQ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az_Cyrl_AZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az_Cyrl_AZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kk.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kk.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sbp.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sbp.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hi.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hi.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mni.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mni.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nb_NO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nb_NO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BB.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BB.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_KY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_KY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_ZW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_ZW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gd_GB.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gd_GB.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/teo_UG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/teo_UG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ti_ET.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ti_ET.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fil.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fil.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ig_NG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ig_NG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kn_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kn_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/da_GL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/da_GL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dsb.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dsb.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SB.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SB.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_UG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_UG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_US.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_US.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue_Hant.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue_Hant.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ks_Arab_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ks_Arab_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans_HK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans_HK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vi.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vi.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_BZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_BZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jmc_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jmc_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nnh.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nnh.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/te_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/te_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bo_CN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bo_CN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_DE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_DE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_ER.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_ER.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nnh_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nnh_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ja_JP.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ja_JP.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lkt.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lkt.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lt.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lt.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_EH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_EH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/asa_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/asa_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_HK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_HK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_IC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_IC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lu.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lu.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nmg.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nmg.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/xog_UG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/xog_UG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue_Hans_CN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue_Hans_CN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mer_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mer_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mgo_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mgo_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_SV.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_SV.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/he_IL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/he_IL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kl_GL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kl_GL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rn_BI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rn_BI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/xog.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/xog.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_CV.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_CV.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/th_TH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/th_TH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Cyrl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Cyrl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gd.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gd.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hsb.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hsb.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mfe.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mfe.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/naq.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/naq.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nd.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nd.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl_XK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl_XK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_GM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_GM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hi_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hi_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/om.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/om.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sv_SE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sv_SE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yi.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yi.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fa.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fa.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hsb_DE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hsb_DE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mua.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mua.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/su_Latn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/su_Latn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sv.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sv.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_AO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_AO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jv.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jv.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cgg_UG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cgg_UG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_DK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_DK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksf_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksf_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/smn_FI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/smn_FI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ro.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ro.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/seh.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/seh.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa_Arab.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa_Arab.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_LB.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_LB.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bn_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bn_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_LC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_LC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ku.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ku.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lag_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lag_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kw_GB.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kw_GB.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ce.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ce.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GB.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GB.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fo_DK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fo_DK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hr_HR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hr_HR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kok.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kok.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_FM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_FM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kam.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kam.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_MZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_MZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_LU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_LU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lrc.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lrc.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mai.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mai.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cy.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cy.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_FI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_FI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TV.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TV.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fo_FO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fo_FO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ro_MD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ro_MD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi_Latn_MA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi_Latn_MA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sah.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sah.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sat_Olck.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sat_Olck.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hant.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hant.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/br.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/br.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fa_IR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fa_IR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hr_BA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hr_BA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mfe_MU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mfe_MU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/om_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/om_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/xh.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/xh.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_419.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_419.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_KM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_KM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/saq.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/saq.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi_Tfng.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi_Tfng.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/si_LK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/si_LK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ckb_IR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ckb_IR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksb.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksb.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_SR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_SR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ps_PK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ps_PK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_KZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_KZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_FJ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_FJ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sbp_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sbp_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/wo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/wo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_KI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_KI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_SL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_SL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hy.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hy.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ii.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ii.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_MD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_MD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so_SO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so_SO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bas_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bas_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bm.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bm.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_LR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_LR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_RW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_RW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ku_TR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ku_TR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tt.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tt.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue_Hans.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue_Hans.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cs_CZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cs_CZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mas_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mas_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nds.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nds.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/os.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/os.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn_XK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Latn_XK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/et.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/et.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ha_NE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ha_NE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lkt_US.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lkt_US.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ak.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ak.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/be.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/be.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/el.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/el.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_AR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_AR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_BO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_BO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yo_NG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yo_NG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nb_SJ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nb_SJ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_TL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_TL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so_ET.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so_ET.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sq_XK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sq_XK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tt_RU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tt_RU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_WS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_WS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_CR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_CR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lb_LU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lb_LU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/or.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/or.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bo_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bo_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ewo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ewo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_GN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_GN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_NE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_NE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hu_HU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hu_HU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fy_NL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fy_NL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_KG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_KG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ti.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ti.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_LS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_LS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_EC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_EC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_GW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_GW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_MR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_MR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Latn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Latn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue_Hant_HK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yue_Hant_HK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ks_Arab.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ks_Arab.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/qu_PE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/qu_PE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi_Tfng_MA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi_Tfng_MA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ps.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ps.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_UM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_UM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_ZA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_ZA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mgh.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mgh.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gsw.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gsw.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rwk.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rwk.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/is.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/is.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln_CF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln_CF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/eo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/eo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mi_NZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mi_NZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Arab_AF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Arab_AF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/af.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/af.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bez_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bez_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/guz_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/guz_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sg_CF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sg_CF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/qu_EC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/qu_EC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rof_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rof_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/teo_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/teo_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_KW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_KW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/luy.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/luy.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_YE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_YE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/el_CY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/el_CY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gv_IM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gv_IM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lo_LA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lo_LA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_BY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_BY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_FR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_FR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fur_IT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fur_IT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ks.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ks.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_001.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_001.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_JO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_JO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_OM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_OM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lv.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lv.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta_LK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta_LK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zu.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zu.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_GQ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_GQ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_GW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_GW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/qu.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/qu.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az_Cyrl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az_Cyrl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_FR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_FR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hy_AM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hy_AM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/luo_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/luo_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tzm_MA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tzm_MA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans_MO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans_MO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lag.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lag.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta_MY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ta_MY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ckb.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ckb.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cu.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cu.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kkj.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kkj.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd_Arab.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd_Arab.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw_CD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sw_CD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/br_FR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/br_FR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ee.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ee.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ps_AF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ps_AF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/wae_CH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/wae_CH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dz.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dz.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/eu.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/eu.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/luo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/luo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms_ID.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms_ID.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_ES_VALENCIA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_ES_VALENCIA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_GN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_GN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_SN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_SN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/guz.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/guz.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/he.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/he.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_DO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_DO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kea.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kea.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa_Arab_PK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa_Arab_PK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ro_RO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ro_RO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mgo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mgo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mni_Beng_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mni_Beng_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mas_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mas_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/xh_ZA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/xh_ZA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zgh.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zgh.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_AE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_AE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_JM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_JM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it_VA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it_VA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/khq.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/khq.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi_Latn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/shi_Latn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tr.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tr.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_MR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_MR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_SL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_SL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_GA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mzn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mzn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sg.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sg.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dyo_SN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dyo_SN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ky_KG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ky_KG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/prg.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/prg.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_ZM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_ZM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_NG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_NG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fi_FI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fi_FI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_WF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_WF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kkj_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kkj_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_GW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_GW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_LI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_LI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_US_POSIX.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_US_POSIX.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ug_CN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ug_CN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/da_DK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/da_DK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_150.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_150.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_VU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_VU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mk_MK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mk_MK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_NL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_NL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sk.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sk.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/am_ET.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/am_ET.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_SN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_SN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_PM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_PM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_SN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_SN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_AW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_AW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_JE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_JE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_ES.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_ES.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hu.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/hu.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jgo_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jgo_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rwk_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rwk_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/twq.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/twq.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/agq.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/agq.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms_MY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms_MY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tg_TJ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tg_TJ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/chr_US.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/chr_US.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_KN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_KN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lv_LV.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lv_LV.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_DZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_DZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ha_NG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ha_NG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/saq_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/saq_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ast.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ast.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_CL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_CL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ha_GH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ha_GH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yi_001.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yi_001.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hant_TW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hant_TW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_AD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ca_AD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fa_AF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fa_AF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_TG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_TG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ky.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ky.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vun_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vun_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tr_TR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tr_TR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yav.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yav.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hant_MO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hant_MO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ast_ES.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ast_ES.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ee_GH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ee_GH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_LR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_LR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nus_SS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nus_SS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/seh_MZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/seh_MZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ebu_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ebu_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CX.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CX.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_BF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Latn_BF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sat.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sat.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai_Vaii.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai_Vaii.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vi_VN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vi_VN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BJ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BJ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nus.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nus.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai_Latn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai_Latn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/or_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/or_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/se_FI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/se_FI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ccp_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ccp_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dyo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dyo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_NZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_GQ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_GQ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sk_SK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sk_SK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/os_RU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/os_RU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa_Guru.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pa_Guru.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_DE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_DE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fi.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fi.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_NE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_NE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ii_CN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ii_CN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kok_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kok_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/naq_NA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/naq_NA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/af_NA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/af_NA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ceb.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ceb.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_NI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_NI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_VU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_VU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gsw_FR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/gsw_FR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_BR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_BR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ga.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ga.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rm_CH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rm_CH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/am.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/am.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_EG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_EG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ccp.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ccp.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dje_NE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dje_NE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_CU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_CU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_EA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_EA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kln_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kln_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs_Cyrl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs_Cyrl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SX.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SX.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nds_NL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nds_NL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sat_Olck_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sat_Olck_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zu_ZA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zu_ZA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sq.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sq.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/to_TO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/to_TO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/af_ZA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/af_ZA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bas.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bas.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ceb_PH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ceb_PH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_GI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_BF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fur.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fur.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/haw_US.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/haw_US.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms_BN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms_BN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_IL.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_IL.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dua.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dua.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_SD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_HT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_HT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_UA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ru_UA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mi.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mi.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_DG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_DG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_YT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_YT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kde.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kde.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kea_CV.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/kea_CV.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lt_LT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lt_LT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bm_ML.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bm_ML.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ses_ML.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ses_ML.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yav_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yav_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tr_CY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/tr_CY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_BR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_BR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cgg.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cgg.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dav_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dav_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ia_001.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ia_001.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ki_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ki_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nmg_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nmg_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_MA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_MA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_BM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/te.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/te.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai_Latn_LR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai_Latn_LR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/wae.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/wae.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/km.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/km.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lrc_IQ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/lrc_IQ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/agq_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/agq_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs_Latn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs_Latn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/da.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/da.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/eu_ES.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/eu_ES.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bem_ZM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bem_ZM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_TZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_US.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_US.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai_Vaii_LR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vai_Vaii_LR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms_SG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ms_SG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_CW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nl_CW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/qu_BO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/qu_BO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_IM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_PK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_HN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_HN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jgo.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jgo.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rw_RW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rw_RW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd_Deva_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sd_Deva_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vun.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/vun.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_MH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/luy_KE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/luy_KE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sl_SI.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sl_SI.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fil_PH.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fil_PH.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MF.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MF.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jv_ID.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/jv_ID.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mg_MG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mg_MG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pcm.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pcm.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bn_BD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bn_BD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cy_GB.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/cy_GB.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rm.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rm.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/to.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/to.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nyn_UG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nyn_UG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/twq_NE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/twq_NE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az_Latn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az_Latn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az_Latn_AZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/az_Latn_AZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_CO.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_CO.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_LR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ff_Adlm_LR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_SC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_SC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ug.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ug.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Cyrl_UZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Cyrl_UZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksh.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksh.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_ST.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_ST.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rn.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/rn.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/se.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/se.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nb.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nb.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/om_ET.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/om_ET.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl_RS.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl_RS.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bez.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bez.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_PE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_TN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_TN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it_IT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/it_IT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hant_HK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hant_HK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_LU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/pt_LU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl_ME.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl_ME.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yo_BJ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/yo_BJ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_UY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/es_UY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sl.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sl.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zgh_MA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zgh_MA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ti_ER.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ti_ER.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CC.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_CC.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fy.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fy.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Arab.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Arab.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_BE.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/de_BE.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AU.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/en_AU.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CD.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_CD.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ln.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nd_ZW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/nd_ZW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sn_ZW.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sn_ZW.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/so.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl_BA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/sr_Cyrl_BA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_SY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/brx.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/brx.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/el_GR.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/el_GR.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ewo_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ewo_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mua_CM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/mua_CM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/su_Latn_ID.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/su_Latn_ID.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Latn_UZ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/uz_Latn_UZ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs_Latn_BA.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bs_Latn_BA.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dz_BT.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/dz_BT.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/my_MM.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/my_MM.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ne_IN.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ne_IN.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/zh_Hans.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_IQ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_IQ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_LY.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ar_LY.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksf.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ksf.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bg_BG.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/bg_BG.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MQ.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/fr_MQ.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ur_PK.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/locale-data/ur_PK.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/plural.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/plural.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/_win32.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/_win32.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/_unix.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/_unix.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/_unix.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/_unix.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/_win32.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/__pycache__/_win32.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/_unix.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/_unix.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/_win32.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localtime/_win32.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/numbers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/numbers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/support.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/support.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/units.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/units.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/lists.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/lists.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localedata.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/localedata.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/extract.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/extract.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/plurals.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/plurals.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/plurals.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/plurals.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/pofile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/pofile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/checkers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/checkers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/frontend.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/frontend.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/mofile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/mofile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/mofile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/mofile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/frontend.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/frontend.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/jslexer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/jslexer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/pofile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/pofile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/checkers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/checkers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/extract.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/extract.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/jslexer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__pycache__/jslexer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/checkers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/checkers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/extract.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/extract.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/plurals.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/plurals.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/frontend.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/frontend.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/jslexer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/jslexer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/mofile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/mofile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/pofile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/messages/pofile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/core.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/core.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/global.dat\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/global.dat: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/languages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/babel/languages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_locale.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_locale.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_translate.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_translate.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/log.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/log.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_lazy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_lazy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_message.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_message.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/log.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/log.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_factory.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_factory.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_factory.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_factory.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_gettextutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_gettextutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_lazy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_lazy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_locale.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_locale.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/fixture.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/fixture.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/fixture.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/fixture.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/log.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/log.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_gettextutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_gettextutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_translate.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_translate.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_translate.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_translate.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_locale.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_locale.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_message.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__pycache__/_message.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_lazy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_lazy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_gettextutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_gettextutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_message.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_message.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/fixture.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/fixture.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_gettextutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_gettextutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_factory.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_factory.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_handler.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_handler.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_lazy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_lazy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_message.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_message.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_handler.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_handler.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_locale_dir_variable.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_locale_dir_variable.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_locale_dir_variable.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_locale_dir_variable.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_translate.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_translate.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_fixture.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_fixture.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_fixture.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_fixture.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_gettextutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_gettextutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_gettextutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_gettextutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_handler.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_handler.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_lazy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_lazy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_lazy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_lazy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_message.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_message.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_public_api.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_public_api.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_public_api.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_public_api.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_logging.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_logging.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_factory.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_factory.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_logging.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_logging.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_translate.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_translate.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_factory.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/__pycache__/test_factory.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_logging.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_logging.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_message.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_message.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_fixture.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_fixture.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_locale_dir_variable.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_locale_dir_variable.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_public_api.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_public_api.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_translate.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/tests/test_translate.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_factory.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_i18n/_factory.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/renames.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/renames.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/updating.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/updating.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/moves.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/moves.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/moves.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/moves.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/renames.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/renames.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/updating.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/updating.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/removals.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/removals.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/removals.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/removals.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/renames.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/renames.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/updating.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/updating.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/disable.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/disable.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__/disable.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__/disable.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__/disable.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/fixtures/__pycache__/disable.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/moves.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/moves.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/removals.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector/removals.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/authv1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/authv1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/command_helpers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/command_helpers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/command_helpers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/command_helpers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/multithreading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/multithreading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/command_helpers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/command_helpers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/authv1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/authv1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/authv1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/authv1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/multithreading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/multithreading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/multithreading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/swiftclient/multithreading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.config-8.5.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/INSTALLER\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/INSTALLER: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/LICENSE.APACHE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/LICENSE.APACHE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/LICENSE.BSD\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/LICENSE.BSD: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/METADATA\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/METADATA: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/WHEEL\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/WHEEL: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging-20.9.dist-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_generator.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_generator.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_lexer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_lexer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_parser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_parser.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/yacctab.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/yacctab.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_ast.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_ast.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/_ast_gen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/_ast_gen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/_ast_gen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/_ast_gen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/_build_tables.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/_build_tables.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/ast_transforms.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/ast_transforms.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_ast.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_ast.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_generator.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_generator.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/plyparser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/plyparser.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/plyparser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/plyparser.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/lextab.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/lextab.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_lexer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_lexer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_parser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/c_parser.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/lextab.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/lextab.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/yacctab.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/yacctab.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/ast_transforms.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/ast_transforms.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/_build_tables.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__pycache__/_build_tables.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/_ast_gen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/_ast_gen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/_build_tables.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/_build_tables.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/_c_ast.cfg\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/_c_ast.cfg: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/ast_transforms.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/ast_transforms.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/c_ast.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/c_ast.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/c_generator.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/c_generator.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/lextab.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/lextab.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/plyparser.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/plyparser.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/yacctab.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/yacctab.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/c_lexer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/c_lexer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/c_parser.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pycparser/c_parser.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/stevedore-3.3.3-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/jsonpointer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/jsonpointer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/pyparsing.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/pyparsing.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/monotonic.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/monotonic.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/prettytable.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/prettytable.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/appdirs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/appdirs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/jsonpatch.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/jsonpatch.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/pyparsing.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/pyparsing.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/zipp.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/zipp.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/appdirs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/appdirs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/jsonpatch.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/jsonpatch.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/jsonpointer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/jsonpointer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/zipp.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/zipp.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/monotonic.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/monotonic.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/prettytable.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/__pycache__/prettytable.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/core.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/core.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/model.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/model.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/model.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/model.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/core.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/__pycache__/core.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/core.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/core.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/model.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock/model.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata-1.7.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_barbicanclient-5.3.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_manilaclient-2.6.4-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/warlock-1.3.3-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema-3.2.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__/subnet_splitter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__/subnet_splitter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__/subnet_splitter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/__pycache__/subnet_splitter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/subnet_splitter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/contrib/subnet_splitter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/core.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/core.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/fbsocket.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/fbsocket.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/rfc1924.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/rfc1924.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/sets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/sets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/glob.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/glob.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/iana.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/iana.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/nmap.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/nmap.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/rfc1924.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/rfc1924.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/sets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/sets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/glob.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/glob.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/iana.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/iana.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/nmap.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__pycache__/nmap.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/iana.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/iana.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/ipv6-unicast-address-assignments.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/ipv6-unicast-address-assignments.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/multicast-addresses.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/multicast-addresses.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/nmap.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/nmap.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/glob.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/glob.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/ipv4-address-space.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/ipv4-address-space.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/ipv6-address-space.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/ipv6-address-space.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/rfc1924.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/rfc1924.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/sets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/ip/sets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/eui48.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/eui48.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/eui64.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/eui64.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/ipv6.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/ipv6.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/ipv4.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/ipv4.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/ipv6.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/ipv6.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/eui48.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/eui48.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/eui64.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/eui64.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/ipv4.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__pycache__/ipv4.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/eui48.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/eui48.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/eui64.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/eui64.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/ipv4.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/ipv4.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/ipv6.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/ipv6.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/strategy/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/core.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/core.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/fbsocket.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/fbsocket.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/fbsocket.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/fbsocket.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/core.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/__pycache__/core.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__/ieee.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__/ieee.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__/ieee.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/__pycache__/ieee.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/iab.idx\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/iab.idx: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/iab.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/iab.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/ieee.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/ieee.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/oui.idx\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/oui.idx: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/oui.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr/eui/oui.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_ironicclient-4.6.4-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_import_vendors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_import_vendors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_init.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_init.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_import_vendors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_import_vendors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_cloud_config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_cloud_config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_cloud_config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_cloud_config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_import_vendors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_import_vendors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_init.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_init.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_init.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_init.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_environ.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_environ.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_environ.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/__pycache__/test_environ.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_cloud_config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_cloud_config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_environ.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/tests/test_environ.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/constructors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/constructors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/cloud_config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/cloud_config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/defaults.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/defaults.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/cloud_config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/cloud_config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/constructors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/constructors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/defaults.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__pycache__/defaults.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/cloud_config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/cloud_config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/defaults.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/defaults.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/vendors/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/constructors.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/constructors.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/constructors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/constructors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/defaults.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config/defaults.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_novaclient-17.4.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/asyncio.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/asyncio.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/blocking.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/blocking.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__/test_asyncio.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__/test_asyncio.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__/test_asyncio.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/__pycache__/test_asyncio.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/test_asyncio.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tests/test_asyncio.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tornado.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/tornado.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/blocking.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/blocking.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/blocking.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/blocking.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/tornado.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/tornado.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/tornado.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/tornado.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/asyncio.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/asyncio.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/asyncio.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/integrate/__pycache__/asyncio.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/wrappers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/wrappers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/bus.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/bus.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/bus_messages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/bus_messages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tornado.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tornado.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/trio.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/trio.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/threading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/threading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/asyncio.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/asyncio.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/asyncio.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/asyncio.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/trio.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/trio.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/blocking.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/blocking.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/threading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/threading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/tornado.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/tornado.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/trio.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/trio.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/blocking.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/blocking.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/threading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/threading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/tornado.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__pycache__/tornado.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/asyncio.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/asyncio.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/blocking.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/blocking.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_trio.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_trio.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_asyncio.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_asyncio.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_blocking.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_blocking.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_blocking.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_blocking.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_threading.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_threading.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_threading.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_threading.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_trio.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_trio.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_asyncio.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__pycache__/test_asyncio.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/test_asyncio.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/test_asyncio.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/test_blocking.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/test_blocking.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/test_threading.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/test_threading.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/test_trio.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/test_trio.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/io/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/low_level.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/low_level.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/routing.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/routing.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/wrappers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/wrappers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bindgen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bindgen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bindgen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bindgen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bus.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bus.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bus_messages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bus_messages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/low_level.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/low_level.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/routing.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/routing.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bus.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bus.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bus_messages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/bus_messages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/low_level.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/low_level.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/wrappers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/__pycache__/wrappers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/bindgen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/bindgen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/routing.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/routing.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bus.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bus.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bus_messages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bus_messages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_low_level.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_low_level.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bindgen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bindgen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bus_messages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bus_messages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_low_level.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_low_level.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bindgen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bindgen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bus.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_bus.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_routing.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_routing.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_routing.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__pycache__/test_routing.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/secrets_introspect.xml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/secrets_introspect.xml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_bindgen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_bindgen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_low_level.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_low_level.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_bus.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_bus.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_bus_messages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_bus_messages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_routing.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney/tests/test_routing.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/functions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/functions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/ast.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/ast.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/ast.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/ast.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/visitor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/visitor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/functions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/functions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/functions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/functions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/lexer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/lexer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/lexer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/lexer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/parser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/parser.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/parser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/parser.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/visitor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/visitor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/ast.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/ast.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/lexer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/lexer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/parser.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/parser.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/visitor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/visitor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneauth1-4.4.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/service_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/service_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/exc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/exc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/exc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/exc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/service_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/__pycache__/service_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/service-types.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/service-types.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/data/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/exc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/exc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/service_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/service_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_match.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_match.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_remote.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_remote.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_singleton.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_singleton.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_builtin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_builtin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_match.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_match.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_misc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_misc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_singleton.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_singleton.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_misc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_misc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_warn.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_warn.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_warn.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_warn.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_builtin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_builtin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_data.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_data.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_data.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_data.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_match.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_match.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_remote.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_remote.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_remote.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/__pycache__/test_remote.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_builtin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_builtin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_data.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_data.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_misc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_misc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_singleton.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_singleton.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_warn.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_service_types/tests/test_warn.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/cpp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/cpp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/ctokens.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/ctokens.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/lex.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/lex.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/yacc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/yacc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/ygen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/ygen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/yacc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/yacc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/ygen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/ygen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/cpp.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/cpp.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/ctokens.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/ctokens.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/lex.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/lex.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/yacc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/yacc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/ygen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/ygen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/cpp.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/cpp.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/ctokens.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/ctokens.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/lex.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply/__pycache__/lex.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/dhcrypto.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/dhcrypto.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/dhcrypto.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/dhcrypto.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/item.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/item.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/collection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/collection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/item.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/item.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/collection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/collection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/defines.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/defines.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/defines.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/defines.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/collection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/collection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/defines.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/defines.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/dhcrypto.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/dhcrypto.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/item.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/item.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/py.typed\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/secretstorage/py.typed: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_formatter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_formatter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/keystone_client_fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/keystone_client_fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/keystone_client_fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/keystone_client_fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_barbican.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_barbican.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_barbican.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_barbican.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_formatter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__pycache__/test_formatter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/keystone_client_fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/keystone_client_fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/test_barbican.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/test_barbican.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/test_base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/test_base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/test_client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/test_client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/test_formatter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/test_formatter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_cas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_cas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_containers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_containers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_orders.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_orders.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_secrets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_secrets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_acls.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_acls.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_cas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_cas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_orders.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_orders.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_secrets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_secrets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_acls.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_acls.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_cas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_cas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_containers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_containers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_containers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_containers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_orders.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_orders.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_secrets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/__pycache__/test_secrets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_acls.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/v1/test_acls.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/containers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/containers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/orders.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/orders.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/secrets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/secrets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/orders.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/orders.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/cas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/cas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/containers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/containers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/cas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/cas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/acls.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/acls.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/acls.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/acls.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/secrets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/secrets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/orders.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/orders.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/containers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/containers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/secrets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/secrets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/acls.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/acls.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/cas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/v1/cas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/osc_plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/osc_plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/osc_plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/osc_plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/barbican.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/barbican.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/formatter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/formatter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/barbican.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/barbican.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/formatter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/formatter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/formatter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/formatter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/osc_plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/osc_plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/containers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/containers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/orders.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/orders.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/orders.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/orders.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/secrets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/secrets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/acls.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/acls.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/cas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/cas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/secrets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/secrets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/acls.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/acls.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/cas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/cas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/containers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__pycache__/containers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/acls.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/acls.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/cas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/cas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/containers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/containers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/orders.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/orders.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/secrets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/secrets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/barbicanclient/barbican_cli/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/__main__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/__main__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/credentials.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/credentials.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/devpi_client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/devpi_client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/http.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/http.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/errors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/errors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/http.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/http.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/__main__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/__main__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/backend.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/backend.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/backend.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/backend.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/core.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/core.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/core.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/core.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/credentials.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/credentials.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/devpi_client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/devpi_client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/errors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__pycache__/errors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backend.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backend.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/OS_X.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/OS_X.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/_OS_X_API.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/_OS_X_API.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/_OS_X_API.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/_OS_X_API.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/chainer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/chainer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/chainer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/chainer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/fail.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/fail.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/fail.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/fail.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/OS_X.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/OS_X.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/SecretService.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/SecretService.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/SecretService.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/SecretService.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/_OS_X_API.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/_OS_X_API.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/kwallet.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/kwallet.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/null.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/null.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/null.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/null.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/OS_X.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/OS_X.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/Windows.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/Windows.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/Windows.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/Windows.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/kwallet.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__pycache__/kwallet.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/chainer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/chainer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/kwallet.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/kwallet.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/SecretService.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/SecretService.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/Windows.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/Windows.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/fail.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/fail.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/null.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/backends/null.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/core.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/core.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/credentials.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/credentials.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/devpi_client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/devpi_client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/errors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/errors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/http.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/http.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/backend.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/backend.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/backend.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/backend.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/__pycache__/util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/backend.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/backend.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/testing/util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/properties.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/properties.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/properties.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/properties.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/platform_.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/platform_.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/platform_.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/__pycache__/platform_.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/platform_.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/platform_.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/properties.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/util/properties.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__main__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring/__main__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/lexer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/lexer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/autohandler.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/autohandler.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/preprocessors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/preprocessors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/pygmentplugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/pygmentplugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/turbogears.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/turbogears.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/autohandler.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/autohandler.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/babelplugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/babelplugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/babelplugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/babelplugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/linguaplugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/linguaplugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/autohandler.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/autohandler.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/extract.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/extract.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/linguaplugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/linguaplugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/preprocessors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/preprocessors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/turbogears.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/turbogears.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/pygmentplugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/pygmentplugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/pygmentplugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/pygmentplugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/turbogears.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/turbogears.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/beaker_cache.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/beaker_cache.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/beaker_cache.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/beaker_cache.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/extract.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/extract.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/preprocessors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/__pycache__/preprocessors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/babelplugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/babelplugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/beaker_cache.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/beaker_cache.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/extract.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/extract.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/linguaplugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ext/linguaplugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/pygen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/pygen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/ast.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/ast.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/filters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/filters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/pyparser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/pyparser.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/_ast_util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/_ast_util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/codegen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/codegen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/codegen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/codegen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/lexer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/lexer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/parsetree.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/parsetree.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/template.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/template.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/cache.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/cache.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/lexer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/lexer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/runtime.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/runtime.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/runtime.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/runtime.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/_ast_util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/_ast_util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/ast.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/ast.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/pygen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/pygen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/cache.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/cache.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/lookup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/lookup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/pyparser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/pyparser.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/pygen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/pygen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/cmd.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/cmd.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/cmd.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/cmd.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/filters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/filters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/lookup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/lookup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/parsetree.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/parsetree.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/template.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/__pycache__/template.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/lookup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/lookup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/pyparser.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/pyparser.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/runtime.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/runtime.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/cache.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/cache.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/codegen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/codegen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/cmd.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/cmd.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/filters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/filters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/parsetree.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/parsetree.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/template.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/template.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/_ast_util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/_ast_util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ast.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/mako/ast.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/table_wide.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/table_wide.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/table_zero.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/table_zero.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/table_zero.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/table_zero.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/wcwidth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/wcwidth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/table_wide.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/table_wide.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/unicode_versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/unicode_versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/wcwidth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/wcwidth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/unicode_versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/__pycache__/unicode_versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/table_wide.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/table_wide.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/table_zero.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/table_zero.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/unicode_versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/unicode_versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/wcwidth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth/wcwidth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/zipp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2-1.4.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/models.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/models.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_zone_export.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_zone_export.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_zone_import.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_zone_import.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_zone_transfer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_zone_transfer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_blacklist.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_blacklist.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_blacklist.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_blacklist.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_tlds.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_tlds.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_tsigkeys.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_tsigkeys.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_export.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_export.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_blacklist.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_blacklist.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_transfer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_transfer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_recordsets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_recordsets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_export.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_export.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_transfer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_transfer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_recordsets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_recordsets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_tlds.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_tlds.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_tsigkeys.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_tsigkeys.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_import.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_import.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_import.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__pycache__/test_zone_import.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_recordsets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_recordsets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_tlds.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_tlds.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_tsigkeys.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_tsigkeys.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/test_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/datagen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/datagen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/datagen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/datagen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/models.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/models.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/models.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/models.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/datagen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/functionaltests/datagen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__/plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__/plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/osc/plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/tlds.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/tlds.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/blacklists.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/blacklists.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/tsigkeys.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/tsigkeys.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/zones.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/zones.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/blacklists.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/blacklists.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/blacklists.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/blacklists.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/recordsets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/recordsets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/recordsets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/recordsets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/tlds.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/tlds.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/zones.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/zones.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/service_statuses.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/service_statuses.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/tlds.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/tlds.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/zones.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/zones.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/tsigkeys.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/tsigkeys.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/reverse.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/reverse.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/reverse.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/reverse.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/service_statuses.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/service_statuses.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/tsigkeys.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__pycache__/tsigkeys.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/blacklists.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/blacklists.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/recordsets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/recordsets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/reverse.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/reverse.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/tlds.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/tlds.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/service_statuses.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/cli/service_statuses.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/pools.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/pools.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/recordsets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/recordsets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/zones.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/zones.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/nameservers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/nameservers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/service_statuses.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/service_statuses.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/tsigkeys.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/tsigkeys.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/blacklists.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/blacklists.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/reverse.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/reverse.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/zones.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/zones.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/blacklists.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/blacklists.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/pools.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/pools.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/tsigkeys.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/tsigkeys.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/nameservers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/nameservers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/reverse.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/reverse.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/tlds.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/tlds.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/tsigkeys.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/tsigkeys.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/zones.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/zones.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/service_statuses.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/service_statuses.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/pools.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/pools.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/nameservers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/nameservers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/recordsets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/recordsets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/recordsets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/recordsets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/service_statuses.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/service_statuses.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/tlds.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/__pycache__/tlds.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/reverse.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/reverse.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/designateclient/v2/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/lock.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/lock.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/nameregistry.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/nameregistry.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/readwrite_lock.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/readwrite_lock.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/readwrite_lock.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/readwrite_lock.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/langhelpers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/langhelpers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/langhelpers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/langhelpers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/nameregistry.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/nameregistry.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/__pycache__/compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/langhelpers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/langhelpers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/nameregistry.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/nameregistry.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/readwrite_lock.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/util/readwrite_lock.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/core.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/core.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/core.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/core.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/lock.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/lock.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/lock.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/lock.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/api.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/api.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__/mako_cache.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__/mako_cache.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__/mako_cache.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/__pycache__/mako_cache.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/mako_cache.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/plugins/mako_cache.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/region.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/region.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/region.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/region.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/region.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/region.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/api.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/api.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/exception.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/exception.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/api.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/api.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/exception.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/__pycache__/exception.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/memcached.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/memcached.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/memory.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/memory.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/null.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/null.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/redis.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/redis.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/file.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/file.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/memcached.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/memcached.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/null.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/null.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/redis.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/redis.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/redis.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/redis.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/file.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/file.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/memcached.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/memcached.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/memory.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/memory.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/memory.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/memory.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/null.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/__pycache__/null.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/file.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/backends/file.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/exception.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/exception.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/cache/util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/core.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile/core.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist-2.3.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/clientmanager.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/clientmanager.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/validators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/validators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/clientmanager.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/clientmanager.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/constants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/constants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/serializer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/serializer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/validators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/validators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/clientmanager.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/clientmanager.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/constants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/constants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/validators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/validators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/serializer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/serializer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/constants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/constants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/serializer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/serializer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/common/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/v2_0/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/bandwidth_limit_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/bandwidth_limit_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/dscp_marking_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/dscp_marking_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/minimum_bandwidth_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/minimum_bandwidth_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/dscp_marking_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/dscp_marking_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/dscp_marking_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/dscp_marking_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/minimum_bandwidth_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/minimum_bandwidth_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/bandwidth_limit_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/bandwidth_limit_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/bandwidth_limit_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/bandwidth_limit_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/minimum_bandwidth_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/minimum_bandwidth_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/qos/__pycache__/rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/tag.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/tag.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/agent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/agent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/network_ip_availability.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/network_ip_availability.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/quota.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/quota.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/subnet.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/subnet.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/floatingip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/floatingip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/router.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/router.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/servicetype.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/servicetype.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/subnetpool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/subnetpool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/dns.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/dns.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/servicetype.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/servicetype.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/auto_allocated_topology.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/auto_allocated_topology.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/metering.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/metering.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/purge.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/purge.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/rbac.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/rbac.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/rbac.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/rbac.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/router.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/router.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/auto_allocated_topology.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/auto_allocated_topology.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/quota.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/quota.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/subnet.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/subnet.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/agent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/agent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/network_ip_availability.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/network_ip_availability.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/tag.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/tag.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/address_scope.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/address_scope.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/agentscheduler.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/agentscheduler.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/metering.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/metering.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/securitygroup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/securitygroup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/address_scope.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/address_scope.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/dns.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/dns.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/floatingip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/floatingip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/purge.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/purge.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/securitygroup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/securitygroup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/subnetpool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/subnetpool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/agentscheduler.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/agentscheduler.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/tag.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__pycache__/tag.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/agent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/agent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/network_ip_availability.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/network_ip_availability.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/agentscheduler.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/agentscheduler.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/subnet.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/subnet.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/purge.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/purge.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/subnetpool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/subnetpool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/speaker.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/speaker.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/dragentscheduler.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/dragentscheduler.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/dragentscheduler.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/dragentscheduler.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/peer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/peer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/peer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/peer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/speaker.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/speaker.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/speaker.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/speaker.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/dragentscheduler.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/dragentscheduler.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/peer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/bgp/peer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/quota.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/quota.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/servicetype.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/servicetype.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/floatingip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/floatingip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/auto_allocated_topology.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/auto_allocated_topology.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/dns.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/dns.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/router.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/router.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/ipsecpolicy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/ipsecpolicy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/vpnservice.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/vpnservice.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/vpnservice.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/vpnservice.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/endpoint_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/endpoint_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ikepolicy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ikepolicy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ikepolicy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ikepolicy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ipsec_site_connection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ipsec_site_connection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/endpoint_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/endpoint_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ipsec_site_connection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ipsec_site_connection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/vpnservice.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/vpnservice.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ipsecpolicy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ipsecpolicy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ipsecpolicy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/ipsecpolicy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/endpoint_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/endpoint_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/ikepolicy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/ikepolicy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/ipsec_site_connection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/vpn/ipsec_site_connection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/flavor_profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/flavor_profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/flavor_profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/__pycache__/flavor_profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/flavor_profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/flavor/flavor_profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/firewallpolicy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/firewallpolicy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/firewallrule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/firewallrule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewallrule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewallrule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewallrule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewallrule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewall.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewall.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewall.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewall.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewallpolicy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewallpolicy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewallpolicy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/__pycache__/firewallpolicy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/firewall.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/fw/firewall.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/healthmonitor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/healthmonitor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/healthmonitor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/healthmonitor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/member.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/member.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/member.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/member.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/vip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/vip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/vip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__pycache__/vip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/healthmonitor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/healthmonitor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/member.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/member.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/l7policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/l7policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/l7rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/l7rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/loadbalancer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/loadbalancer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/healthmonitor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/healthmonitor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/member.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/member.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/listener.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/listener.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/loadbalancer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/loadbalancer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/healthmonitor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/healthmonitor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/l7policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/l7policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/listener.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/listener.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/l7rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/l7rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/l7rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/l7rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/loadbalancer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/loadbalancer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/l7policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/l7policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/member.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__pycache__/member.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/healthmonitor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/healthmonitor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/member.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/member.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/listener.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/v2/listener.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/vip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/vip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/lb/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/metering.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/metering.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/address_scope.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/address_scope.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__/_fox_sockets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__/_fox_sockets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__/_fox_sockets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/__pycache__/_fox_sockets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/_fox_sockets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/contrib/_fox_sockets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/rbac.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/rbac.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/securitygroup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/neutron/v2_0/securitygroup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/constants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/constants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/network_association.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/network_association.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/port_association.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/port_association.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/resource_association.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/resource_association.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/router_association.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/router_association.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/bgpvpn.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/bgpvpn.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/router_association.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/router_association.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/bgpvpn.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/bgpvpn.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/constants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/constants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/network_association.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/network_association.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/port_association.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/port_association.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/router_association.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/router_association.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/constants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/constants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/port_association.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/port_association.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/resource_association.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/resource_association.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/network_association.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/network_association.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/resource_association.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/__pycache__/resource_association.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/bgpvpn.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/networking_bgpvpn/bgpvpn.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_port_pair_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_port_pair_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_service_graph.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_service_graph.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_flow_classifier.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_flow_classifier.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_flow_classifier.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_flow_classifier.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_pair.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_pair.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_pair.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_pair.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_pair_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_pair_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_pair_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_pair_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_chain.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_chain.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_chain.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_port_chain.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_service_graph.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_service_graph.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_service_graph.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/__pycache__/sfc_service_graph.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_flow_classifier.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_flow_classifier.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_port_chain.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_port_chain.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_port_pair.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/sfc/sfc_port_pair.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__/subnet_onboard.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__/subnet_onboard.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__/subnet_onboard.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__/subnet_onboard.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/subnet_onboard.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/subnet_onboard/subnet_onboard.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__/network_trunk.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__/network_trunk.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__/network_trunk.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/__pycache__/network_trunk.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/network_trunk.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/trunk/network_trunk.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/bgp_peer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/bgp_peer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/bgp_speaker.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/bgp_speaker.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/constants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/constants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_dragent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_dragent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_dragent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_dragent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_peer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_peer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_speaker.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_speaker.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_speaker.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_speaker.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_peer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/bgp_peer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/constants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/constants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/constants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/constants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/bgp_dragent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/dynamic_routing/bgp_dragent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/constants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/constants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallpolicy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallpolicy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/constants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/constants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallgroup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallgroup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallpolicy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallpolicy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallrule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallrule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallrule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallrule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallgroup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/__pycache__/firewallgroup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/constants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/constants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/firewallgroup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/firewallgroup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/firewallpolicy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/firewallpolicy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/firewallrule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/fwaas/firewallrule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__/network_log.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__/network_log.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__/network_log.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__/network_log.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/network_log.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/network_log.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/logging/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/lbaas/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ipsecpolicy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ipsecpolicy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/endpoint_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/endpoint_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/endpoint_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/endpoint_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/vpnservice.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/vpnservice.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ikepolicy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ikepolicy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ipsec_site_connection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ipsec_site_connection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ikepolicy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ikepolicy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ipsec_site_connection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ipsec_site_connection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ipsecpolicy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/ipsecpolicy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/vpnservice.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__pycache__/vpnservice.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/endpoint_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/endpoint_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/ikepolicy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/ikepolicy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/ipsec_site_connection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/ipsec_site_connection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/ipsecpolicy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/ipsecpolicy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/vpnservice.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/vpnservice.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/v2/vpnaas/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/osc/plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/neutronclient/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/prettytable-0.7.2-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/serde.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/serde.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/conftest.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/conftest.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_integration.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_integration.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_rendezvous.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_rendezvous.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_serde.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_serde.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_benchmark.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_benchmark.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_integration.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_integration.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_rendezvous.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_rendezvous.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/conftest.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/conftest.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client_hash.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client_hash.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client_hash.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client_hash.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client_retry.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client_retry.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_serde.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_serde.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/conftest.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/conftest.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_benchmark.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_benchmark.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client_retry.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_client_retry.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_integration.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_integration.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_rendezvous.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_rendezvous.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_serde.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_serde.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/__pycache__/test_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_benchmark.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_benchmark.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_client_hash.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_client_hash.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_client_retry.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/test/test_client_retry.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/serde.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/serde.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/serde.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/serde.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/fallback.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/fallback.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/fallback.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/__pycache__/fallback.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/retrying.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/retrying.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/hash.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/hash.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/murmur3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/murmur3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/retrying.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/retrying.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/rendezvous.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/rendezvous.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/retrying.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/retrying.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/hash.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/hash.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/murmur3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/murmur3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/rendezvous.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/__pycache__/rendezvous.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/hash.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/hash.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/murmur3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/murmur3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/rendezvous.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/client/rendezvous.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/fallback.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/fallback.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pymemcache/pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_next_gen.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_next_gen.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/converters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/converters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/converters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/converters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_make.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_make.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_next_gen.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_next_gen.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_version_info.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_version_info.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_version_info.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_version_info.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/converters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/converters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/setters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/setters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/setters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/setters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/filters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/filters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/validators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/validators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_funcs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_funcs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_make.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_make.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_next_gen.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_next_gen.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/filters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/filters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/validators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/validators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_funcs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/_funcs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_version_info.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_version_info.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/converters.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/converters.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/filters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/filters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__init__.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/__init__.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_make.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_make.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_version_info.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_version_info.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/setters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/setters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/validators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/validators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_funcs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/_funcs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/exceptions.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/exceptions.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/filters.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/filters.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/py.typed\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/py.typed: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/setters.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/setters.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/validators.pyi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attr/validators.pyi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__/noauth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__/noauth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__/noauth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__pycache__/noauth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/noauth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/noauth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/shell_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/shell_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/availability_zones.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/availability_zones.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/pools.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/pools.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_backups_restore.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_backups_restore.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_encryption_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_encryption_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__/list_extensions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__/list_extensions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__/list_extensions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/__pycache__/list_extensions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/list_extensions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/contrib/list_extensions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volumes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volumes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/consistencygroups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/consistencygroups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/capabilities.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/capabilities.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/pools.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/pools.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_transfers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_transfers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/cgsnapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/cgsnapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/qos_specs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/qos_specs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_encryption_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_encryption_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/availability_zones.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/availability_zones.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/availability_zones.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/availability_zones.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/quota_classes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/quota_classes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_encryption_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_encryption_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_transfers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_transfers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volumes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volumes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/capabilities.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/capabilities.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_backups_restore.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_backups_restore.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/consistencygroups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/consistencygroups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_backups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_backups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_backups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_backups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/cgsnapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/cgsnapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/pools.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/pools.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/qos_specs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/qos_specs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/quota_classes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/quota_classes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_backups_restore.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_backups_restore.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volume_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volumes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/__pycache__/volumes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/cgsnapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/cgsnapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/consistencygroups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/consistencygroups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/qos_specs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/qos_specs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/capabilities.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/capabilities.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/quota_classes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/quota_classes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_backups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_backups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_transfers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_transfers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v2/volume_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/shell_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/shell_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/api_versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/api_versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/shell_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/shell_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/api_versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/api_versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/api_versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/api_versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/qos_specs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/qos_specs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/group_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/group_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_backups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_backups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/attachments.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/attachments.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/availability_zones.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/availability_zones.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/workers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/workers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/clusters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/clusters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/default_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/default_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/group_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/group_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/pools.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/pools.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_encryption_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_encryption_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_transfers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_transfers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/capabilities.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/capabilities.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/clusters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/clusters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/resource_filters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/resource_filters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_transfers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_transfers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/messages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/messages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/qos_specs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/qos_specs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_backups_restore.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_backups_restore.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/capabilities.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/capabilities.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/default_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/default_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/qos_specs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/qos_specs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/quota_classes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/quota_classes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/resource_filters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/resource_filters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_backups_restore.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_backups_restore.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/consistencygroups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/consistencygroups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/pools.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/pools.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volumes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volumes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/workers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/workers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/group_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/group_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volumes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volumes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/workers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/workers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/availability_zones.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/availability_zones.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/cgsnapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/cgsnapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/groups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/groups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/groups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/groups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_backups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_backups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_backups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_backups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/attachments.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/attachments.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_encryption_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_encryption_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/consistencygroups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/consistencygroups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/default_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/default_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/group_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/group_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/attachments.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/attachments.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/group_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/group_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/pools.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/pools.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/cgsnapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/cgsnapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/clusters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/clusters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/messages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/messages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/quota_classes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/quota_classes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/availability_zones.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/availability_zones.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/group_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/group_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_encryption_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__pycache__/volume_encryption_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/capabilities.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/capabilities.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__/list_extensions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__/list_extensions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__/list_extensions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/__pycache__/list_extensions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/list_extensions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/contrib/list_extensions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_backups_restore.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_backups_restore.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/groups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/groups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/resource_filters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/resource_filters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_transfers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volume_transfers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volumes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/volumes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/cgsnapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/cgsnapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/consistencygroups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/consistencygroups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/messages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/messages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/quota_classes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/v3/quota_classes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cinderclient/apiclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cliff-3.7.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr-5.5.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.serialization-4.1.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__about__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__about__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/_structures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/_structures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/markers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/markers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/requirements.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/requirements.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/specifiers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/specifiers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/tags.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/tags.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/specifiers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/specifiers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/tags.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/tags.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/__about__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/__about__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_structures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_structures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/markers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/markers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/markers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/markers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/specifiers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/specifiers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/tags.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/tags.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_typing.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_typing.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_structures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_structures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_typing.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/_typing.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/__about__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/__about__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/requirements.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/requirements.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/requirements.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/requirements.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/_compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/_compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/_typing.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/_typing.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/py.typed\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/packaging/py.typed: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/handlers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/handlers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/versionutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/versionutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/setlevel.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/setlevel.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/setlevel.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/setlevel.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/logging_error.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/logging_error.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/logging_error.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__pycache__/logging_error.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/logging_error.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/logging_error.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/setlevel.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/setlevel.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/fixture/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/rate_limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/rate_limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/watchers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/watchers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/convert_json.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/convert_json.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__/convert_json.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__/convert_json.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__/convert_json.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/cmds/__pycache__/convert_json.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/helpers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/helpers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/log.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/log.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/handlers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/handlers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/helpers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/helpers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/rate_limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/rate_limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/rate_limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/rate_limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/_options.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/_options.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/formatters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/formatters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/formatters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/formatters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/helpers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/helpers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/handlers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/handlers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/log.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/log.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/versionutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/versionutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/watchers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/watchers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/_options.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/_options.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/watchers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/watchers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/log.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/log.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/versionutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/__pycache__/versionutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/_options.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/_options.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/formatters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_log/formatters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_octaviaclient-2.3.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz-2021.1-py3.9.egg-info/zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Mako-1.1.4.dev0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/_futures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/_futures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_periodics.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_periodics.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_periodics.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_periodics.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_waiters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_waiters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_executors.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_executors.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_executors.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_executors.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_waiters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/__pycache__/test_waiters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/test_executors.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/test_executors.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/test_periodics.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/test_periodics.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/test_waiters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/tests/test_waiters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/waiters.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/waiters.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/rejection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/rejection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/periodics.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/periodics.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_thread.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_thread.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/rejection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/rejection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_green.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_green.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/periodics.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/periodics.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_thread.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_thread.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/waiters.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/waiters.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/waiters.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/waiters.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_futures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_futures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_green.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_green.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/rejection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/rejection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_futures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/__pycache__/_futures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/_green.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/_green.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/_thread.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/_thread.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/periodics.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/futurist/periodics.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/benchmark.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/benchmark.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/benchmark.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/benchmark.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/gendoc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/gendoc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/osc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/osc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/gendoc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/gendoc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/osc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/osc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/benchmark.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/benchmark.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/gendoc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/gendoc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/osc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/osc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/aggregates_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/aggregates_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/archive_policy_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/archive_policy_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/archive_policy_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/archive_policy_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/resource_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/resource_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/status.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/status.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/build.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/build.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/metric_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/metric_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/resource.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/resource.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/resource_type_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/resource_type_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/status_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/status_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/archive_policy_rule_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/archive_policy_rule_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/build_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/build_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/capabilities.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/capabilities.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/capabilities_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/capabilities_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/status_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/status_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/metric_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/metric_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/status.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/status.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/aggregates.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/aggregates.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/build.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/build.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/build_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/build_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/aggregates_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/aggregates_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/build.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/build.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/capabilities_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/capabilities_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/status.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/status.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_type_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_type_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/aggregates_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/aggregates_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_rule_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_rule_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/capabilities.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/capabilities.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/capabilities.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/capabilities.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/metric.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/metric.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/aggregates.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/aggregates.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_type_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_type_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_rule_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_rule_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/build_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/build_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/metric.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/metric.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/metric_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/metric_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/archive_policy_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/resource_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/status_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/__pycache__/status_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/aggregates.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/aggregates.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/archive_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/archive_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/capabilities_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/capabilities_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/metric.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/metric.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/resource_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/gnocchiclient/v1/resource_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip-1.8.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_designateclient-4.2.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/reference.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/reference.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/tzfile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/tzfile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/tzinfo.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/tzinfo.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/reference.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/reference.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/reference.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/reference.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/tzinfo.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/tzinfo.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/lazy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/lazy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/lazy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/lazy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/tzfile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/tzfile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/tzfile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/tzfile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/tzinfo.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/__pycache__/tzinfo.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/lazy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pytz/lazy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/session.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/session.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/adapter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/adapter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/session.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/session.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/discover.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/discover.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/service_catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/service_catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/baseclient.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/baseclient.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/discover.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/discover.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/baseclient.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/baseclient.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/session.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/session.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/_discover.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/_discover.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/adapter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/adapter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/_discover.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/_discover.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/httpclient.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/httpclient.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/httpclient.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/httpclient.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/service_catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/service_catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/_discover.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/_discover.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/adapter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/adapter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/discover.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/discover.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/httpclient.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/httpclient.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/baseclient.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/baseclient.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/certificates.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/certificates.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/endpoints.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/endpoints.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/extensions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/extensions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/tenants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/tenants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/users.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/users.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/roles.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/roles.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/roles.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/roles.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/ec2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/ec2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/extensions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/extensions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/tenants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/tenants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/tokens.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/tokens.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/users.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/users.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/certificates.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/certificates.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/ec2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/ec2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/endpoints.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/endpoints.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/extensions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/extensions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/tokens.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/tokens.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/users.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/users.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/endpoints.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/endpoints.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/tenants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/tenants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/certificates.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/__pycache__/certificates.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/ec2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/ec2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/roles.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/roles.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/tokens.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v2_0/tokens.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/token_endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/token_endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/token_endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/token_endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/conf.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/conf.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/conf.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__pycache__/conf.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/conf.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/conf.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/password.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/password.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/password.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/password.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/password.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/password.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/generic/__pycache__/token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/password.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/password.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/federated.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/federated.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/password.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/password.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/federated.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/federated.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/__pycache__/token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/federated.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/federated.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/password.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/password.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/identity/v3/token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/token_endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/token_endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/auth/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__/cms.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__/cms.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__/cms.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/__pycache__/cms.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/cms.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/common/cms.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/generic/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/oidc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/oidc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/oidc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/oidc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/saml2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/saml2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/saml2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__pycache__/saml2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/oidc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/oidc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/saml2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/saml2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/auth/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/contrib/ec2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/discovery.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/discovery.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/exception.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/exception.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/v3.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/v3.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/discovery.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/discovery.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/discovery.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/discovery.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/v3.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/v3.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/v3.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/v3.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/exception.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/exception.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/exception.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/fixture/__pycache__/exception.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/service_catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/service_catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/domains.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/domains.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/groups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/groups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/users.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/users.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/tokens.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/tokens.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/access_rules.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/access_rules.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/credentials.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/credentials.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/groups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/groups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/policies.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/policies.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/role_assignments.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/role_assignments.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/roles.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/roles.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/ec2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/ec2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/endpoint_groups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/endpoint_groups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/regions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/regions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/tokens.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/tokens.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/application_credentials.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/application_credentials.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/domain_configs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/domain_configs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/ec2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/ec2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/endpoints.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/endpoints.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/projects.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/projects.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/registered_limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/registered_limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/endpoint_groups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/endpoint_groups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/policies.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/policies.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/regions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/regions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/credentials.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/credentials.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/endpoints.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/endpoints.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/role_assignments.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/role_assignments.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/users.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/users.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/roles.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/roles.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/registered_limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/registered_limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/access_rules.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/access_rules.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/application_credentials.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/application_credentials.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/domain_configs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/domain_configs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/domains.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/domains.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/projects.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__pycache__/projects.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/endpoint_groups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/endpoint_groups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/role_assignments.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/role_assignments.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/roles.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/roles.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/policies.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/policies.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/groups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/groups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/projects.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/projects.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/regions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/regions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/trusts.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/trusts.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/endpoint_filter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/endpoint_filter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/endpoint_filter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/endpoint_filter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/endpoint_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/endpoint_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/simple_cert.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/simple_cert.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/endpoint_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/endpoint_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/simple_cert.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/simple_cert.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/trusts.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/trusts.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/trusts.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/__pycache__/trusts.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/endpoint_filter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/endpoint_filter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/endpoint_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/endpoint_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/domains.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/domains.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/mappings.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/mappings.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/saml.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/saml.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/projects.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/projects.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/protocols.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/protocols.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/protocols.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/protocols.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/service_providers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/service_providers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/core.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/core.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/domains.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/domains.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/domains.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/domains.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/identity_providers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/identity_providers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/identity_providers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/identity_providers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/saml.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/saml.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/saml.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/saml.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/mappings.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/mappings.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/projects.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/projects.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/service_providers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/service_providers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/core.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/core.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/mappings.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/__pycache__/mappings.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/protocols.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/protocols.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/service_providers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/service_providers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/core.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/core.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/identity_providers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/identity_providers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/projects.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/federation/projects.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/consumers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/consumers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/core.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/core.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/request_tokens.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/request_tokens.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/access_tokens.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/access_tokens.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/consumers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/consumers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/core.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/core.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/request_tokens.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/request_tokens.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/request_tokens.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/request_tokens.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/access_tokens.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/access_tokens.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/core.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/core.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/consumers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/__pycache__/consumers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/access_tokens.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/oauth1/access_tokens.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/simple_cert.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/contrib/simple_cert.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/domain_configs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/domain_configs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/domains.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/domains.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/registered_limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/registered_limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/users.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/users.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/tokens.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/tokens.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/access_rules.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/access_rules.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/application_credentials.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/application_credentials.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/credentials.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/credentials.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/ec2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/ec2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/endpoints.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keystoneclient/v3/endpoints.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/monotonic-1.5-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/constants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/constants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/constants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/constants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/constants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/constants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__/octavia.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__/octavia.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__/octavia.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__pycache__/octavia.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/octavia.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/octavia.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/api/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__/checks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__/checks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__/checks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__pycache__/checks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/checks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/checks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/hacking/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/l7rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/l7rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/load_balancer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/load_balancer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/member.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/member.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/quota.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/quota.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/health_monitor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/health_monitor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/l7policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/l7policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/health_monitor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/health_monitor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/l7rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/l7rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/listener.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/listener.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/validate.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/validate.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/amphora.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/amphora.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/constants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/constants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/l7policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/l7policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/load_balancer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/load_balancer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/availabilityzoneprofile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/availabilityzoneprofile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/listener.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/listener.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/availabilityzone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/availabilityzone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/l7policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/l7policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/quota.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/quota.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/amphora.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/amphora.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/availabilityzone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/availabilityzone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/availabilityzoneprofile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/availabilityzoneprofile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/flavorprofile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/flavorprofile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/load_balancer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/load_balancer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/member.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/member.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/validate.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/validate.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/constants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/constants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/l7rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/l7rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/flavorprofile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/flavorprofile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/health_monitor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/health_monitor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/member.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/member.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/quota.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/__pycache__/quota.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/listener.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/listener.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/validate.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/validate.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/availabilityzoneprofile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/availabilityzoneprofile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/flavorprofile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/flavorprofile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/constants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/constants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/amphora.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/amphora.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/availabilityzone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/v2/availabilityzone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__/plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/__pycache__/plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/osc/plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/octaviaclient/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__/python3_compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__/python3_compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__/python3_compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/__pycache__/python3_compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/python3_compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/munch/python3_compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_keystoneclient-4.3.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_neutronclient-7.3.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/requestsexceptions-1.4.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/uri.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/uri.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/normalizers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/normalizers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/validators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/validators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/abnf_regexp.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/abnf_regexp.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/api.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/api.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/builder.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/builder.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/_mixin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/_mixin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/_mixin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/_mixin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/api.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/api.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/uri.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/uri.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/validators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/validators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/builder.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/builder.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/misc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/misc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/normalizers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/normalizers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/parseresult.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/parseresult.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/iri.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/iri.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/iri.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/iri.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/misc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/misc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/abnf_regexp.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/abnf_regexp.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/parseresult.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/parseresult.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/uri.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__pycache__/uri.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/misc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/misc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/api.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/api.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/parseresult.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/parseresult.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/validators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/validators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/_mixin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/_mixin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/abnf_regexp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/abnf_regexp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/normalizers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/normalizers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/iri.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/iri.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/builder.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/builder.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/colorama-0.4.4-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/debtcollector-2.2.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/keyring-21.8.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/fixture.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/fixture.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/strutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/strutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/excutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/excutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/fnmatch.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/fnmatch.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/imageutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/imageutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/secretutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/secretutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/timeutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/timeutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/dictutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/dictutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/eventletutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/eventletutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/fileutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/fileutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/importutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/importutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/netutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/netutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/reflection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/reflection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/specs_matcher.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/specs_matcher.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/uuidutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/uuidutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/dictutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/dictutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fileutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fileutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/netutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/netutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/specs_matcher.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/specs_matcher.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/units.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/units.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/excutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/excutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/excutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/excutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/importutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/importutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/reflection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/reflection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/reflection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/reflection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/secretutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/secretutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/encodeutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/encodeutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/specs_matcher.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/specs_matcher.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/timeutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/timeutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/uuidutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/uuidutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/eventletutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/eventletutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fileutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fileutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/imageutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/imageutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/strutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/strutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/versionutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/versionutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/versionutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/versionutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fnmatch.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fnmatch.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/imageutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/imageutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/netutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/netutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/secretutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/secretutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/units.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/units.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/encodeutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/encodeutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/eventletutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/eventletutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/dictutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/dictutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fixture.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fixture.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fixture.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fixture.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fnmatch.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/fnmatch.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/importutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/importutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/strutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/strutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/timeutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/timeutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/uuidutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/__pycache__/uuidutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/versionutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/versionutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/encodeutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/encodeutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/units.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_utils/units.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/SecretStorage-3.3.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/dogpile.cache-1.1.5-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpointer-2.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/os_client_config-2.1.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/clientmanager.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/clientmanager.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/clientmanager.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/clientmanager.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/logs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/logs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/clientmanager.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/clientmanager.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/logs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/logs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/api.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/api.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/api.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/api.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/auth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/auth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/auth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/auth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/api.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/api.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/auth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/auth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/api/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/command.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/command.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/commandmanager.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/commandmanager.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/timing.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/timing.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/command.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/command.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/command.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/command.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/commandmanager.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/commandmanager.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/commandmanager.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/commandmanager.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/timing.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/timing.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/timing.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/command/__pycache__/timing.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/logs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/logs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/columns.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/columns.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/tags.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/tags.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/columns.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/columns.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/columns.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/columns.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/tags.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/tags.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/tags.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/tags.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/utils/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/identity.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/identity.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/parseractions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/parseractions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/client_config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/client_config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/format_columns.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/format_columns.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/identity.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/identity.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/parseractions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/parseractions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/client_config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/client_config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/format_columns.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/__pycache__/format_columns.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/client_config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/client_config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/format_columns.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/format_columns.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/identity.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/identity.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/parseractions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/osc_lib/cli/parseractions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonpatch.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/wcwidth-0.2.5-py3.9.egg-info/zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/attrs-20.3.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/iso8601-0.1.12-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.utils-4.8.2-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/exc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/exc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/images.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/images.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/image_members.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/image_members.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/image_members.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/image_members.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/images.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/images.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/apiclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/image_members.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/image_members.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/images.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v1/images.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/progressbar.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/progressbar.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/progressbar.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/progressbar.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/https.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/https.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/http.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/http.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/https.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/https.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/http.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/__pycache__/http.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/http.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/http.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/https.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/https.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/progressbar.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/common/progressbar.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/image_members.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/image_members.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/metadefs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/metadefs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_members.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_members.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_tags.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_tags.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/metadefs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/metadefs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/resource_type_schema.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/resource_type_schema.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/tasks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/tasks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/images.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/images.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/namespace_schema.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/namespace_schema.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/schemas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/schemas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_members.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_members.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_schema.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_schema.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_schema.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_schema.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_tags.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/image_tags.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/images.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/images.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/metadefs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/metadefs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/namespace_schema.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/namespace_schema.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/resource_type_schema.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/resource_type_schema.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/schemas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/schemas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/tasks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__pycache__/tasks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/tasks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/tasks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/resource_type_schema.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/resource_type_schema.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/image_schema.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/image_schema.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/image_tags.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/image_tags.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/images.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/images.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/namespace_schema.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/namespace_schema.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/schemas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/v2/schemas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/exc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/exc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/exc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/__pycache__/exc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/glanceclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__main__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__main__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/_suite.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/_suite.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_format.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_format.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_jsonschema_test_suite.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_jsonschema_test_suite.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_format.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_format.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_validators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_validators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_validators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_validators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/_helpers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/_helpers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/_suite.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/_suite.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_format.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_format.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_jsonschema_test_suite.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_jsonschema_test_suite.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/_helpers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/_helpers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/_suite.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/_suite.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_jsonschema_test_suite.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/__pycache__/test_jsonschema_test_suite.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/_helpers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/_helpers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_validators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/tests/test_validators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/cli.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/cli.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas/draft3.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas/draft3.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas/draft4.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas/draft4.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas/draft6.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas/draft6.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas/draft7.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/schemas/draft7.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_format.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_format.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_legacy_validators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_legacy_validators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_reflect.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_reflect.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_validators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/_validators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/validators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/validators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_reflect.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_reflect.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_validators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_validators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/validators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/validators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_reflect.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_reflect.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_legacy_validators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_legacy_validators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_format.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_format.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_validators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_validators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/cli.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/cli.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/validators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/validators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_format.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_format.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/__main__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/__main__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/__main__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/__main__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_legacy_validators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_legacy_validators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/cli.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/cli.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/issue232.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/issue232.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/issue232.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/issue232.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/json_schema_test_suite.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/json_schema_test_suite.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/json_schema_test_suite.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__pycache__/json_schema_test_suite.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/issue232.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/issue232.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/json_schema_test_suite.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/json_schema_test_suite.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/benchmarks/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jsonschema/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo.log-4.4.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/rfc3986-1.4.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/Babel-2.9.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/METADATA\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/METADATA: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/RECORD\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/RECORD: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/REQUESTED\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/REQUESTED: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/WHEEL\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/WHEEL: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/INSTALLER\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/INSTALLER: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs-1.4.4.dist-info/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/appdirs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__main__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__main__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__/__main__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__/__main__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__/__main__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyperclip/__pycache__/__main__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/not-zip-safe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/not-zip-safe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/pbr.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/pbr.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/python_cinderclient-7.4.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/_compat.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/_compat.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/fixtures.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/fixtures.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_api.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_api.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_api.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_api.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_integration.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_integration.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_integration.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_integration.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_main.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_main.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_zip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_zip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/fixtures.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/fixtures.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_main.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_main.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_zip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/__pycache__/test_zip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/example-21.12-py3-none-any.whl\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/example-21.12-py3-none-any.whl: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/example-21.12-py3.6.egg\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/data/example-21.12-py3.6.egg: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/fixtures.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/fixtures.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/test_api.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/test_api.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/test_integration.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/test_integration.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/test_main.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/test_main.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/test_zip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/tests/test_zip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__/_compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__/_compat.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__/_compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/importlib_metadata/__pycache__/_compat.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/progressbar.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/progressbar.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/project_purge.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/project_purge.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/module.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/module.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/progressbar.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/progressbar.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/project_cleanup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/project_cleanup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/clientmanager.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/clientmanager.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/configuration.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/configuration.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/progressbar.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/progressbar.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/quota.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/quota.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/sdk_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/sdk_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/clientmanager.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/clientmanager.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/configuration.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/configuration.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/project_cleanup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/project_cleanup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/quota.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/quota.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/module.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/module.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/project_purge.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/project_purge.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/project_purge.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/project_purge.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/sdk_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/sdk_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__pycache__/versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/clientmanager.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/clientmanager.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/configuration.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/configuration.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/module.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/module.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/project_cleanup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/project_cleanup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/quota.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/quota.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/sdk_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/common/sdk_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/object.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/object.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/account.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/account.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/account.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/account.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/container.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/container.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/container.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/container.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/object.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/__pycache__/object.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/account.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/account.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/container.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/container.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/object.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/v1/object.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/object/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_keypair.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_keypair.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_aggregate.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_aggregate.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_server_event.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_server_event.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_server_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_server_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server_event.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server_event.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_aggregate.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_aggregate.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_keypair.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_keypair.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_aggregate.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_aggregate.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server_event.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server_event.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_keypair.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_keypair.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/__pycache__/test_server.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_server.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/compute/v2/test_server.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_ec2_credentials.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_ec2_credentials.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_project.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_project.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/test_token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_ec2_credentials.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_ec2_credentials.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_project.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_project.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_ec2_credentials.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_ec2_credentials.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_project.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_project.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v2/__pycache__/test_token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_region.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_region.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_application_credential.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_application_credential.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_service_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_service_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_idp.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_idp.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_project.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_project.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_project.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_project.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_application_credential.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_application_credential.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_domain.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_domain.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_region.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_region.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_registered_limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_registered_limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_service_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_service_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_region.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_region.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_registered_limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_registered_limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_domain.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_domain.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_idp.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_idp.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/__pycache__/test_token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_registered_limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_registered_limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_idp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_idp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_project.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_project.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_service_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_service_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_application_credential.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_application_credential.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_domain.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/identity/v3/test_domain.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_qos.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_qos.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_volume_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_volume_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_volume_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_volume_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_qos.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_qos.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_transfer_request.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_transfer_request.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_volume_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_volume_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_qos.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_qos.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_transfer_request.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/__pycache__/test_transfer_request.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_transfer_request.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v1/test_transfer_request.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_volume_snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_volume_snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_volume_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_volume_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_qos.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_qos.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_transfer_request.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_transfer_request.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_transfer_request.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_transfer_request.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_qos.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_qos.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/test_volume_snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_qos.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_qos.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_transfer_request.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_transfer_request.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_volume_backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v2/test_volume_backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_volume_snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_volume_snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_volume_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_volume_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_qos.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_qos.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_transfer_request.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_transfer_request.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume_snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume_snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume_snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_volume_snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_qos.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_qos.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_transfer_request.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/test_transfer_request.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_qos.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_qos.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_transfer_request.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/volume/v3/test_transfer_request.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/test_container.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/test_container.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/test_container.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/test_container.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/test_object.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/test_object.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/test_object.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/test_object.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/test_container.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/test_container.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/test_object.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/test_object.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/object/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_configuration.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_configuration.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_quota.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_quota.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_args.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_args.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_help.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_help.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_module.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/test_module.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_args.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_args.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_help.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_help.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_configuration.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_configuration.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_configuration.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_configuration.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_module.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_module.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_args.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_args.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_help.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_help.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_module.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_module.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_quota.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_quota.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_quota.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_quota.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/common/__pycache__/test_versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__/test_image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__/test_image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__/test_image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/__pycache__/test_image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/test_image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v1/test_image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__/test_image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__/test_image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__/test_image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/__pycache__/test_image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/test_image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/image/v2/test_image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_qos_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_qos_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_qos_rule_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_qos_rule_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_address_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_address_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_address_scope.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_address_scope.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_ip_availability.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_ip_availability.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_flavor_profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_flavor_profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_meter_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_meter_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_service_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_service_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_security_group_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_security_group_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_floating_ip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_floating_ip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_agent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_agent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_qos_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_qos_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_segment_range.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_segment_range.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_subnet.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_subnet.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_service_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_service_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_service_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_service_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_security_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_security_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_subnet.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_subnet.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_address_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_address_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_meter_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_meter_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_segment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_segment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_segment_range.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_segment_range.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_segment_range.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_segment_range.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_router.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_router.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_address_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_address_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_subnet_pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_subnet_pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_flavor_profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_flavor_profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_meter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_meter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_rbac.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_rbac.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_router.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_router.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_address_scope.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_address_scope.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_rbac.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_rbac.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_security_group_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_security_group_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_address_scope.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_address_scope.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_meter_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_meter_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_flavor_profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_flavor_profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_rule_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_rule_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_rule_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_rule_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_security_group_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_security_group_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_floating_ip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_floating_ip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_ip_availability.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_ip_availability.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_subnet_pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_subnet_pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_ip_availability.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_ip_availability.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_qos_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_agent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_agent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_meter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_meter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_floating_ip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_floating_ip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_agent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_agent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_subnet.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_subnet.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_segment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_network_segment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_security_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/__pycache__/test_security_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_meter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_meter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_rbac.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_rbac.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_segment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_network_segment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_router.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_router.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_security_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_security_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_subnet_pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/functional/network/v2/test_subnet_pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/test_shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/test_shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/test_shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/test_shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_console.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_console.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_hypervisor_stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_hypervisor_stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_keypair.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_keypair.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_event.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_event.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_usage.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_usage.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_hypervisor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_hypervisor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_hypervisor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_hypervisor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_hypervisor_stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_hypervisor_stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_aggregate.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_aggregate.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_hypervisor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_hypervisor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_keypair.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_keypair.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_keypair.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_keypair.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_console.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_console.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_hypervisor_stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_hypervisor_stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_event.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_event.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_usage.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_usage.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_agent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_agent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_host.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_host.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_aggregate.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_aggregate.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_console.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_console.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_host.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_host.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_usage.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_usage.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_agent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_agent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_event.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_event.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__pycache__/test_server_volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_agent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_agent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_host.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_host.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_server_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_aggregate.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/compute/v2/test_aggregate.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_role_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_role_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_project.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_project.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_project.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_project.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_role_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_role_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_role_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__pycache__/test_role_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_project.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_project.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/test_user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v2_0/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_service_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_service_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_endpoint_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_endpoint_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_unscoped_saml.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_unscoped_saml.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_credential.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_credential.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_oauth.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_oauth.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_service_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_service_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_consumer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_consumer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_mappings.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_mappings.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_registered_limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_registered_limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_service_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_service_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_oauth.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_oauth.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_domain.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_domain.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_access_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_access_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_project.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_project.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_role_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_role_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_credential.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_credential.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_domain.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_domain.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_role_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_role_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_trust.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_trust.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_protocol.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_protocol.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_protocol.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_protocol.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_registered_limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_registered_limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_trust.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_trust.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_access_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_access_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_identity_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_identity_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_project.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_project.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_region.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_region.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_identity_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_identity_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_mappings.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_mappings.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_unscoped_saml.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_unscoped_saml.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_unscoped_saml.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_unscoped_saml.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_endpoint_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_endpoint_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_endpoint_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_endpoint_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_region.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_region.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_application_credential.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_application_credential.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_application_credential.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_application_credential.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_consumer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_consumer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_credential.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_credential.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_implied_role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_implied_role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_implied_role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_implied_role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/__pycache__/test_endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_access_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_access_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_identity_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_identity_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_consumer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_consumer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_domain.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_domain.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_registered_limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_registered_limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_implied_role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_implied_role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_oauth.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_oauth.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_region.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_region.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_trust.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_trust.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_application_credential.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_application_credential.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_mappings.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_mappings.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_role_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_role_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_project.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_project.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_protocol.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/v3/test_protocol.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/identity/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/test_common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/test_common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/test_common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/test_common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/test_sdk_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/test_sdk_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/test_sdk_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/__pycache__/test_sdk_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/test_common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/test_common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/test_sdk_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/test_sdk_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_flavor_profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_flavor_profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_segment_range.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_segment_range.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_security_group_rule_compute.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_security_group_rule_compute.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_security_group_rule_network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_security_group_rule_network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_address_scope.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_address_scope.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_ip_availability.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_ip_availability.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_auto_allocated_topology.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_auto_allocated_topology.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_qos_rule_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_qos_rule_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_address_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_address_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_compute.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_compute.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_meter_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_meter_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_service_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_service_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_pool_compute.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_pool_compute.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_meter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_meter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_qos_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_qos_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_security_group_compute.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_security_group_compute.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_subnet_pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_subnet_pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_compute.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_compute.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_pool_network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_pool_network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_agent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_agent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_router.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_router.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_compute.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_compute.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_agent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_agent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_auto_allocated_topology.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_auto_allocated_topology.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_rule_network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_rule_network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_subnet.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_subnet.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_router.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_router.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_rule_compute.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_rule_compute.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_compute.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_compute.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_pool_network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_pool_network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_rbac.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_rbac.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_segment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_segment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_service_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_service_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_address_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_address_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_compute.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_compute.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_pool_compute.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_pool_compute.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_flavor_profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_flavor_profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_address_scope.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_address_scope.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_meter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_meter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_rule_network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_rule_network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_subnet_pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_subnet_pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_address_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_address_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_flavor_profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_flavor_profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_port_forwarding.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_port_forwarding.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_port_forwarding.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_port_forwarding.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_compute.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_compute.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_subnet.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_subnet.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_segment_range.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_segment_range.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_router.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_router.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_rule_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_rule_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_compute.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_compute.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_rule_compute.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_security_group_rule_compute.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_address_scope.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_address_scope.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_ip_availability.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_ip_availability.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_agent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_agent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_meter_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_meter_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_rbac.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_rbac.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_service_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_service_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_auto_allocated_topology.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_auto_allocated_topology.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_segment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_segment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_compute.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_compute.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_subnet_pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_subnet_pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_pool_compute.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_pool_compute.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_rule_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_qos_rule_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_pool_network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_floating_ip_pool_network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_ip_availability.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_ip_availability.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_meter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_meter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_meter_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_meter_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_segment_range.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/test_network_segment_range.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_segment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_segment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_rbac.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_rbac.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_security_group_network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_security_group_network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_subnet.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_subnet.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_port_forwarding.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_floating_ip_port_forwarding.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_qos_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/network/v2/test_network_qos_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/test_container.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/test_container.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/test_container_all.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/test_container_all.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/test_object.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/test_object.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/test_object_all.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/test_object_all.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_container_all.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_container_all.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_object.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_object.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_object.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_object.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_object_all.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_object_all.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_object_all.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_object_all.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_container.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_container.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_container.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_container.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_container_all.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/__pycache__/test_container_all.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/object/v1/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_volume_host.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_volume_host.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_transfer_request.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_transfer_request.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_volume_backend.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_volume_backend.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_volume_backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_volume_backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_backup_record.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_backup_record.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_consistency_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_consistency_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_qos_specs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_qos_specs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_backup_record.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_backup_record.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_transfer_request.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_transfer_request.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_backend.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_backend.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_host.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_host.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_backup_record.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_backup_record.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_consistency_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_consistency_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_transfer_request.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_transfer_request.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_consistency_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_consistency_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_consistency_group_snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_consistency_group_snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_backend.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_backend.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_qos_specs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_qos_specs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_qos_specs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_qos_specs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_host.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_volume_host.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_consistency_group_snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/__pycache__/test_consistency_group_snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_consistency_group_snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v2/test_consistency_group_snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v3/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__/test_find_resource.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__/test_find_resource.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__/test_find_resource.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__/test_find_resource.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/test_find_resource.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/test_find_resource.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_volume_backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_volume_backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_qos_specs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_qos_specs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_volume_backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_volume_backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_transfer_request.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_transfer_request.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_transfer_request.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_transfer_request.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_qos_specs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/__pycache__/test_qos_specs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_qos_specs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_qos_specs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_transfer_request.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_transfer_request.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_volume_backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/volume/v1/test_volume_backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/test_project.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/test_project.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/test_shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/test_shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/test_shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/test_shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/test_project.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/__pycache__/test_project.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/test_project.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/test_project.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/test_shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/integ/cli/test_shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/test_shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/test_shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_image_v1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_image_v1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_image_v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_image_v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_object_store_v1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_object_store_v1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_api.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_api.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_api.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_api.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_compute_v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_compute_v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_image_v1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_image_v1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_object_store_v1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_object_store_v1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_image_v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_image_v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_object_store_v1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_object_store_v1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_image_v1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_image_v1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_image_v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_image_v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_compute_v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/__pycache__/test_compute_v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_api.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_api.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_compute_v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/api/test_compute_v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_progressbar.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_progressbar.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_project_cleanup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_project_cleanup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_command.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_command.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_logs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_logs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_parseractions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_parseractions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_progressbar.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_progressbar.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_quota.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_quota.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_logs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_logs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_module.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_module.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_configuration.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_configuration.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_logs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_logs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_parseractions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_parseractions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_clientmanager.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_clientmanager.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_parseractions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_parseractions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_progressbar.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_progressbar.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_project_purge.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_project_purge.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_configuration.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_configuration.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_project_cleanup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_project_cleanup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_command.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_command.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_project_cleanup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_project_cleanup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_command.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_command.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_module.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_module.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_project_purge.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_project_purge.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_quota.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_quota.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_clientmanager.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_clientmanager.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/__pycache__/test_extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_configuration.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_configuration.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_project_purge.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_project_purge.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_quota.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_quota.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_clientmanager.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_clientmanager.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_module.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/common/test_module.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/test_image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/test_image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/test_image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/test_image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/test_image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v1/test_image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/test_image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/test_image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/test_image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/test_image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/test_image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/tests/unit/image/v2/test_image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__/image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__/image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__/image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/__pycache__/image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v2/image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__/image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__/image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__/image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/__pycache__/image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/image/v1/image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_rbac.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_rbac.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_service_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_service_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/security_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/security_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_segment_range.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_segment_range.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/security_group_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/security_group_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_qos_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_qos_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/subnet_pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/subnet_pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/address_scope.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/address_scope.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/floating_ip_pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/floating_ip_pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_auto_allocated_topology.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_auto_allocated_topology.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_flavor_profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_flavor_profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/address_scope.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/address_scope.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip_pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip_pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip_pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip_pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/address_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/address_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_segment_range.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_segment_range.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_flavor_profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_flavor_profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_auto_allocated_topology.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_auto_allocated_topology.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/subnet.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/subnet.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/address_scope.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/address_scope.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip_port_forwarding.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip_port_forwarding.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/subnet_pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/subnet_pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_meter.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_meter.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_meter.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_meter.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_service_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_service_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/security_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/security_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/ip_availability.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/ip_availability.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_flavor_profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_flavor_profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip_port_forwarding.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip_port_forwarding.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_rbac.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_rbac.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_rule_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_rule_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_segment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_segment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_segment_range.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_segment_range.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/subnet_pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/subnet_pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_agent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_agent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/ip_availability.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/ip_availability.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_agent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_agent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_auto_allocated_topology.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_auto_allocated_topology.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_meter_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_meter_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_rule_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_rule_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/subnet.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/subnet.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/security_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/security_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/address_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/address_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_meter_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_meter_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_qos_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_service_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_service_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/router.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/router.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/security_group_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/security_group_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_rbac.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_rbac.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_segment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_segment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/floating_ip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/network_flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/router.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/router.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/security_group_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__pycache__/security_group_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_meter.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_meter.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/subnet.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/subnet.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_meter_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_meter_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/router.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/router.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/floating_ip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/floating_ip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/floating_ip_port_forwarding.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/floating_ip_port_forwarding.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_agent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_agent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_qos_rule_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_qos_rule_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_segment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_segment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/address_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/address_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/ip_availability.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/ip_availability.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_qos_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/v2/network_qos_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/sdk_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/sdk_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/sdk_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/sdk_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/sdk_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/sdk_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/network/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_transfer_request.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_transfer_request.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_host.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_host.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_transfer_request.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_transfer_request.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/consistency_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/consistency_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_transfer_request.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_transfer_request.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/consistency_group_snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/consistency_group_snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/qos_specs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/qos_specs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/backup_record.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/backup_record.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/consistency_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/consistency_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_backend.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_backend.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_backend.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_backend.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_host.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/volume_host.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/backup_record.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/backup_record.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/consistency_group_snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/consistency_group_snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/qos_specs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__pycache__/qos_specs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/consistency_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/consistency_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/qos_specs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/qos_specs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/consistency_group_snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/consistency_group_snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_backend.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_backend.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_host.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_host.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/volume_snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/backup_record.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v2/backup_record.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/qos_specs.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/qos_specs.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_transfer_request.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_transfer_request.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/qos_specs.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/qos_specs.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_transfer_request.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__pycache__/volume_transfer_request.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume_backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume_backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume_snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume_snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/qos_specs.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/qos_specs.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume_transfer_request.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume_transfer_request.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/volume/v1/volume_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/api.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/api.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/compute_v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/compute_v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/image_v1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/image_v1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/image_v2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/image_v2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/object_store_v1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/object_store_v1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/api.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/api.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/compute_v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/compute_v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/image_v1.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/image_v1.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/image_v2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/image_v2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/object_store_v1.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/object_store_v1.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/api.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/api.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/compute_v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/compute_v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/image_v1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/image_v1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/image_v2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/image_v2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/object_store_v1.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/api/object_store_v1.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/agent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/agent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/console.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/console.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_event.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_event.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/host.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/host.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/keypair.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/keypair.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/aggregate.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/aggregate.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/hypervisor_stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/hypervisor_stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/usage.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/usage.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_event.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_event.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_event.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_event.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/keypair.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/keypair.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/keypair.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/keypair.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/hypervisor_stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/hypervisor_stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/hypervisor_stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/hypervisor_stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/usage.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/usage.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/agent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/agent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/aggregate.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/aggregate.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/console.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/console.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/host.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/host.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/agent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/agent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/aggregate.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/aggregate.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/console.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/console.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/host.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/host.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server_volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/usage.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/usage.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/hypervisor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/hypervisor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/hypervisor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/hypervisor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/__pycache__/server.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/hypervisor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/hypervisor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/v2/server_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/compute/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/ec2creds.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/ec2creds.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/project.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/project.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/role_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/role_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/project.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/project.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/role_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/role_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/ec2creds.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/ec2creds.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/project.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/project.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/role_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/role_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/ec2creds.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/ec2creds.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v2_0/endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/access_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/access_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/project.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/project.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/identity_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/identity_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/registered_limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/registered_limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/role_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/role_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/unscoped_saml.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/unscoped_saml.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/catalog.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/catalog.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/implied_role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/implied_role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/registered_limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/registered_limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/ec2creds.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/ec2creds.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/endpoint_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/endpoint_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/implied_role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/implied_role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/tag.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/tag.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/application_credential.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/application_credential.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/project.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/project.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/identity_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/identity_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/region.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/region.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/unscoped_saml.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/unscoped_saml.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/catalog.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/catalog.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/credential.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/credential.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/domain.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/domain.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/consumer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/consumer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/tag.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/tag.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/credential.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/credential.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/endpoint_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/endpoint_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/identity_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/identity_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/mapping.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/mapping.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/project.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/project.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/service_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/service_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/token.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/token.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/access_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/access_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/application_credential.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/application_credential.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/token.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/token.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/unscoped_saml.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/unscoped_saml.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/access_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/access_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/consumer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/consumer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/service_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/service_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/trust.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/trust.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/ec2creds.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/ec2creds.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/federation_protocol.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/federation_protocol.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/role_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/role_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/trust.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/trust.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/domain.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/domain.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/federation_protocol.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/federation_protocol.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/registered_limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/registered_limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/role_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/role_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/mapping.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/mapping.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/region.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/__pycache__/region.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/credential.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/credential.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/domain.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/domain.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/mapping.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/mapping.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/service_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/service_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/catalog.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/catalog.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/ec2creds.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/ec2creds.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/federation_protocol.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/federation_protocol.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/consumer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/consumer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/tag.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/tag.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/token.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/token.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/endpoint_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/endpoint_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/region.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/region.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/trust.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/trust.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/application_credential.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/application_credential.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/implied_role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/implied_role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/v3/limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstackclient/identity/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/base64.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/base64.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/base64.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/base64.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/jsonutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/jsonutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/jsonutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/jsonutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/msgpackutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/msgpackutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/msgpackutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/msgpackutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/base64.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/base64.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/jsonutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/jsonutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/msgpackutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/msgpackutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/msgpack_serializer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/msgpack_serializer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/base_serializer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/base_serializer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/base_serializer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/base_serializer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/json_serializer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/json_serializer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/json_serializer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/json_serializer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/msgpack_serializer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__pycache__/msgpack_serializer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/base_serializer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/base_serializer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/json_serializer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/json_serializer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/msgpack_serializer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/msgpack_serializer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/oslo_serialization/serializer/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_services_mixin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_services_mixin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/connection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/connection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/connection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/connection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/format.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/format.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/resource.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/resource.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_hacking.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_hacking.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/service_description.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/service_description.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/__main__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/__main__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_hacking.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_hacking.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_log.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_log.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_services_mixin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_services_mixin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/resource.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/resource.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_log.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/_log.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/format.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/format.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/service_description.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/service_description.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/__main__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__pycache__/__main__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/accelerator_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/accelerator_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/deployable.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/deployable.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/device.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/device.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/device_profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/device_profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/accelerator_request.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/accelerator_request.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/deployable.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/deployable.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/device.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/device.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/device.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/device.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/accelerator_request.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/accelerator_request.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/deployable.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/deployable.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/device_profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/device_profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/device_profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/__pycache__/device_profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/accelerator_request.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/v2/accelerator_request.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/accelerator_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/accelerator_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/accelerator_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/accelerator/__pycache__/accelerator_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/_base_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/_base_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/_base_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/_base_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/block_storage_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/block_storage_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/block_storage_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/__pycache__/block_storage_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/_base_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/_base_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/block_storage_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/block_storage_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v2/volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/volume.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/volume.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/snapshot.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/snapshot.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/snapshot.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/snapshot.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/volume.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/volume.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/backup.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/backup.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/backup.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/backup.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/volume.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/volume.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/__pycache__/type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/backup.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/backup.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/snapshot.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/block_storage/v3/snapshot.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/network_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/network_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/network_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/network_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/network_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/network_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/service_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/service_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/subnet.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/subnet.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/_base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/_base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/floating_ip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/floating_ip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/load_balancer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/load_balancer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/metering_label.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/metering_label.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/vpn_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/vpn_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/agent.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/agent.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/health_monitor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/health_monitor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/listener.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/listener.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_minimum_bandwidth_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_minimum_bandwidth_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/load_balancer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/load_balancer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_bandwidth_limit_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_bandwidth_limit_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_dscp_marking_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_dscp_marking_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/quota.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/quota.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/security_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/security_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/quota.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/quota.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/_base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/_base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/health_monitor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/health_monitor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network_ip_availability.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network_ip_availability.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/pool_member.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/pool_member.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_rule_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_rule_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/service_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/service_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/subnet.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/subnet.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/agent.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/agent.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/metering_label.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/metering_label.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network_segment_range.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network_segment_range.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/listener.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/listener.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/port_forwarding.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/port_forwarding.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/segment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/segment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/trunk.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/trunk.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/floating_ip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/floating_ip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/port_forwarding.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/port_forwarding.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_dscp_marking_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_dscp_marking_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/security_group_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/security_group_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/security_group_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/security_group_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/vpn_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/vpn_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/floating_ip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/floating_ip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/metering_label.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/metering_label.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_minimum_bandwidth_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_minimum_bandwidth_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/router.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/router.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/auto_allocated_topology.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/auto_allocated_topology.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_minimum_bandwidth_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_minimum_bandwidth_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/health_monitor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/health_monitor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/subnet.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/subnet.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/metering_label_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/metering_label_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/pool_member.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/pool_member.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/segment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/segment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/service_profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/service_profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/subnet_pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/subnet_pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/address_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/address_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/firewall_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/load_balancer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/load_balancer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_rule_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_rule_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/rbac_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/rbac_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/security_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/security_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/auto_allocated_topology.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/auto_allocated_topology.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network_segment_range.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network_segment_range.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/address_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/address_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_bandwidth_limit_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_bandwidth_limit_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/rbac_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/rbac_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/subnet_pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/subnet_pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/address_scope.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/address_scope.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/agent.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/agent.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/listener.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/listener.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/metering_label_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/metering_label_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/qos_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/router.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/router.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/trunk.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/trunk.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/vpn_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/vpn_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/_base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/_base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/address_scope.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/address_scope.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network_ip_availability.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/network_ip_availability.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/service_profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/service_profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/service_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/__pycache__/service_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/network_ip_availability.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/network_ip_availability.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/router.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/router.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/security_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/security_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_rule_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_rule_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/rbac_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/rbac_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/auto_allocated_topology.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/auto_allocated_topology.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/firewall_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/firewall_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/firewall_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/firewall_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/pool_member.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/pool_member.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/port_forwarding.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/port_forwarding.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_dscp_marking_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_dscp_marking_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/segment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/segment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/trunk.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/trunk.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/address_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/address_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/metering_label_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/metering_label_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/network_segment_range.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/network_segment_range.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_bandwidth_limit_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_bandwidth_limit_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/service_profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/service_profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/address_scope.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/address_scope.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/qos_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/quota.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/quota.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/firewall_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/firewall_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/security_group_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/security_group_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/subnet_pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/v2/subnet_pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/network/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/software_config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/software_config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/stack_environment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/stack_environment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/stack_files.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/stack_files.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/stack_template.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/stack_template.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/software_deployment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/software_deployment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_environment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_environment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_template.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_template.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_environment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_environment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_files.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_files.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/resource.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/resource.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/resource.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/resource.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/software_config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/software_config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_template.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_template.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/software_config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/software_config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/software_deployment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/software_deployment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_files.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/stack_files.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/template.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/template.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/template.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/template.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/resource.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/resource.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/software_deployment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/software_deployment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/stack.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/stack.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/template.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/v1/template.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/orchestration_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/orchestration_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/orchestration_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/orchestration_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/orchestration_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/orchestration_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/template_format.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/template_format.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/template_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/template_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/event_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/event_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/template_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/template_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/template_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/template_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/environment_format.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/environment_format.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/environment_format.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/environment_format.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/event_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/event_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/template_format.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/template_format.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/template_format.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/__pycache__/template_format.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/environment_format.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/environment_format.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/event_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/orchestration/util/event_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/_log.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/_log.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__/instance_ha_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__/instance_ha_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__/instance_ha_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/__pycache__/instance_ha_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/instance_ha_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/instance_ha_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/segment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/segment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/notification.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/notification.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/segment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/segment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/host.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/host.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/host.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/host.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/notification.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__pycache__/notification.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/host.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/host.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/notification.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/notification.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/segment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/segment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/instance_ha/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/message_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/message_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/message_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/__pycache__/message_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/message_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/message_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/subscription.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/subscription.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/claim.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/claim.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/subscription.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/subscription.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/claim.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/claim.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/message.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/message.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/message.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/message.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/queue.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/queue.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/queue.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/queue.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/subscription.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/__pycache__/subscription.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/claim.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/claim.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/message.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/message.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/queue.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/message/v2/queue.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/_services_mixin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/_services_mixin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__/baremetal_introspection_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__/baremetal_introspection_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__/baremetal_introspection_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__/baremetal_introspection_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/baremetal_introspection_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/baremetal_introspection_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/introspection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/introspection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/introspection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/introspection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/introspection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/v1/introspection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal_introspection/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/resource.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/resource.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/workflow_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/workflow_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/workflow_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/__pycache__/workflow_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/workflow.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/workflow.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/workflow.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/workflow.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/execution.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/execution.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/execution.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/execution.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/workflow.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/__pycache__/workflow.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/execution.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/v2/execution.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/workflow_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/workflow/workflow_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__main__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/__main__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/_hacking.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/_hacking.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/compute_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/compute_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/compute_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/compute_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/compute_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/compute_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_ip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_ip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_remote_console.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_remote_console.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/volume_attachment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/volume_attachment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_remote_console.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_remote_console.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/aggregate.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/aggregate.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/keypair.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/keypair.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/keypair.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/keypair.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_diagnostics.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_diagnostics.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_diagnostics.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_diagnostics.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_remote_console.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_remote_console.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/volume_attachment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/volume_attachment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/aggregate.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/aggregate.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_interface.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_interface.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_interface.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_interface.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_ip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_ip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/volume_attachment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/volume_attachment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/metadata.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/metadata.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/hypervisor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/hypervisor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/hypervisor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/hypervisor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/metadata.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/metadata.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_ip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__pycache__/server_ip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_interface.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_interface.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_diagnostics.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_diagnostics.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/server_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/metadata.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/metadata.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/keypair.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/keypair.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/aggregate.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/aggregate.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/hypervisor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/v2/hypervisor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/compute/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__/database_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__/database_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__/database_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__/database_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/database_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/database_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/database.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/database.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/instance.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/instance.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/instance.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/instance.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/database.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/database.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/database.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/database.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/instance.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/database/v1/instance.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__/dns_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__/dns_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__/dns_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/__pycache__/dns_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/dns_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/dns_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/_base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/_base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/floating_ip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/floating_ip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/recordset.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/recordset.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/zone_import.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/zone_import.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_import.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_import.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/_base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/_base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/floating_ip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/floating_ip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/recordset.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/recordset.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_transfer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_transfer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_transfer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_transfer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/_base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/_base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/recordset.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/recordset.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_export.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_export.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_export.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_export.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_import.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/zone_import.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/floating_ip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/__pycache__/floating_ip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/zone_export.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/zone_export.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/zone_transfer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/dns/v2/zone_transfer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/service_description.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/service_description.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/schema.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/schema.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendor-schema.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendor-schema.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/catalyst.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/catalyst.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/dreamcompute.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/dreamcompute.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/ovh-us.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/ovh-us.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/ovh.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/ovh.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/rackspace.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/rackspace.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/unitedstack.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/unitedstack.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/zetta.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/zetta.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/entercloudsuite.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/entercloudsuite.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/fuga.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/fuga.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/limestonenetworks.yaml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/limestonenetworks.yaml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/otc.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/otc.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/switchengines.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/switchengines.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/ultimum.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/ultimum.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/auro.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/auro.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/betacloud.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/betacloud.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/bluebox.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/bluebox.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/ibmcloud.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/ibmcloud.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/internap.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/internap.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/vexxhost.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/vexxhost.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/citycloud.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/citycloud.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/conoha.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/conoha.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/elastx.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/vendors/elastx.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/loader.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/loader.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/_util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/_util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/cloud_config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/cloud_config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/cloud_region.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/cloud_region.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/defaults.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/defaults.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/defaults.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/defaults.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/_util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/_util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/cloud_config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/cloud_config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/cloud_region.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/cloud_region.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/cloud_region.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/cloud_region.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/loader.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/loader.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/_util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/_util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/defaults.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/defaults.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/defaults.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/defaults.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/loader.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/loader.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/cloud_config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/cloud_config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/config/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/identity_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/identity_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/tenant.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/tenant.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/tenant.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/tenant.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/tenant.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/tenant.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/project.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/project.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/trust.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/trust.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/user.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/user.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/credential.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/credential.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/domain.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/domain.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/mapping.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/mapping.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/region.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/region.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/credential.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/credential.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/project.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/project.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_domain_user_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_domain_user_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_project_group_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_project_group_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_project_user_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_project_user_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/endpoint.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/endpoint.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/registered_limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/registered_limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_domain_group_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_domain_group_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_domain_group_assignment.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_domain_group_assignment.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/user.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/user.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/application_credential.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/application_credential.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/identity_provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/identity_provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/registered_limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/registered_limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/mapping.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/mapping.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_project_group_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_project_group_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/federation_protocol.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/federation_protocol.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/limit.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/limit.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/identity_provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/identity_provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/trust.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/trust.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/project.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/project.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/region.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/region.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/application_credential.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/application_credential.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/endpoint.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/endpoint.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/federation_protocol.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/federation_protocol.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_domain_user_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_domain_user_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_project_user_assignment.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/role_project_user_assignment.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/domain.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/domain.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/limit.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/limit.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__pycache__/policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/domain.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/domain.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/endpoint.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/endpoint.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/trust.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/trust.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/user.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/user.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/federation_protocol.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/federation_protocol.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/registered_limit.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/registered_limit.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_project_group_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_project_group_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/identity_provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/identity_provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/mapping.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/mapping.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_domain_group_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_domain_group_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_domain_user_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_domain_user_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_project_user_assignment.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role_project_user_assignment.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/application_credential.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/application_credential.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/credential.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/credential.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/region.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/region.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/v3/role.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/identity_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/identity_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/identity_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/identity_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/identity/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/load_balancer_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/load_balancer_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/listener.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/listener.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/load_balancer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/load_balancer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/member.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/member.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/provider.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/provider.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/quota.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/quota.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/load_balancer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/load_balancer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/provider.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/provider.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/availability_zone_profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/availability_zone_profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/health_monitor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/health_monitor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/health_monitor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/health_monitor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/pool.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/pool.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/quota.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/quota.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/flavor_profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/flavor_profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/l7_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/l7_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/load_balancer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/load_balancer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/provider.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/provider.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/amphora.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/amphora.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/flavor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/flavor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/l7_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/l7_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/listener.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/listener.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/pool.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/pool.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/availability_zone_profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/availability_zone_profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/member.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/member.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/quota.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/quota.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/amphora.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/amphora.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/flavor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/flavor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/flavor_profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/flavor_profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/l7_rule.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/l7_rule.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/listener.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/listener.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/member.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/member.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/l7_rule.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/__pycache__/l7_rule.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/l7_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/l7_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/flavor_profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/flavor_profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/health_monitor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/health_monitor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/flavor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/flavor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/amphora.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/amphora.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/pool.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/pool.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/availability_zone_profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/availability_zone_profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/l7_rule.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/v2/l7_rule.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/load_balancer_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/load_balancer_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/load_balancer_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/load_balancer_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/load_balancer/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__/object_store_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__/object_store_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__/object_store_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/__pycache__/object_store_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/object_store_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/object_store_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/container.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/container.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/_base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/_base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/info.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/info.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/obj.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/obj.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/account.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/account.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/obj.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/obj.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/_base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/_base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/account.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/account.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/container.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/container.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/info.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__pycache__/info.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/_base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/_base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/account.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/account.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/container.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/container.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/info.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/info.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/obj.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/obj.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/object_store/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/baremetal_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/baremetal_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/baremetal_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/baremetal_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/configdrive.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/configdrive.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/configdrive.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/configdrive.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/baremetal_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/baremetal_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/configdrive.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/configdrive.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/port.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/port.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/volume_target.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/volume_target.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/deploy_templates.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/deploy_templates.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/driver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/driver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/port_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/port_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/chassis.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/chassis.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/conductor.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/conductor.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/node.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/node.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/node.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/node.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/volume_connector.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/volume_connector.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/_common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/_common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/_common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/_common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/allocation.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/allocation.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/chassis.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/chassis.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/deploy_templates.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/deploy_templates.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/driver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/driver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/port_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/port_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/volume_connector.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/volume_connector.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/conductor.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/conductor.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/port.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/port.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/port.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/port.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/volume_target.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/volume_target.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/volume_target.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/volume_target.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/allocation.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/__pycache__/allocation.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/chassis.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/chassis.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/conductor.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/conductor.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/node.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/node.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/allocation.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/allocation.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/port_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/port_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/volume_connector.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/volume_connector.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/deploy_templates.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/deploy_templates.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/_common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/_common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/driver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/v1/driver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/baremetal/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/clustering_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/clustering_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/clustering_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/clustering_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/clustering_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/clustering_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/_async_resource.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/_async_resource.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/action.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/action.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/action.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/action.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster_policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster_policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster_attr.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster_attr.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/_async_resource.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/_async_resource.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/_async_resource.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/_async_resource.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/event.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/event.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/profile.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/profile.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/node.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/node.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/receiver.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/receiver.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/build_info.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/build_info.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/build_info.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/build_info.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/node.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/node.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster_policy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster_policy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/event.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/event.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/policy_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/policy_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/profile.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/profile.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/profile_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/profile_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/policy_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/policy_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/receiver.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/receiver.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster_attr.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/cluster_attr.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/policy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/policy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/profile_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/__pycache__/profile_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/cluster.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/cluster.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/cluster_attr.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/cluster_attr.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/cluster_policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/cluster_policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/node.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/node.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/action.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/action.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/build_info.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/build_info.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/profile.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/profile.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/profile_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/profile_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/receiver.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/receiver.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/policy_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/policy_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/event.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/event.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/policy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/clustering/v1/policy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__/connection.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__/connection.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__/connection.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__/connection.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/connection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/fixture/connection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/secret.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/secret.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/order.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/order.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/order.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/order.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/_format.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/_format.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/_format.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/_format.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/secret.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/secret.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/container.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/container.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/container.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/container.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/secret.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/__pycache__/secret.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/_format.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/_format.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/container.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/container.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/order.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/v1/order.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__/key_manager_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__/key_manager_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__/key_manager_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/__pycache__/key_manager_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/key_manager_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/key_manager/key_manager_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__/shared_file_system_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__/shared_file_system_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__/shared_file_system_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/__pycache__/shared_file_system_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/shared_file_system_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/shared_file_system_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/availability_zone.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/availability_zone.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/availability_zone.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/availability_zone.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/availability_zone.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/availability_zone.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/shared_file_system/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_identity.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_identity.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_network.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_network.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_shared_file_system.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_shared_file_system.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/inventory.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/inventory.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__/inventory.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__/inventory.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__/inventory.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/cmd/__pycache__/inventory.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/meta.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/meta.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_baremetal.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_baremetal.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_block_storage.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_block_storage.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_coe.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_coe.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/openstackcloud.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/openstackcloud.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_orchestration.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_orchestration.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_security_group.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_security_group.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_accelerator.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_accelerator.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_network_common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_network_common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_object_store.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_object_store.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_security_group.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_security_group.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/meta.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/meta.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_baremetal.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_baremetal.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_clustering.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_clustering.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_coe.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_coe.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_identity.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_identity.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_network_common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_network_common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_security_group.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_security_group.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/exc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/exc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_dns.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_dns.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_dns.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_dns.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_floating_ip.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_floating_ip.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/openstackcloud.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/openstackcloud.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_compute.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_compute.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_floating_ip.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_floating_ip.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_identity.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_identity.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_network.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_network.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/exc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/exc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_accelerator.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_accelerator.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_clustering.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_clustering.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_coe.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_coe.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/inventory.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/inventory.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_network.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_network.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_object_store.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_object_store.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_orchestration.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_orchestration.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/meta.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/meta.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_block_storage.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_block_storage.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_compute.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_compute.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_accelerator.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_accelerator.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_baremetal.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_baremetal.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/inventory.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/inventory.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_block_storage.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_block_storage.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_network_common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_network_common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_shared_file_system.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_shared_file_system.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_object_store.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_object_store.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_orchestration.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_orchestration.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_shared_file_system.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_shared_file_system.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_normalize.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_normalize.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_normalize.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/_normalize.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/openstackcloud.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__pycache__/openstackcloud.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_clustering.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_clustering.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_normalize.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_normalize.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_floating_ip.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_floating_ip.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/exc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/exc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/inventory.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/inventory.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_compute.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_compute.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_dns.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/cloud/_dns.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/connection.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/connection.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/format.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/format.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/__pycache__/image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v1/image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/_base_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/_base_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/_download.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/_download.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/image_signer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/image_signer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/iterable_chunked_file.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/iterable_chunked_file.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/image.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/image.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/member.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/member.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/schema.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/schema.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/task.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/task.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/schema.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/schema.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/service_info.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/service_info.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/task.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/task.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/image.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/image.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/member.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/member.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/service_info.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/service_info.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/_proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/_proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/image.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/image.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/member.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/member.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/schema.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/schema.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/service_info.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/service_info.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/task.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/task.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/_base_proxy.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/_base_proxy.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/_download.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/_download.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/image_service.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/image_service.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/image_signer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/image_signer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/image_signer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/image_signer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/iterable_chunked_file.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/iterable_chunked_file.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/_base_proxy.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/_base_proxy.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/_download.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/_download.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/image_service.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/image_service.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/iterable_chunked_file.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/__pycache__/iterable_chunked_file.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/image_service.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/image/image_service.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/proxy.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/proxy.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/py.typed\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/openstack/py.typed: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/options.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/options.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/testr_command.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/testr_command.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/git.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/git.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/packaging.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/packaging.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/version.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/version.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/builddoc.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/builddoc.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/find_package.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/find_package.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/core.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/core.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/pbr_json.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/pbr_json.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/sphinxext.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/sphinxext.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/find_package.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/find_package.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/git.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/git.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/packaging.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/packaging.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/pbr_json.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/pbr_json.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/pbr_json.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/pbr_json.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/sphinxext.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/sphinxext.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/builddoc.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/builddoc.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/extra_files.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/extra_files.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/version.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/version.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/core.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/core.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/core.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/core.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/git.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/git.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/util.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/util.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/options.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/options.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/packaging.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/packaging.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/testr_command.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/testr_command.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/builddoc.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/builddoc.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/find_package.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/find_package.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/sphinxext.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/sphinxext.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/testr_command.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/testr_command.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/util.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/util.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/version.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/version.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/extra_files.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/extra_files.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/options.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/__pycache__/options.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/metadata.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/metadata.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/backwards.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/backwards.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/commands.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/commands.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/commands.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/commands.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/files.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/files.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/backwards.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/backwards.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/files.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/files.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/metadata.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/metadata.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/metadata.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/__pycache__/metadata.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/backwards.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/backwards.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/commands.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/commands.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/files.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/hooks/files.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/util.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/util.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/main.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/main.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__/main.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__/main.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__/main.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/cmd/__pycache__/main.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/extra_files.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pbr/extra_files.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/ansi.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/ansi.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/argparse_custom.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/argparse_custom.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/py_bridge.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/py_bridge.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/transcript.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/transcript.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/history.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/history.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/rl_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/rl_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/decorators.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/decorators.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/py_bridge.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/py_bridge.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/cmd2.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/cmd2.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/cmd2.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/cmd2.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/command_definition.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/command_definition.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/constants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/constants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/transcript.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/transcript.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/parsing.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/parsing.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/parsing.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/parsing.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/rl_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/rl_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/command_definition.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/command_definition.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/history.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/history.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/py_bridge.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/py_bridge.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/table_creator.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/table_creator.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/ansi.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/ansi.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/ansi.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/ansi.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/argparse_completer.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/argparse_completer.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/clipboard.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/clipboard.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/decorators.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/decorators.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/transcript.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/transcript.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/argparse_custom.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/argparse_custom.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/argparse_custom.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/argparse_custom.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/table_creator.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/table_creator.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/argparse_completer.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/argparse_completer.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/clipboard.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/clipboard.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/constants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/constants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/clipboard.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/clipboard.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/rl_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/rl_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/argparse_completer.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/argparse_completer.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/command_definition.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/command_definition.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/decorators.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/decorators.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/parsing.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/parsing.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/table_creator.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/table_creator.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/cmd2.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/cmd2.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/constants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/constants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/history.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/cmd2/history.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jmespath-0.10.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyOpenSSL-20.0.1-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/REQUESTED\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/REQUESTED: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/WHEEL\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/WHEEL: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/INSTALLER\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/INSTALLER: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/METADATA\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/METADATA: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/RECORD\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/pyparsing-2.4.7.dist-info/RECORD: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney-0.6.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/jeepney-0.6.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/entry_points.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/entry_points.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/requires.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/netaddr-0.8.0-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/_i18n.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/_i18n.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/_i18n.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/_i18n.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/cliutils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/cliutils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/httpclient.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/httpclient.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/httpclient.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/httpclient.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/cliutils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/cliutils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/constants.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/constants.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/constants.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__pycache__/constants.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/_i18n.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/_i18n.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/apiclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/cliutils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/cliutils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/constants.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/constants.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/httpclient.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/httpclient.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_networks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_networks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_scheduler_stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_scheduler_stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_shares.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_shares.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_shares_listing.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_shares_listing.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_shares_metadata.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_shares_metadata.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_security_services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_security_services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_servers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_servers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_snapshot_instances_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_snapshot_instances_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_snapshots_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_snapshots_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_common.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_common.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_messages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_messages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_scheduler_stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_scheduler_stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_replica_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_replica_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_availability_zones.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_availability_zones.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares_listing.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares_listing.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_common.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_common.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_instances.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_instances.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_servers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_servers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_messages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_messages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares_listing.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares_listing.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares_metadata.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares_metadata.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_network_subnets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_network_subnets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_common.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_common.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_servers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_servers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_networks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_networks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshots_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshots_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_scheduler_stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_scheduler_stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_security_services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_security_services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_instances.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_instances.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_replica_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_replica_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_messages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_messages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_security_services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_security_services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_network_subnets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_network_subnets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares_metadata.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_shares_metadata.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_instances_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_instances_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshots_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshots_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_instances_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_snapshot_instances_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_availability_zones.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_availability_zones.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_networks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__pycache__/test_share_networks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_replica_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_replica_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_network_subnets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_network_subnets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_snapshot_instances.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_snapshot_instances.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/test_shares.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/test_shares.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/test_shares.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__pycache__/test_shares.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/test_shares.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/test_shares.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_availability_zones.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_availability_zones.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_snapshot_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_snapshot_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/functional/test_share_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_api_versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_api_versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_api_versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_api_versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_functional_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_functional_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_functional_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_functional_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_api_versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__pycache__/test_api_versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/apiclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/test_httpclient.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/test_httpclient.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__/test_httpclient.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__/test_httpclient.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__/test_httpclient.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/common/__pycache__/test_httpclient.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_functional_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_functional_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/osc_utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/osc_utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/osc_utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/osc_utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/osc_fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/osc_fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/osc_fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/__pycache__/osc_fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/osc_fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/osc_fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/osc_utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/osc_utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_messages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_messages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share_access_rules.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share_access_rules.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share_type.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share_type.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/test_share_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_access_rules.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_access_rules.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_access_rules.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_access_rules.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_type.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_type.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_messages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_messages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_messages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_messages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_type.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_type.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/test_share_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/osc/v2/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/test_base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_scheduler_stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_scheduler_stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_security_services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_security_services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_networks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_networks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_servers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_servers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_quota_classes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_quota_classes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_share_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_shares.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_shares.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_networks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_networks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_scheduler_stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_scheduler_stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_networks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_networks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_servers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_servers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_shares.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_shares.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_quota_classes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_quota_classes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_security_services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_security_services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_shares.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_shares.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_scheduler_stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_scheduler_stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_servers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_servers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_share_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_quota_classes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_quota_classes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_security_services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/__pycache__/test_security_services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v1/test_quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_servers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_servers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_group_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_group_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_networks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_networks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_quota_classes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_quota_classes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_security_services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_security_services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_snapshot_instances.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_snapshot_instances.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_availability_zones.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_availability_zones.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_network_subnets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_network_subnets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_networks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_networks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_security_services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_security_services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_groups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_groups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_instance_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_instance_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_networks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_networks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_instances.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_instances.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/fake_clients.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/fake_clients.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_quota_classes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_quota_classes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_messages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_messages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_servers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_servers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_instances.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_instances.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/fake_clients.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/fake_clients.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/fakes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/fakes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_groups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_groups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/fakes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/fakes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_servers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_servers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_replicas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_replicas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_instance_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_instance_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_shares.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_shares.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_replica_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_replica_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_shares.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_shares.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_quota_classes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_quota_classes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_scheduler_stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_scheduler_stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_security_services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_security_services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_instance_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_instance_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_network_subnets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_network_subnets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_replica_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_replica_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_availability_zones.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_availability_zones.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_instances.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_instances.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_instance_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_snapshot_instance_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_messages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_messages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_instances.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_instances.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_scheduler_stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_scheduler_stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_group_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_replicas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__pycache__/test_share_replicas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_availability_zones.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_availability_zones.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_snapshot_instance_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_snapshot_instance_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_messages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_messages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_instances.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_instances.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_instance_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_instance_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_network_subnets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_network_subnets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_replicas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_replicas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_shares.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_shares.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/fake_clients.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/fake_clients.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_group_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_group_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_group_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_group_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_snapshot_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_snapshot_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/fakes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/fakes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_scheduler_stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_scheduler_stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_replica_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_replica_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_groups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/tests/unit/v2/test_share_groups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__/list_extensions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__/list_extensions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__/list_extensions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/__pycache__/list_extensions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/list_extensions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/contrib/list_extensions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_networks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_networks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/scheduler_stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/scheduler_stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/scheduler_stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/scheduler_stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/shares.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/shares.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/quota_classes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/quota_classes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/security_services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/security_services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_servers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_servers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/security_services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/security_services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_networks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_networks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/scheduler_stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/scheduler_stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_networks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_networks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/quota_classes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/quota_classes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_servers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/share_servers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/shares.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/shares.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/quota_classes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/quota_classes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/security_services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/security_services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_servers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_servers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/share_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/shares.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v1/shares.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/base.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/base.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/config.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/config.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/scheduler_stats.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/scheduler_stats.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_group_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_group_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_networks.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_networks.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_servers.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_servers.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/limits.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/limits.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_instances.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_instances.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/shares.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/shares.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_access_rules.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_access_rules.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_access_rules.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_access_rules.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/shares.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/shares.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_instance_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_instance_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_instance_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_instance_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/quota_classes.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/quota_classes.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_groups.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_groups.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_instance_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_instance_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_replicas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_replicas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/scheduler_stats.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/scheduler_stats.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_networks.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_networks.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_instances.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_instances.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/security_services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/security_services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_replica_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_replica_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_network_subnets.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_network_subnets.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_networks.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_networks.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_instance_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_instance_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/security_services.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/security_services.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_instances.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_instances.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/availability_zones.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/availability_zones.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/limits.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/limits.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/messages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/messages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_export_locations.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_export_locations.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/scheduler_stats.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/scheduler_stats.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/services.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/services.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_network_subnets.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_network_subnets.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_group_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_servers.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_servers.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/messages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/messages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_groups.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_groups.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_servers.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_servers.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_instances.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_snapshot_instances.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/availability_zones.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/availability_zones.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/quota_classes.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/quota_classes.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_replica_export_locations.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_replica_export_locations.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_replicas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__pycache__/share_replicas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_network_subnets.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_network_subnets.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/limits.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/limits.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_snapshot_instance_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_snapshot_instance_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_group_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_group_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_instance_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_instance_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_snapshot_instances.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_snapshot_instances.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/availability_zones.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/availability_zones.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_groups.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_groups.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_instances.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_instances.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_replica_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_replica_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/quota_classes.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/quota_classes.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/security_services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/security_services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_access_rules.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_access_rules.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_group_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_group_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_replicas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_replicas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_snapshot_export_locations.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_snapshot_export_locations.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/share_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/shares.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/shares.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/shell.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/shell.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__/list_extensions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__/list_extensions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__/list_extensions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/__pycache__/list_extensions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/list_extensions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/contrib/list_extensions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/messages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/messages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/services.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/v2/services.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/api_versions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/api_versions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/base.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/base.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/base.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/base.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/client.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/client.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/config.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/exceptions.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/exceptions.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/shell.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/shell.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/api_versions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/api_versions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/config.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/exceptions.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/exceptions.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/extension.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/extension.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/client.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/client.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/extension.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/extension.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/shell.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/shell.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/exceptions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/exceptions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/extension.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/extension.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/releasenotes\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/releasenotes: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/releasenotes/notes\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/releasenotes/notes: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/releasenotes/notes/bug-1899325-implement-usage-of-c-or-column-without-additional-logic-2970ee294f32bd31.yaml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/releasenotes/notes/bug-1899325-implement-usage-of-c-or-column-without-additional-logic-2970ee294f32bd31.yaml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/api_versions.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/api_versions.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/client.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/client.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share_snapshots.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share_snapshots.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share_type_access.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share_type_access.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/messages.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/messages.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_access_rules.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_access_rules.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_type_access.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_type_access.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_types.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_types.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_snapshots.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_snapshots.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_type_access.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_type_access.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/messages.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/messages.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/quotas.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/quotas.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/quotas.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/quotas.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_access_rules.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_access_rules.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_snapshots.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_snapshots.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_types.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/__pycache__/share_types.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/messages.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/messages.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share_types.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share_types.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/quotas.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/quotas.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share_access_rules.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/v2/share_access_rules.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__init__.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__init__.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/plugin.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/plugin.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/plugin.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/utils.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/plugin.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/plugin.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/utils.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/manilaclient/osc/utils.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info/PKG-INFO: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info/SOURCES.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info/dependency_links.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/python3.9/site-packages/ply-3.11-py3.9.egg-info/top_level.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd/system\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd/system: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd/system/crond.service\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd/system/crond.service: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd/user\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd/user: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd/user/ssh-agent.service\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/lib/systemd/user/ssh-agent.service: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-openstackclient-lang\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-openstackclient-lang: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-openstackclient-lang/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-openstackclient-lang/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-manilaclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-manilaclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-manilaclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-manilaclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-monotonic\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-monotonic: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-monotonic/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-monotonic/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-wcwidth\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-wcwidth: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-wcwidth/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-wcwidth/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-keystoneclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-keystoneclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-keystoneclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-keystoneclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-i18n\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-i18n: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-i18n/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-i18n/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/crontabs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/crontabs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/crontabs/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/crontabs/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cffi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cffi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cffi/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cffi/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-log/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-log/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-msgpack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-msgpack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-msgpack/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-msgpack/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyOpenSSL\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyOpenSSL: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyOpenSSL/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyOpenSSL/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libisoburn\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libisoburn: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libisoburn/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libisoburn/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cinderclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cinderclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cinderclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cinderclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-appdirs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-appdirs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-appdirs/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-appdirs/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libburn\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libburn: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libburn/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libburn/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-utils-lang\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-utils-lang: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-utils-lang/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-utils-lang/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ujson\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ujson: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ujson/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ujson/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-log-lang\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-log-lang: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-log-lang/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-log-lang/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pytz\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pytz: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pytz/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pytz/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-dogpile-cache\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-dogpile-cache: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-dogpile-cache/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-dogpile-cache/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-swiftclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-swiftclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-swiftclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-swiftclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-munch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-munch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-munch/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-munch/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonpointer\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonpointer: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonpointer/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonpointer/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-novaclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-novaclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-novaclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-novaclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-utils\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-utils: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-utils/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-utils/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libfido2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libfido2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libfido2/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libfido2/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-warlock\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-warlock: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-warlock/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-warlock/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-debtcollector\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-debtcollector: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-debtcollector/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-debtcollector/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libisofs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libisofs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libisofs/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libisofs/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jmespath\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jmespath: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jmespath/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jmespath/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-secretstorage\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-secretstorage: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-secretstorage/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-secretstorage/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyparsing\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyparsing: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyparsing/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyparsing/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-attrs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-attrs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-attrs/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-attrs/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-os-service-types\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-os-service-types: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-os-service-types/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-os-service-types/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-mako\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-mako: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-mako/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-mako/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-config\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-config: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-config/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-config/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pycparser\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pycparser: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pycparser/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pycparser/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/cronie\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/cronie: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/cronie/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/cronie/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-keystoneauth1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-keystoneauth1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-keystoneauth1/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-keystoneauth1/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-netaddr\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-netaddr: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-netaddr/COPYRIGHT\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-netaddr/COPYRIGHT: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/openssh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/openssh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/openssh/LICENCE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/openssh/LICENCE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ply\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ply: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ply/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ply/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-zipp\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-zipp: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-zipp/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-zipp/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-futurist\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-futurist: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-futurist/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-futurist/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-openstackclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-openstackclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-openstackclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-openstackclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pbr\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pbr: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pbr/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pbr/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libedit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libedit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libedit/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libedit/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cmd2\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cmd2: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cmd2/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cmd2/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-serialization\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-serialization: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-serialization/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-serialization/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jeepney\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jeepney: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jeepney/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jeepney/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-stevedore\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-stevedore: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-stevedore/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-stevedore/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/less\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/less: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/less/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/less/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/less/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/less/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-packaging\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-packaging: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-packaging/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-packaging/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-packaging/LICENSE.APACHE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-packaging/LICENSE.APACHE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-packaging/LICENSE.BSD\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-packaging/LICENSE.BSD: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-i18n-lang\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-i18n-lang: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-i18n-lang/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python-oslo-i18n-lang/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-barbicanclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-barbicanclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-barbicanclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-barbicanclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-importlib-metadata\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-importlib-metadata: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-importlib-metadata/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-importlib-metadata/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-context\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-context: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-context/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-oslo-context/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-markupsafe\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-markupsafe: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-markupsafe/LICENSE.rst\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-markupsafe/LICENSE.rst: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cliff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cliff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cliff/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cliff/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-simplejson\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-simplejson: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-simplejson/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-simplejson/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonpatch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonpatch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonpatch/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonpatch/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-octaviaclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-octaviaclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-octaviaclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-octaviaclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/git-core\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/git-core: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/git-core/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/git-core/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-neutronclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-neutronclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-neutronclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-neutronclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-gnocchiclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-gnocchiclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-gnocchiclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-gnocchiclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonschema\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonschema: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonschema/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonschema/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonschema/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-jsonschema/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-openstacksdk\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-openstacksdk: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-openstacksdk/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-openstacksdk/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libcbor\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libcbor: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libcbor/LICENSE.md\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/libcbor/LICENSE.md: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ironicclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ironicclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ironicclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-ironicclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-colorama\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-colorama: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-colorama/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-colorama/LICENSE.txt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cryptography\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cryptography: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cryptography/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cryptography/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cryptography/LICENSE.APACHE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cryptography/LICENSE.APACHE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cryptography/LICENSE.BSD\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-cryptography/LICENSE.BSD: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-designateclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-designateclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-designateclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-designateclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-prettytable\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-prettytable: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-prettytable/COPYING\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-prettytable/COPYING: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyyaml\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyyaml: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyyaml/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyyaml/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-requestsexceptions\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-requestsexceptions: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-requestsexceptions/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-requestsexceptions/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-os-client-config\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-os-client-config: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-os-client-config/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-os-client-config/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyrsistent\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyrsistent: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyrsistent/LICENCE.mit\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-pyrsistent/LICENCE.mit: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-glanceclient\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-glanceclient: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-glanceclient/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-glanceclient/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-osc-lib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-osc-lib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-osc-lib/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-osc-lib/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-wrapt\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-wrapt: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-wrapt/LICENSE\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/licenses/python3-wrapt/LICENSE: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/completion\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/completion: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/completion/git-completion.tcsh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/completion/git-completion.tcsh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/completion/git-prompt.sh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/completion/git-prompt.sh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks/multimail\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks/multimail: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks/multimail/README.Git\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks/multimail/README.Git: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks/post-receive-email\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks/post-receive-email: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks/pre-auto-gc-battery\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/contrib/hooks/pre-auto-gc-battery: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/branches\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/branches: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/description\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/description: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-receive.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-receive.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/post-update.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/post-update.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-push.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-push.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-applypatch.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-applypatch.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-commit.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-commit.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-merge-commit.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/pre-merge-commit.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/push-to-checkout.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/push-to-checkout.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/update.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/update.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/applypatch-msg.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/applypatch-msg.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/commit-msg.sample\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/hooks/commit-msg.sample: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/info\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/info: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/info/exclude\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/git-core/templates/info/exclude: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/buildinfo\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/buildinfo: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/buildinfo/content-sets.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/buildinfo/content-sets.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/buildinfo/labels.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/buildinfo/labels.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/bash-completion\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/bash-completion: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/bash-completion/completions\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/bash-completion/completions: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/bash-completion/completions/git\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/bash-completion/completions/git: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/bash-completion/completions/gitk\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/bash-completion/completions/gitk: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1/wodim.1.gz\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1/wodim.1.gz: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1/cdrecord.1.gz\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1/cdrecord.1.gz: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1/genisoimage.1.gz\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1/genisoimage.1.gz: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1/mkisofs.1.gz\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/man/man1/mkisofs.1.gz: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/doc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/doc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/doc/python3-cryptography\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/doc/python3-cryptography: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/doc/python3-cryptography/docs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/doc/python3-cryptography/docs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/doc/python3-cryptography/docs/_static\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/share/doc/python3-cryptography/docs/_static: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/keyring-python3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/keyring-python3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/neutron\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/neutron: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-add\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-add: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-keyscan\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-keyscan: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/oslo-config-generator-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/oslo-config-generator-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/xorrecord\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/xorrecord: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/gnocchi-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/gnocchi-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/less\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/less: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/xorriso-dd-target\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/xorriso-dd-target: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-keygen\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-keygen: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/pbr\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/pbr: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/cinder-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/cinder-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpatch-3.9\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpatch-3.9: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/scalar\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/scalar: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/cronnext\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/cronnext: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git-shell\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git-shell: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/mako-render-3.9\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/mako-render-3.9: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/openstack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/openstack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/netaddr\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/netaddr: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsondiff-3.9\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsondiff-3.9: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/genisoimage\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/genisoimage: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git-upload-pack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git-upload-pack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/cinder\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/cinder: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-agent\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-agent: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/convert-json-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/convert-json-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/crontab\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/crontab: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpointer\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpointer: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/neutron-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/neutron-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/scp\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/scp: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/barbican\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/barbican: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/manila-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/manila-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/lessecho\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/lessecho: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonschema\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonschema: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/lesspipe.sh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/lesspipe.sh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/mako-render-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/mako-render-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/glance-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/glance-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/wodim\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/wodim: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/osirrox\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/osirrox: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/swift\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/swift: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpointer-3.9\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpointer-3.9: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/keyring\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/keyring: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/gnocchi\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/gnocchi: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/barbican-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/barbican-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsondiff\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsondiff: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/xorriso\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/xorriso: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/baremetal\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/baremetal: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpointer-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpointer-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/xorrisofs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/xorrisofs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/manila\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/manila: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git-upload-archive\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git-upload-archive: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git-receive-pack\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/git-receive-pack: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/sftp\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/sftp: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-copy-id\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh-copy-id: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/openstack-inventory\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/openstack-inventory: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpatch\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpatch: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/swift-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/swift-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jp.py\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jp.py: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/nova-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/nova-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/cdrecord\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/cdrecord: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/openstack-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/openstack-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/ssh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/convert-json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/convert-json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/oslo-config-validator-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/oslo-config-validator-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/mako-render\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/mako-render: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsondiff-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsondiff-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/oslo-config-generator\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/oslo-config-generator: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/nova\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/nova: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/run-parts\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/run-parts: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/glance\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/glance: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpatch-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/jsonpatch-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/lesskey\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/lesskey: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/oslo-config-validator\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/oslo-config-validator: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/mkisofs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/mkisofs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/pbr-3\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/usr/bin/pbr-3: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/systemd\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/systemd: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/systemd/system\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/systemd/system: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/systemd/system/multi-user.target.wants\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/systemd/system/multi-user.target.wants: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/systemd/system/multi-user.target.wants/crond.service\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/systemd/system/multi-user.target.wants/crond.service: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.daily\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.daily: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/sysconfig\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/sysconfig: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/sysconfig/crond\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/sysconfig/crond: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/sysconfig/run-parts\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/sysconfig/run-parts: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.weekly\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.weekly: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/group\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/group: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/cinder.bash_completion\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/cinder.bash_completion: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/glance\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/glance: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/manila\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/manila: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/nova\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/nova: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/osc.bash_completion\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/bash_completion.d/osc.bash_completion: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/profile.d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/profile.d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/profile.d/less.csh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/profile.d/less.csh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/profile.d/less.sh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/profile.d/less.sh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh/moduli\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh/moduli: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh/ssh_config\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh/ssh_config: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh/ssh_config.d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh/ssh_config.d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh/ssh_config.d/50-redhat.conf\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ssh/ssh_config.d/50-redhat.conf: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.monthly\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.monthly: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/mkisofs-mkisofsman\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/mkisofs-mkisofsman: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/cdrecord-cdrecordman\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/cdrecord-cdrecordman: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/cdrecord-wodim\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/cdrecord-wodim: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/cdrecord-wodimman\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/cdrecord-wodimman: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/mkisofs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/mkisofs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/mkisofs-genisoimageman\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/mkisofs-genisoimageman: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/cdrecord\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/cdrecord: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/mkisofs-genisoimage\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/alternatives/mkisofs-genisoimage: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/gshadow-\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/gshadow-: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/pam.d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/pam.d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/pam.d/crond\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/pam.d/crond: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/group-\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/group-: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/gshadow\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/gshadow: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/crontab\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/crontab: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.deny\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.deny: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/anacrontab\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/anacrontab: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.d\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.d: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.d/0hourly\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.d/0hourly: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.hourly\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.hourly: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.hourly/0anacron\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/cron.hourly/0anacron: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ld.so.cache\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/etc/ld.so.cache: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/alternatives\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/alternatives: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/alternatives/mkisofs\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/alternatives/mkisofs: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/alternatives/cdrecord\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/alternatives/cdrecord: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/rpm\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/rpm: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/rpm/rpmdb.sqlite-shm\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/rpm/rpmdb.sqlite-shm: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/rpm/rpmdb.sqlite-wal\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/rpm/rpmdb.sqlite-wal: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/rpm/rpmdb.sqlite\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/rpm/rpmdb.sqlite: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/dnf\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/dnf: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/dnf/history.sqlite-shm\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/dnf/history.sqlite-shm: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/dnf/history.sqlite-wal\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/dnf/history.sqlite-wal: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/dnf/history.sqlite\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/lib/dnf/history.sqlite: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/anacron\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/anacron: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/anacron/cron.monthly\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/anacron/cron.monthly: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/anacron/cron.weekly\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/anacron/cron.weekly: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/anacron/cron.daily\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/anacron/cron.daily: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/cron\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/spool/cron: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/cache\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/cache: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/cache/ldconfig\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/cache/ldconfig: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/cache/ldconfig/aux-cache\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/cache/ldconfig/aux-cache: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/rhsm\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/rhsm: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/rhsm/rhsm.log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/rhsm/rhsm.log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/dnf.librepo.log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/dnf.librepo.log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/dnf.log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/dnf.log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/dnf.rpm.log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/dnf.rpm.log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/hawkey.log\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/var/log/hawkey.log: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/.cache\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/.cache: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/.cache/python-entrypoints\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/.cache/python-entrypoints: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/.cache/python-entrypoints/8fdc2fa1a0b6ce7509b4332d133e45c9fbb37d999b7191894225c789223db1f7\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/.cache/python-entrypoints/8fdc2fa1a0b6ce7509b4332d133e45c9fbb37d999b7191894225c789223db1f7: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/buildinfo\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/buildinfo: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/buildinfo/content_manifests\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/buildinfo/content_manifests: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/buildinfo/content_manifests/content-sets.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/buildinfo/content_manifests/content-sets.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240078]: time="2025-12-06T09:47:50Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/buildinfo/labels.json\": lstat /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged/root/buildinfo/labels.json: no such file or directory"
Dec 06 09:47:50 np0005548788.localdomain podman[240890]: 2025-12-06 09:47:50.73929149 +0000 UTC m=+0.310194534 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:47:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:50 np0005548788.localdomain podman[240890]: 2025-12-06 09:47:50.7917191 +0000 UTC m=+0.362622084 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:47:50 np0005548788.localdomain rsyslogd[760]: imjournal from <localhost:podman>: begin to drop messages due to rate-limiting
Dec 06 09:47:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2135 DF PROTO=TCP SPT=39618 DPT=9102 SEQ=225540985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED66F00000000001030307) 
Dec 06 09:47:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:47:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d637f2269baed45e71c2a75847c6903ddb47387528b995ac3d0fed5bef2ae572-merged.mount: Deactivated successfully.
Dec 06 09:47:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d637f2269baed45e71c2a75847c6903ddb47387528b995ac3d0fed5bef2ae572-merged.mount: Deactivated successfully.
Dec 06 09:47:53 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:53 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:53 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:47:54 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2136 DF PROTO=TCP SPT=39618 DPT=9102 SEQ=225540985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED76B00000000001030307) 
Dec 06 09:47:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:47:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-432c853227b9a47c28cbd9f8638abd2f4ba478bfd57b8f9c2584b83011a05ecd-merged.mount: Deactivated successfully.
Dec 06 09:47:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-432c853227b9a47c28cbd9f8638abd2f4ba478bfd57b8f9c2584b83011a05ecd-merged.mount: Deactivated successfully.
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.448 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.448 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.478 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.478 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.478 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.495 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.497 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.497 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.497 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.498 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.522 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.523 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.523 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.523 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.524 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:47:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:57.986 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.173 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.175 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13230MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.175 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.175 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.245 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.246 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.265 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.768 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.775 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.794 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.797 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:47:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:58.798 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:47:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14971 DF PROTO=TCP SPT=48584 DPT=9100 SEQ=2210926505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED7FF00000000001030307) 
Dec 06 09:47:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:47:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:59.483 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:59.484 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:59.484 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:47:59.484 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:47:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:47:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:48:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:48:01 np0005548788.localdomain systemd[1]: tmp-crun.ZJr2nV.mount: Deactivated successfully.
Dec 06 09:48:01 np0005548788.localdomain podman[240969]: 2025-12-06 09:48:01.156909029 +0000 UTC m=+0.100721382 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:48:01 np0005548788.localdomain sshd[240985]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:48:01 np0005548788.localdomain podman[240969]: 2025-12-06 09:48:01.166535028 +0000 UTC m=+0.110347381 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Dec 06 09:48:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:01 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:03 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:03 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:03 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:48:03 np0005548788.localdomain podman[240986]: 2025-12-06 09:48:03.377861961 +0000 UTC m=+2.215363319 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:48:03 np0005548788.localdomain podman[240986]: 2025-12-06 09:48:03.454123372 +0000 UTC m=+2.291624710 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 06 09:48:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:04 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5356 DF PROTO=TCP SPT=46338 DPT=9101 SEQ=246242626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED970E0000000001030307) 
Dec 06 09:48:04 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:04 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:04 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:48:05 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18409 DF PROTO=TCP SPT=57528 DPT=9882 SEQ=2974905655 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5ED97F00000000001030307) 
Dec 06 09:48:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763-merged.mount: Deactivated successfully.
Dec 06 09:48:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5358 DF PROTO=TCP SPT=46338 DPT=9101 SEQ=246242626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EDA3300000000001030307) 
Dec 06 09:48:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:48:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d637f2269baed45e71c2a75847c6903ddb47387528b995ac3d0fed5bef2ae572-merged.mount: Deactivated successfully.
Dec 06 09:48:08 np0005548788.localdomain sudo[241037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:48:08 np0005548788.localdomain sudo[241037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:48:08 np0005548788.localdomain sudo[241037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d637f2269baed45e71c2a75847c6903ddb47387528b995ac3d0fed5bef2ae572-merged.mount: Deactivated successfully.
Dec 06 09:48:08 np0005548788.localdomain sudo[241055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:48:08 np0005548788.localdomain sudo[241055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:48:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:48:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:48:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:48:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:48:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6419 DF PROTO=TCP SPT=36116 DPT=9105 SEQ=807956150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EDADF00000000001030307) 
Dec 06 09:48:10 np0005548788.localdomain sshd[241098]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:48:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:12 np0005548788.localdomain podman[241087]: 2025-12-06 09:48:12.05644502 +0000 UTC m=+2.077390556 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 09:48:12 np0005548788.localdomain podman[241087]: 2025-12-06 09:48:12.089669163 +0000 UTC m=+2.110614709 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:48:12 np0005548788.localdomain podman[241087]: unhealthy
Dec 06 09:48:12 np0005548788.localdomain sudo[241055]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:12 np0005548788.localdomain sudo[241135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:48:12 np0005548788.localdomain sudo[241135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:48:12 np0005548788.localdomain sudo[241135]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21099 DF PROTO=TCP SPT=43920 DPT=9100 SEQ=3432180465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EDB9310000000001030307) 
Dec 06 09:48:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:48:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:14 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:48:14 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:48:14 np0005548788.localdomain podman[240901]: 2025-12-06 09:47:53.474480757 +0000 UTC m=+2.783006235 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 06 09:48:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:48:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:48:16 np0005548788.localdomain sshd[241098]: Received disconnect from 45.78.194.186 port 36570:11: Bye Bye [preauth]
Dec 06 09:48:16 np0005548788.localdomain sshd[241098]: Disconnected from authenticating user root 45.78.194.186 port 36570 [preauth]
Dec 06 09:48:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:48:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21101 DF PROTO=TCP SPT=43920 DPT=9100 SEQ=3432180465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EDC5300000000001030307) 
Dec 06 09:48:16 np0005548788.localdomain podman[241177]: 2025-12-06 09:48:16.416112846 +0000 UTC m=+0.050410769 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 06 09:48:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:48:17 np0005548788.localdomain podman[241190]: 2025-12-06 09:48:17.280309423 +0000 UTC m=+0.094061214 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:48:17 np0005548788.localdomain podman[241190]: 2025-12-06 09:48:17.2925828 +0000 UTC m=+0.106334501 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:48:17 np0005548788.localdomain podman[241190]: unhealthy
Dec 06 09:48:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:48:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe-merged.mount: Deactivated successfully.
Dec 06 09:48:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:48:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5874 DF PROTO=TCP SPT=40198 DPT=9102 SEQ=3882428816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EDD0350000000001030307) 
Dec 06 09:48:19 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:48:19 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:48:19 np0005548788.localdomain podman[241213]: 2025-12-06 09:48:19.707457448 +0000 UTC m=+0.303545130 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:48:19 np0005548788.localdomain podman[241213]: 2025-12-06 09:48:19.744952129 +0000 UTC m=+0.341039821 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:48:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a2d61158fc3c4b284541783e4fe7f1d5d2b40ff1423846e3371a6b8c7a7e7763-merged.mount: Deactivated successfully.
Dec 06 09:48:20 np0005548788.localdomain sshd[240014]: Connection closed by 45.78.219.195 port 34494 [preauth]
Dec 06 09:48:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:48:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:48:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5876 DF PROTO=TCP SPT=40198 DPT=9102 SEQ=3882428816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EDDC300000000001030307) 
Dec 06 09:48:22 np0005548788.localdomain podman[241177]: 
Dec 06 09:48:22 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:48:22 np0005548788.localdomain podman[241177]: 2025-12-06 09:48:22.72863712 +0000 UTC m=+6.362935073 container create b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec 06 09:48:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:48:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:48:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:48:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:48:24 np0005548788.localdomain python3[240876]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 06 09:48:25 np0005548788.localdomain podman[241233]: 2025-12-06 09:48:25.019334564 +0000 UTC m=+0.838382566 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:48:25 np0005548788.localdomain podman[241233]: 2025-12-06 09:48:25.107706307 +0000 UTC m=+0.926754309 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:48:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:48:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:48:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5877 DF PROTO=TCP SPT=40198 DPT=9102 SEQ=3882428816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EDEBF00000000001030307) 
Dec 06 09:48:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:48:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:48:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:27 np0005548788.localdomain sudo[240874]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:48:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21103 DF PROTO=TCP SPT=43920 DPT=9100 SEQ=3432180465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EDF5F00000000001030307) 
Dec 06 09:48:29 np0005548788.localdomain sudo[241386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcafqcrrbghzrkynvvfdmrqjryocmkqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014509.6370885-2186-223001042089099/AnsiballZ_stat.py
Dec 06 09:48:29 np0005548788.localdomain sudo[241386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:48:30 np0005548788.localdomain python3.9[241388]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:48:30 np0005548788.localdomain sudo[241386]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe-merged.mount: Deactivated successfully.
Dec 06 09:48:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe-merged.mount: Deactivated successfully.
Dec 06 09:48:30 np0005548788.localdomain sudo[241499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqajsstnhhhqlrfxlpsxmszpastvanaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014510.4542654-2213-180199553227177/AnsiballZ_file.py
Dec 06 09:48:30 np0005548788.localdomain sudo[241499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:30 np0005548788.localdomain python3.9[241501]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:30 np0005548788.localdomain sudo[241499]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-37cb8325f9fcc3916540f70c0b29aa44083977f48bf4e198eec6f67fb80ec7c4-merged.mount: Deactivated successfully.
Dec 06 09:48:31 np0005548788.localdomain sudo[241608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsdibzocwxqwbnetjrlpckvwksofgokk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014510.9725227-2213-247638691322505/AnsiballZ_copy.py
Dec 06 09:48:31 np0005548788.localdomain sudo[241608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:31 np0005548788.localdomain python3.9[241610]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014510.9725227-2213-247638691322505/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:31 np0005548788.localdomain sudo[241608]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:31 np0005548788.localdomain sudo[241663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-japqxjhqzprioyuaywrnmbblnxoswhca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014510.9725227-2213-247638691322505/AnsiballZ_systemd.py
Dec 06 09:48:31 np0005548788.localdomain sudo[241663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:32 np0005548788.localdomain python3.9[241665]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:48:32 np0005548788.localdomain systemd-sysv-generator[241692]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:48:32 np0005548788.localdomain systemd-rc-local-generator[241689]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548788.localdomain sudo[241663]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548788.localdomain sudo[241753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmwcvbsjrwqanuefjnhzeibdgrwpmegw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014510.9725227-2213-247638691322505/AnsiballZ_systemd.py
Dec 06 09:48:33 np0005548788.localdomain sudo[241753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:33 np0005548788.localdomain python3.9[241755]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:48:33 np0005548788.localdomain podman[241758]: 2025-12-06 09:48:33.477261067 +0000 UTC m=+0.099926510 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:48:33 np0005548788.localdomain systemd-rc-local-generator[241798]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:48:33 np0005548788.localdomain systemd-sysv-generator[241802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:48:33 np0005548788.localdomain podman[241758]: 2025-12-06 09:48:33.511836301 +0000 UTC m=+0.134501774 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: Starting openstack_network_exporter container...
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:48:33 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:33 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5878 DF PROTO=TCP SPT=40198 DPT=9102 SEQ=3882428816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE0BF00000000001030307) 
Dec 06 09:48:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37923 DF PROTO=TCP SPT=56412 DPT=9101 SEQ=2412085721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE0C3F0000000001030307) 
Dec 06 09:48:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:48:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:36 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:48:36 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9f026f871bc3f273d5ab00948a24fe486abf0752f30990748a8f01715db6760/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 09:48:36 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9f026f871bc3f273d5ab00948a24fe486abf0752f30990748a8f01715db6760/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 06 09:48:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:48:36 np0005548788.localdomain podman[241813]: 2025-12-06 09:48:36.339901647 +0000 UTC m=+2.597386557 container init b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, config_id=edpm, distribution-scope=public, release=1755695350)
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *bridge.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *coverage.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *datapath.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *iface.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *memory.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *ovnnorthd.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *ovn.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *ovsdbserver.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *pmd_perf.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *pmd_rxq.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: INFO    09:48:36 main.go:48: registering *vswitch.Collector
Dec 06 09:48:36 np0005548788.localdomain openstack_network_exporter[241839]: NOTICE  09:48:36 main.go:82: listening on http://:9105/metrics
Dec 06 09:48:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:48:36 np0005548788.localdomain podman[241813]: 2025-12-06 09:48:36.381870223 +0000 UTC m=+2.639355123 container start b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, release=1755695350, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Dec 06 09:48:36 np0005548788.localdomain podman[241813]: openstack_network_exporter
Dec 06 09:48:36 np0005548788.localdomain podman[241825]: 2025-12-06 09:48:36.419726424 +0000 UTC m=+1.237847123 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:48:36 np0005548788.localdomain podman[241825]: 2025-12-06 09:48:36.461624977 +0000 UTC m=+1.279745676 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Dec 06 09:48:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-905566068c889f7331da5009d1f816627b96ebe2772c1ba76a7cbfdc783ddfc0-merged.mount: Deactivated successfully.
Dec 06 09:48:37 np0005548788.localdomain systemd[1]: Started openstack_network_exporter container.
Dec 06 09:48:37 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:37 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:48:37 np0005548788.localdomain sudo[241753]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:37 np0005548788.localdomain podman[241853]: 2025-12-06 09:48:37.664188276 +0000 UTC m=+1.275239233 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:48:37 np0005548788.localdomain podman[241853]: 2025-12-06 09:48:37.731694105 +0000 UTC m=+1.342745102 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal)
Dec 06 09:48:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37925 DF PROTO=TCP SPT=56412 DPT=9101 SEQ=2412085721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE18310000000001030307) 
Dec 06 09:48:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:38 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:38 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:48:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:39 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:39 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:39 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:39 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:39 np0005548788.localdomain sudo[241993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-patgbherwjgfnftagaknndnwmmbprwcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014519.3780396-2285-132826176206792/AnsiballZ_systemd.py
Dec 06 09:48:39 np0005548788.localdomain sudo[241993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:40 np0005548788.localdomain python3.9[241995]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:48:40 np0005548788.localdomain systemd[1]: Stopping openstack_network_exporter container...
Dec 06 09:48:40 np0005548788.localdomain systemd[1]: libpod-b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.scope: Deactivated successfully.
Dec 06 09:48:40 np0005548788.localdomain podman[241999]: 2025-12-06 09:48:40.190817257 +0000 UTC m=+0.122915307 container died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, release=1755695350, version=9.6, config_id=edpm, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:48:40 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.timer: Deactivated successfully.
Dec 06 09:48:40 np0005548788.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:48:40 np0005548788.localdomain systemd[1]: tmp-crun.JVanxM.mount: Deactivated successfully.
Dec 06 09:48:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-836e61d982b1d47c704516d9b7a84248dd1953565408f432976e6b13308663a5-merged.mount: Deactivated successfully.
Dec 06 09:48:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3-userdata-shm.mount: Deactivated successfully.
Dec 06 09:48:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57878 DF PROTO=TCP SPT=58268 DPT=9105 SEQ=1400097843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE23F00000000001030307) 
Dec 06 09:48:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-37cb8325f9fcc3916540f70c0b29aa44083977f48bf4e198eec6f67fb80ec7c4-merged.mount: Deactivated successfully.
Dec 06 09:48:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-37cb8325f9fcc3916540f70c0b29aa44083977f48bf4e198eec6f67fb80ec7c4-merged.mount: Deactivated successfully.
Dec 06 09:48:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f9f026f871bc3f273d5ab00948a24fe486abf0752f30990748a8f01715db6760-merged.mount: Deactivated successfully.
Dec 06 09:48:41 np0005548788.localdomain podman[241999]: 2025-12-06 09:48:41.988539475 +0000 UTC m=+1.920637465 container cleanup b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 06 09:48:41 np0005548788.localdomain podman[241999]: openstack_network_exporter
Dec 06 09:48:42 np0005548788.localdomain podman[242013]: 2025-12-06 09:48:42.054567661 +0000 UTC m=+1.840394597 container cleanup b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter)
Dec 06 09:48:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:43 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:43 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:43 np0005548788.localdomain systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 06 09:48:43 np0005548788.localdomain podman[242027]: 2025-12-06 09:48:43.256017036 +0000 UTC m=+0.067218832 container cleanup b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container)
Dec 06 09:48:43 np0005548788.localdomain podman[242027]: openstack_network_exporter
Dec 06 09:48:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18552 DF PROTO=TCP SPT=43192 DPT=9100 SEQ=6509391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE2E5F0000000001030307) 
Dec 06 09:48:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:44 np0005548788.localdomain podman[240078]: time="2025-12-06T09:48:44Z" level=error msg="Getting root fs size for \"0fb74c4921d530c7417ea3835524f865489f2a9e6d95e8e1a4b78f899a0f7476\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 06 09:48:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:44 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:44 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:44 np0005548788.localdomain systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 06 09:48:44 np0005548788.localdomain systemd[1]: Stopped openstack_network_exporter container.
Dec 06 09:48:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:48:44 np0005548788.localdomain systemd[1]: Starting openstack_network_exporter container...
Dec 06 09:48:44 np0005548788.localdomain podman[242038]: 2025-12-06 09:48:44.438487233 +0000 UTC m=+0.081059896 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:48:44 np0005548788.localdomain podman[242038]: 2025-12-06 09:48:44.520420953 +0000 UTC m=+0.162993666 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute)
Dec 06 09:48:44 np0005548788.localdomain podman[242038]: unhealthy
Dec 06 09:48:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:48:45 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9f026f871bc3f273d5ab00948a24fe486abf0752f30990748a8f01715db6760/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 09:48:45 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9f026f871bc3f273d5ab00948a24fe486abf0752f30990748a8f01715db6760/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:48:45 np0005548788.localdomain podman[242039]: 2025-12-06 09:48:45.317325139 +0000 UTC m=+0.950809180 container init b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *bridge.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *coverage.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *datapath.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *iface.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *memory.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *ovnnorthd.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *ovn.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *ovsdbserver.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *pmd_perf.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *pmd_rxq.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: INFO    09:48:45 main.go:48: registering *vswitch.Collector
Dec 06 09:48:45 np0005548788.localdomain openstack_network_exporter[242070]: NOTICE  09:48:45 main.go:82: listening on http://:9105/metrics
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:48:45 np0005548788.localdomain podman[242039]: 2025-12-06 09:48:45.386549168 +0000 UTC m=+1.020033209 container start b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal)
Dec 06 09:48:45 np0005548788.localdomain podman[242039]: openstack_network_exporter
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a1fbcfcf1579b596f592db44996077b25388e39ffb03069d7f60e867593509e2-merged.mount: Deactivated successfully.
Dec 06 09:48:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:48:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:48:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:48:46 np0005548788.localdomain systemd[1]: Started openstack_network_exporter container.
Dec 06 09:48:46 np0005548788.localdomain podman[242080]: 2025-12-06 09:48:46.516179575 +0000 UTC m=+1.134717030 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6)
Dec 06 09:48:46 np0005548788.localdomain sudo[241993]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:46 np0005548788.localdomain podman[242080]: 2025-12-06 09:48:46.562523102 +0000 UTC m=+1.181060637 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, 
architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 09:48:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18554 DF PROTO=TCP SPT=43192 DPT=9100 SEQ=6509391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE3A700000000001030307) 
Dec 06 09:48:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-905566068c889f7331da5009d1f816627b96ebe2772c1ba76a7cbfdc783ddfc0-merged.mount: Deactivated successfully.
Dec 06 09:48:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:48:47.407 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:48:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:48:47.408 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:48:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:48:47.408 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:48:47 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:48:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:48 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:48 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:48:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36227 DF PROTO=TCP SPT=49864 DPT=9102 SEQ=1547938782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE45640000000001030307) 
Dec 06 09:48:49 np0005548788.localdomain sudo[242206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvtqzyqdjvpgcewaeadsuqvehlxdhxro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014529.1980867-2309-116982168726533/AnsiballZ_find.py
Dec 06 09:48:49 np0005548788.localdomain sudo[242206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:49 np0005548788.localdomain python3.9[242208]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:48:49 np0005548788.localdomain sudo[242206]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:48:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:49 np0005548788.localdomain podman[242226]: 2025-12-06 09:48:49.987900304 +0000 UTC m=+0.068188712 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:48:50 np0005548788.localdomain podman[242226]: 2025-12-06 09:48:50.025677643 +0000 UTC m=+0.105966081 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:48:50 np0005548788.localdomain podman[242226]: unhealthy
Dec 06 09:48:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-836e61d982b1d47c704516d9b7a84248dd1953565408f432976e6b13308663a5-merged.mount: Deactivated successfully.
Dec 06 09:48:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:50 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:48:50 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:48:50 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:51 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:51 np0005548788.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:51 np0005548788.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:51 np0005548788.localdomain podman[240078]: time="2025-12-06T09:48:51Z" level=error msg="Getting root fs size for \"16a2b62e12de11b9941777fc6f8a7d1c684c8a05ee1abb8610410916033d7c56\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 06 09:48:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36229 DF PROTO=TCP SPT=49864 DPT=9102 SEQ=1547938782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE51700000000001030307) 
Dec 06 09:48:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:48:52 np0005548788.localdomain podman[242249]: 2025-12-06 09:48:52.795618321 +0000 UTC m=+0.076097927 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:48:52 np0005548788.localdomain podman[242249]: 2025-12-06 09:48:52.808837887 +0000 UTC m=+0.089317483 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 09:48:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5a1b47ffdfc6345204e58005adbebe0d6f3492126b8c0115e7ce0b40d2b42062-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5a1b47ffdfc6345204e58005adbebe0d6f3492126b8c0115e7ce0b40d2b42062-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:53 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:48:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:48:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a1fbcfcf1579b596f592db44996077b25388e39ffb03069d7f60e867593509e2-merged.mount: Deactivated successfully.
Dec 06 09:48:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a1fbcfcf1579b596f592db44996077b25388e39ffb03069d7f60e867593509e2-merged.mount: Deactivated successfully.
Dec 06 09:48:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:48:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:48:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:48:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:48:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:48:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:56.183 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:56.184 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:48:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:56.184 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:48:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:56.205 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:48:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:48:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36230 DF PROTO=TCP SPT=49864 DPT=9102 SEQ=1547938782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE61300000000001030307) 
Dec 06 09:48:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:48:56 np0005548788.localdomain podman[242266]: 2025-12-06 09:48:56.766427476 +0000 UTC m=+0.096425125 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:48:56 np0005548788.localdomain podman[242266]: 2025-12-06 09:48:56.775656953 +0000 UTC m=+0.105654642 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:48:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.218 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.219 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.219 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.219 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.220 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:48:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:48:57 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.732 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:48:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.940 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.941 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13310MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.941 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:48:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:57.941 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:48:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:48:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:58.008 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:48:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:58.008 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:48:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:58.020 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:48:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:58.493 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:48:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:58.499 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:48:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:58.516 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:48:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:58.519 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:48:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:58.519 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:48:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18556 DF PROTO=TCP SPT=43192 DPT=9100 SEQ=6509391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE69F10000000001030307) 
Dec 06 09:48:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:48:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:59.514 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:59.514 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:59.515 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:59.515 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:59.515 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:48:59.515 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:48:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-0438ade5aeea533b00cd75095bec75fbc2b307bace4c89bb39b75d428637bcd8-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Dec 06 09:49:00 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:00.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:49:01 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:01 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:01 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:49:02 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5a1b47ffdfc6345204e58005adbebe0d6f3492126b8c0115e7ce0b40d2b42062-merged.mount: Deactivated successfully.
Dec 06 09:49:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:49:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-fdd5208a191e3dd329f7e505764b52a58d757ae8eaee9e9d3bc670d6f12b2b08-merged.mount: Deactivated successfully.
Dec 06 09:49:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:49:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:49:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:49:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:49:04 np0005548788.localdomain podman[242332]: 2025-12-06 09:49:04.123241466 +0000 UTC m=+0.094565510 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:49:04 np0005548788.localdomain podman[242332]: 2025-12-06 09:49:04.159719847 +0000 UTC m=+0.131043911 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:49:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57677 DF PROTO=TCP SPT=47672 DPT=9101 SEQ=472199374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE816E0000000001030307) 
Dec 06 09:49:04 np0005548788.localdomain systemd[1]: tmp-crun.X1usr1.mount: Deactivated successfully.
Dec 06 09:49:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17677 DF PROTO=TCP SPT=46246 DPT=9882 SEQ=3048614518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE81F00000000001030307) 
Dec 06 09:49:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:49:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:49:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:49:06 np0005548788.localdomain sshd[242352]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:49:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:06 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:49:06 np0005548788.localdomain sshd[242352]: Received disconnect from 148.227.3.232 port 59718:11: Bye Bye [preauth]
Dec 06 09:49:06 np0005548788.localdomain sshd[242352]: Disconnected from authenticating user root 148.227.3.232 port 59718 [preauth]
Dec 06 09:49:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:49:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.488 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.489 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.489 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.489 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:49:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:49:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:49:07 np0005548788.localdomain podman[242355]: 2025-12-06 09:49:07.741476495 +0000 UTC m=+0.089489697 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:49:07 np0005548788.localdomain podman[242355]: 2025-12-06 09:49:07.839413035 +0000 UTC m=+0.187426227 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 09:49:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57679 DF PROTO=TCP SPT=47672 DPT=9101 SEQ=472199374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE8D710000000001030307) 
Dec 06 09:49:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:09 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:49:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:49:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:49:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32329 DF PROTO=TCP SPT=49668 DPT=9105 SEQ=3032096163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EE97F00000000001030307) 
Dec 06 09:49:11 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:11 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:49:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:49:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:49:12 np0005548788.localdomain sudo[242381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:49:12 np0005548788.localdomain sudo[242381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:49:12 np0005548788.localdomain sudo[242381]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:13 np0005548788.localdomain sudo[242399]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:49:13 np0005548788.localdomain sudo[242399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:49:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32524 DF PROTO=TCP SPT=34842 DPT=9100 SEQ=616252261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EEA38F0000000001030307) 
Dec 06 09:49:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d623bf221e6e39ec968f36fc3f06f79e6b1927337c95facabcab11b35de0560d-merged.mount: Deactivated successfully.
Dec 06 09:49:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d623bf221e6e39ec968f36fc3f06f79e6b1927337c95facabcab11b35de0560d-merged.mount: Deactivated successfully.
Dec 06 09:49:14 np0005548788.localdomain sudo[242399]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:15 np0005548788.localdomain sudo[242449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:49:15 np0005548788.localdomain sudo[242449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:49:15 np0005548788.localdomain sudo[242449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:49:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:49:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-fdd5208a191e3dd329f7e505764b52a58d757ae8eaee9e9d3bc670d6f12b2b08-merged.mount: Deactivated successfully.
Dec 06 09:49:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:49:15 np0005548788.localdomain podman[242467]: 2025-12-06 09:49:15.44812983 +0000 UTC m=+0.084127557 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 09:49:15 np0005548788.localdomain podman[242467]: 2025-12-06 09:49:15.531140481 +0000 UTC m=+0.167138238 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 09:49:15 np0005548788.localdomain podman[242467]: unhealthy
Dec 06 09:49:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32526 DF PROTO=TCP SPT=34842 DPT=9100 SEQ=616252261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EEAFB00000000001030307) 
Dec 06 09:49:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:49:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:18 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:49:18 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:49:18 np0005548788.localdomain podman[242485]: 2025-12-06 09:49:18.189707915 +0000 UTC m=+0.374986599 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Dec 06 09:49:18 np0005548788.localdomain podman[242485]: 2025-12-06 09:49:18.232675105 +0000 UTC m=+0.417953799 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 09:49:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42528 DF PROTO=TCP SPT=51766 DPT=9102 SEQ=4288700949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EEBA940000000001030307) 
Dec 06 09:49:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:20 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:49:20 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:20 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:49:21 np0005548788.localdomain podman[242503]: 2025-12-06 09:49:21.266624037 +0000 UTC m=+0.087516500 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:49:21 np0005548788.localdomain podman[242503]: 2025-12-06 09:49:21.303591914 +0000 UTC m=+0.124484357 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:49:21 np0005548788.localdomain podman[242503]: unhealthy
Dec 06 09:49:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:21 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:21 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:49:21 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:21 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:49:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:49:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-33d4ba4a6e0259b5150b68a23f46c9e702457315d900e4a8419ae01ffeed1203-merged.mount: Deactivated successfully.
Dec 06 09:49:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42530 DF PROTO=TCP SPT=51766 DPT=9102 SEQ=4288700949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EEC6B00000000001030307) 
Dec 06 09:49:22 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:49:23 np0005548788.localdomain podman[242525]: 2025-12-06 09:49:23.256959227 +0000 UTC m=+0.083659193 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:49:23 np0005548788.localdomain podman[242525]: 2025-12-06 09:49:23.265560229 +0000 UTC m=+0.092260205 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 06 09:49:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d623bf221e6e39ec968f36fc3f06f79e6b1927337c95facabcab11b35de0560d-merged.mount: Deactivated successfully.
Dec 06 09:49:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d623bf221e6e39ec968f36fc3f06f79e6b1927337c95facabcab11b35de0560d-merged.mount: Deactivated successfully.
Dec 06 09:49:25 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:49:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:49:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:49:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42531 DF PROTO=TCP SPT=51766 DPT=9102 SEQ=4288700949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EED6710000000001030307) 
Dec 06 09:49:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:27 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:49:27 np0005548788.localdomain podman[242545]: 2025-12-06 09:49:27.527284391 +0000 UTC m=+0.054440281 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:49:27 np0005548788.localdomain podman[242545]: 2025-12-06 09:49:27.533126679 +0000 UTC m=+0.060282629 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:49:27 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:49:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:49:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-33d4ba4a6e0259b5150b68a23f46c9e702457315d900e4a8419ae01ffeed1203-merged.mount: Deactivated successfully.
Dec 06 09:49:29 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32528 DF PROTO=TCP SPT=34842 DPT=9100 SEQ=616252261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EEDFF00000000001030307) 
Dec 06 09:49:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2-merged.mount: Deactivated successfully.
Dec 06 09:49:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42532 DF PROTO=TCP SPT=51766 DPT=9102 SEQ=4288700949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EEF5F10000000001030307) 
Dec 06 09:49:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12839 DF PROTO=TCP SPT=33902 DPT=9101 SEQ=2575386986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EEF69E0000000001030307) 
Dec 06 09:49:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:35 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:49:36 np0005548788.localdomain podman[242567]: 2025-12-06 09:49:36.795505252 +0000 UTC m=+0.099420883 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:49:36 np0005548788.localdomain podman[242567]: 2025-12-06 09:49:36.799008649 +0000 UTC m=+0.102924290 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:49:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12841 DF PROTO=TCP SPT=33902 DPT=9101 SEQ=2575386986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF02B00000000001030307) 
Dec 06 09:49:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:38 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:49:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:49:39 np0005548788.localdomain systemd[1]: tmp-crun.jNdIHk.mount: Deactivated successfully.
Dec 06 09:49:39 np0005548788.localdomain podman[242585]: 2025-12-06 09:49:39.254826587 +0000 UTC m=+0.081482287 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 09:49:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:39 np0005548788.localdomain podman[242585]: 2025-12-06 09:49:39.339734757 +0000 UTC m=+0.166390487 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 09:49:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:39 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:49:39 np0005548788.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:39 np0005548788.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:40 np0005548788.localdomain podman[240078]: time="2025-12-06T09:49:40Z" level=error msg="Getting root fs size for \"46f4b5049eceea704db6cf14c49b8d2399b13041d3df39be208c4be91c5716f0\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 06 09:49:40 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:40 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21256 DF PROTO=TCP SPT=55274 DPT=9105 SEQ=221211691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF0DF00000000001030307) 
Dec 06 09:49:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-43c17fb190c64dffbdc1c529311c2298d3f0181f4231cb69a5ba1058447bd2b2-merged.mount: Deactivated successfully.
Dec 06 09:49:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-20be973d9ceea79ef95f10e6d248e592805035801284dea3de186096ff60ff28-merged.mount: Deactivated successfully.
Dec 06 09:49:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42522 DF PROTO=TCP SPT=38928 DPT=9100 SEQ=676714648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF18C00000000001030307) 
Dec 06 09:49:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:49:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:49:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:49:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42524 DF PROTO=TCP SPT=38928 DPT=9100 SEQ=676714648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF24B00000000001030307) 
Dec 06 09:49:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:49:47.409 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:49:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:49:47.409 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:49:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:49:47.410 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:49:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:49:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:49:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:49:48 np0005548788.localdomain podman[242608]: 2025-12-06 09:49:48.325255835 +0000 UTC m=+0.097551677 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:49:48 np0005548788.localdomain podman[242608]: 2025-12-06 09:49:48.357751936 +0000 UTC m=+0.130047808 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:49:48 np0005548788.localdomain podman[242608]: unhealthy
Dec 06 09:49:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64624 DF PROTO=TCP SPT=35678 DPT=9102 SEQ=1106288390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF2FC50000000001030307) 
Dec 06 09:49:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:49:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:49:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:49:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:49:50 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:49:50 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:49:50 np0005548788.localdomain podman[242626]: 2025-12-06 09:49:50.42363231 +0000 UTC m=+0.102294481 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 09:49:50 np0005548788.localdomain podman[242626]: 2025-12-06 09:49:50.463118524 +0000 UTC m=+0.141780675 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6)
Dec 06 09:49:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:49:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:49:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:49:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:49:51 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:49:51 np0005548788.localdomain podman[242648]: 2025-12-06 09:49:51.908872603 +0000 UTC m=+0.072141221 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:49:51 np0005548788.localdomain podman[242648]: 2025-12-06 09:49:51.945722528 +0000 UTC m=+0.108991136 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:49:51 np0005548788.localdomain podman[242648]: unhealthy
Dec 06 09:49:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:49:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:49:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:49:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64626 DF PROTO=TCP SPT=35678 DPT=9102 SEQ=1106288390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF3BB10000000001030307) 
Dec 06 09:49:52 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:49:52 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:49:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:53 np0005548788.localdomain sshd[242670]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:49:54 np0005548788.localdomain sshd[242670]: Received disconnect from 45.78.219.195 port 49270:11: Bye Bye [preauth]
Dec 06 09:49:54 np0005548788.localdomain sshd[242670]: Disconnected from authenticating user root 45.78.219.195 port 49270 [preauth]
Dec 06 09:49:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:49:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:49:55 np0005548788.localdomain podman[242672]: 2025-12-06 09:49:55.540999541 +0000 UTC m=+0.087914712 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:49:55 np0005548788.localdomain podman[242672]: 2025-12-06 09:49:55.592696328 +0000 UTC m=+0.139611449 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Dec 06 09:49:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:56.175 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:56.209 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:56.210 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:49:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:56.210 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:49:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:56.227 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:49:56 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:49:56 np0005548788.localdomain systemd[1]: tmp-crun.KqfX1I.mount: Deactivated successfully.
Dec 06 09:49:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:49:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-20be973d9ceea79ef95f10e6d248e592805035801284dea3de186096ff60ff28-merged.mount: Deactivated successfully.
Dec 06 09:49:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64627 DF PROTO=TCP SPT=35678 DPT=9102 SEQ=1106288390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF4B700000000001030307) 
Dec 06 09:49:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.182 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.183 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.203 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.204 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.204 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.205 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.205 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:49:58 np0005548788.localdomain podman[242690]: 2025-12-06 09:49:58.228133475 +0000 UTC m=+0.060975742 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:49:58 np0005548788.localdomain podman[242690]: 2025-12-06 09:49:58.235747867 +0000 UTC m=+0.068590124 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:49:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:49:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e09d5332b76f4ff47d3f47a68c6210f077613857ce41c59afde7fed1d48f940c-merged.mount: Deactivated successfully.
Dec 06 09:49:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e09d5332b76f4ff47d3f47a68c6210f077613857ce41c59afde7fed1d48f940c-merged.mount: Deactivated successfully.
Dec 06 09:49:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.691 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:49:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42526 DF PROTO=TCP SPT=38928 DPT=9100 SEQ=676714648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF53F00000000001030307) 
Dec 06 09:49:58 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.885 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.886 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13163MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.887 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.887 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.961 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.962 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:49:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:58.986 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:49:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:49:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:49:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 06 09:49:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 06 09:49:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:59.454 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:49:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:59.462 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:49:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:59.479 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:49:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:59.481 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:49:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:49:59.481 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:50:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:50:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:00.481 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:00.482 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:00.482 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:00.482 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 06 09:50:01 np0005548788.localdomain sshd[240985]: fatal: Timeout before authentication for 101.47.142.76 port 47604
Dec 06 09:50:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:01.183 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:50:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:50:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:50:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:50:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:50:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46391 DF PROTO=TCP SPT=40414 DPT=9101 SEQ=3522846137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF6BCE0000000001030307) 
Dec 06 09:50:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64628 DF PROTO=TCP SPT=35678 DPT=9102 SEQ=1106288390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF6BF00000000001030307) 
Dec 06 09:50:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:50:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:50:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-42c49c691de3a79d7c64c1e2b1baf2d52b814d8ec4049f7fba3b2602f1480e6a-merged.mount: Deactivated successfully.
Dec 06 09:50:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:50:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46393 DF PROTO=TCP SPT=40414 DPT=9101 SEQ=3522846137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF77F00000000001030307) 
Dec 06 09:50:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:50:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548788.localdomain podman[242757]: 2025-12-06 09:50:08.816122211 +0000 UTC m=+0.092954917 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 09:50:08 np0005548788.localdomain podman[242757]: 2025-12-06 09:50:08.848662533 +0000 UTC m=+0.125495229 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 09:50:09 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:09 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:50:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:50:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 5302 writes, 23K keys, 5302 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5302 writes, 773 syncs, 6.86 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:50:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:50:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:50:09 np0005548788.localdomain podman[242791]: 2025-12-06 09:50:09.78849211 +0000 UTC m=+0.108930264 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 06 09:50:09 np0005548788.localdomain podman[242791]: 2025-12-06 09:50:09.901823256 +0000 UTC m=+0.222261410 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:50:09 np0005548788.localdomain sudo[242887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjwnbejskexckzokyirecjmgflzpnvnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014609.6417317-2568-39767511331536/AnsiballZ_podman_container_info.py
Dec 06 09:50:09 np0005548788.localdomain sudo[242887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:50:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e09d5332b76f4ff47d3f47a68c6210f077613857ce41c59afde7fed1d48f940c-merged.mount: Deactivated successfully.
Dec 06 09:50:10 np0005548788.localdomain python3.9[242889]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 06 09:50:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41334 DF PROTO=TCP SPT=35488 DPT=9105 SEQ=4283171124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF81F00000000001030307) 
Dec 06 09:50:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e09d5332b76f4ff47d3f47a68c6210f077613857ce41c59afde7fed1d48f940c-merged.mount: Deactivated successfully.
Dec 06 09:50:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:11 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:50:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 06 09:50:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully.
Dec 06 09:50:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:13 np0005548788.localdomain sudo[242887]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3470 DF PROTO=TCP SPT=59248 DPT=9100 SEQ=1859320994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF8DEF0000000001030307) 
Dec 06 09:50:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:50:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:50:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.2 total, 600.0 interval
                                                          Cumulative writes: 5340 writes, 23K keys, 5340 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5340 writes, 664 syncs, 8.04 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:50:14 np0005548788.localdomain sudo[243010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brfjxhygfknrnkyodnfdbuoshyfoadgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014613.9413602-2576-262121294801148/AnsiballZ_podman_container_exec.py
Dec 06 09:50:14 np0005548788.localdomain sudo[243010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:14 np0005548788.localdomain python3.9[243012]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:14 np0005548788.localdomain systemd[1]: Started libpod-conmon-948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.scope.
Dec 06 09:50:14 np0005548788.localdomain podman[243013]: 2025-12-06 09:50:14.63011718 +0000 UTC m=+0.132552324 container exec 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 09:50:14 np0005548788.localdomain podman[243013]: 2025-12-06 09:50:14.666581862 +0000 UTC m=+0.169016996 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:50:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:50:15 np0005548788.localdomain sudo[243041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:50:15 np0005548788.localdomain sudo[243041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:50:15 np0005548788.localdomain sudo[243041]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:15 np0005548788.localdomain sudo[243059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:50:15 np0005548788.localdomain sudo[243059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:50:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:50:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:50:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:50:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3472 DF PROTO=TCP SPT=59248 DPT=9100 SEQ=1859320994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EF99F00000000001030307) 
Dec 06 09:50:16 np0005548788.localdomain sudo[243010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-99147ebdc96193466a769a6087d1e6a493b0ef22c4a7c12a11b9ab309a2616d8-merged.mount: Deactivated successfully.
Dec 06 09:50:17 np0005548788.localdomain sudo[243200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txyoubtzejsuczwcdllmctuesaotpwej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014617.020764-2584-83397198956530/AnsiballZ_podman_container_exec.py
Dec 06 09:50:17 np0005548788.localdomain sudo[243200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:17 np0005548788.localdomain python3.9[243202]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:18 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:18 np0005548788.localdomain systemd[1]: libpod-conmon-948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.scope: Deactivated successfully.
Dec 06 09:50:18 np0005548788.localdomain systemd[1]: Started libpod-conmon-948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.scope.
Dec 06 09:50:18 np0005548788.localdomain podman[243203]: 2025-12-06 09:50:18.165858767 +0000 UTC m=+0.572437561 container exec 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:50:18 np0005548788.localdomain podman[243203]: 2025-12-06 09:50:18.195327517 +0000 UTC m=+0.601906311 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 09:50:18 np0005548788.localdomain sudo[243059]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:18 np0005548788.localdomain sudo[243250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:50:18 np0005548788.localdomain sudo[243250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:50:18 np0005548788.localdomain sudo[243250]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:18 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:19 np0005548788.localdomain sudo[243200]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61280 DF PROTO=TCP SPT=54906 DPT=9102 SEQ=1367158156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EFA4F50000000001030307) 
Dec 06 09:50:19 np0005548788.localdomain sudo[243377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpiuhlljypnbcgwxdlcpawtxuscboiau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014619.2492619-2592-193860359927773/AnsiballZ_file.py
Dec 06 09:50:19 np0005548788.localdomain sudo[243377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:19 np0005548788.localdomain python3.9[243379]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:19 np0005548788.localdomain systemd[1]: libpod-conmon-948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.scope: Deactivated successfully.
Dec 06 09:50:19 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:19 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:19 np0005548788.localdomain sudo[243377]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:20 np0005548788.localdomain sudo[243487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbindfvyafqrznhhuqnntxspsknhvobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014619.9829936-2601-196181644425458/AnsiballZ_podman_container_info.py
Dec 06 09:50:20 np0005548788.localdomain sudo[243487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:20 np0005548788.localdomain python3.9[243489]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 06 09:50:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:50:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-433650af555efa19b8507439c2ca3ea6c59dd70b79e1a09f2868d9a23cd1c65b-merged.mount: Deactivated successfully.
Dec 06 09:50:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-433650af555efa19b8507439c2ca3ea6c59dd70b79e1a09f2868d9a23cd1c65b-merged.mount: Deactivated successfully.
Dec 06 09:50:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:50:21 np0005548788.localdomain podman[243501]: 2025-12-06 09:50:21.391456755 +0000 UTC m=+0.436422395 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Dec 06 09:50:21 np0005548788.localdomain podman[243501]: 2025-12-06 09:50:21.426771014 +0000 UTC m=+0.471736614 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:50:21 np0005548788.localdomain podman[243501]: unhealthy
Dec 06 09:50:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:50:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-42c49c691de3a79d7c64c1e2b1baf2d52b814d8ec4049f7fba3b2602f1480e6a-merged.mount: Deactivated successfully.
Dec 06 09:50:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61282 DF PROTO=TCP SPT=54906 DPT=9102 SEQ=1367158156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EFB0F10000000001030307) 
Dec 06 09:50:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:50:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:50:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:50:24 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:50:24 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:50:24 np0005548788.localdomain sudo[243487]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:24 np0005548788.localdomain podman[243532]: 2025-12-06 09:50:24.504839596 +0000 UTC m=+1.331181978 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:50:24 np0005548788.localdomain podman[243519]: 2025-12-06 09:50:24.551814546 +0000 UTC m=+2.623180654 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Dec 06 09:50:24 np0005548788.localdomain podman[243519]: 2025-12-06 09:50:24.598226386 +0000 UTC m=+2.669592454 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:50:24 np0005548788.localdomain podman[243532]: 2025-12-06 09:50:24.655053709 +0000 UTC m=+1.481396111 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:50:24 np0005548788.localdomain podman[243532]: unhealthy
Dec 06 09:50:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:25 np0005548788.localdomain sudo[243670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejxcfynoepxnqlhskfghcmnxdngwxpxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014624.7348099-2609-264906356506545/AnsiballZ_podman_container_exec.py
Dec 06 09:50:25 np0005548788.localdomain sudo[243670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:25 np0005548788.localdomain python3.9[243672]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:50:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61283 DF PROTO=TCP SPT=54906 DPT=9102 SEQ=1367158156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EFC0B10000000001030307) 
Dec 06 09:50:26 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:50:26 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:50:26 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:50:26 np0005548788.localdomain podman[240078]: time="2025-12-06T09:50:26Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged: invalid argument"
Dec 06 09:50:26 np0005548788.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:26 np0005548788.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:26 np0005548788.localdomain podman[240078]: time="2025-12-06T09:50:26Z" level=error msg="Getting root fs size for \"77d096a8e0174836926c2d9c31426e5dd37427b5a5ec70f26f1b7283adc88732\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": creating overlay mount to /var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/H7YOTL3SFJRONSRX22VQHLOVZ2:/var/lib/containers/storage/overlay/l/J3YUH3ZFFKJKK3VHQQB2HTHNU7,upperdir=/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/diff,workdir=/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/work,nodev,metacopy=on\": no such file or directory"
Dec 06 09:50:26 np0005548788.localdomain systemd[1]: Started libpod-conmon-66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.scope.
Dec 06 09:50:26 np0005548788.localdomain podman[243673]: 2025-12-06 09:50:26.79428977 +0000 UTC m=+1.491977795 container exec 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:50:26 np0005548788.localdomain podman[243682]: 2025-12-06 09:50:26.832301305 +0000 UTC m=+0.493919217 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 09:50:26 np0005548788.localdomain podman[243682]: 2025-12-06 09:50:26.841509237 +0000 UTC m=+0.503127119 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:50:26 np0005548788.localdomain podman[243673]: 2025-12-06 09:50:26.926975226 +0000 UTC m=+1.624663271 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:50:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3474 DF PROTO=TCP SPT=59248 DPT=9100 SEQ=1859320994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EFC9F00000000001030307) 
Dec 06 09:50:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:50:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437-merged.mount: Deactivated successfully.
Dec 06 09:50:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437-merged.mount: Deactivated successfully.
Dec 06 09:50:29 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:50:29 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:29 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:29 np0005548788.localdomain sudo[243670]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:29 np0005548788.localdomain podman[243723]: 2025-12-06 09:50:29.448389773 +0000 UTC m=+0.348450967 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:50:29 np0005548788.localdomain podman[243723]: 2025-12-06 09:50:29.45964987 +0000 UTC m=+0.359711044 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:50:29 np0005548788.localdomain sudo[243851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugybybiwsgloxiwrmnylgovcrevbohmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014629.584818-2617-231026215027050/AnsiballZ_podman_container_exec.py
Dec 06 09:50:29 np0005548788.localdomain sudo[243851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:30 np0005548788.localdomain python3.9[243853]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:50:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:50:32 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:32 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:32 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:50:32 np0005548788.localdomain systemd[1]: libpod-conmon-66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.scope: Deactivated successfully.
Dec 06 09:50:32 np0005548788.localdomain systemd[1]: Started libpod-conmon-66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.scope.
Dec 06 09:50:32 np0005548788.localdomain podman[243854]: 2025-12-06 09:50:32.584292728 +0000 UTC m=+2.429665289 container exec 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:50:32 np0005548788.localdomain podman[243854]: 2025-12-06 09:50:32.617644865 +0000 UTC m=+2.463017436 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:50:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:33 np0005548788.localdomain sshd[243885]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:50:33 np0005548788.localdomain sshd[243887]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:50:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14927 DF PROTO=TCP SPT=48788 DPT=9101 SEQ=3252361628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EFE0FE0000000001030307) 
Dec 06 09:50:34 np0005548788.localdomain sudo[243851]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13776 DF PROTO=TCP SPT=51660 DPT=9882 SEQ=2664945044 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EFE1F00000000001030307) 
Dec 06 09:50:35 np0005548788.localdomain sudo[243996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvxlrjlbkoadeimkdmcichtdlkgqjjba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014635.1027308-2625-99792293570014/AnsiballZ_file.py
Dec 06 09:50:35 np0005548788.localdomain sudo[243996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:35 np0005548788.localdomain python3.9[243998]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:35 np0005548788.localdomain sudo[243996]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:36 np0005548788.localdomain sudo[244106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quxwpxjjbcoyvsiyviaoxegpfycpzieo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014635.9136477-2634-115056798104380/AnsiballZ_podman_container_info.py
Dec 06 09:50:36 np0005548788.localdomain sudo[244106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:36 np0005548788.localdomain python3.9[244108]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 06 09:50:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:50:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-99147ebdc96193466a769a6087d1e6a493b0ef22c4a7c12a11b9ab309a2616d8-merged.mount: Deactivated successfully.
Dec 06 09:50:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-99147ebdc96193466a769a6087d1e6a493b0ef22c4a7c12a11b9ab309a2616d8-merged.mount: Deactivated successfully.
Dec 06 09:50:37 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:37 np0005548788.localdomain systemd[1]: libpod-conmon-66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.scope: Deactivated successfully.
Dec 06 09:50:37 np0005548788.localdomain sshd[243885]: Received disconnect from 101.47.142.76 port 54544:11: Bye Bye [preauth]
Dec 06 09:50:37 np0005548788.localdomain sshd[243885]: Disconnected from authenticating user root 101.47.142.76 port 54544 [preauth]
Dec 06 09:50:37 np0005548788.localdomain sshd[243887]: Received disconnect from 45.78.194.186 port 53022:11: Bye Bye [preauth]
Dec 06 09:50:37 np0005548788.localdomain sshd[243887]: Disconnected from authenticating user root 45.78.194.186 port 53022 [preauth]
Dec 06 09:50:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14929 DF PROTO=TCP SPT=48788 DPT=9101 SEQ=3252361628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EFECF00000000001030307) 
Dec 06 09:50:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:38 np0005548788.localdomain podman[240078]: time="2025-12-06T09:50:38Z" level=error msg="Getting root fs size for \"7885261b200386cd0ffbdda6d15f235b055340ef042f651d449d8e70b2af3d07\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy"
Dec 06 09:50:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:39 np0005548788.localdomain sudo[244106]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:50:39 np0005548788.localdomain podman[244120]: 2025-12-06 09:50:39.483035622 +0000 UTC m=+0.067544052 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 09:50:39 np0005548788.localdomain podman[244120]: 2025-12-06 09:50:39.519588251 +0000 UTC m=+0.104096721 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 06 09:50:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:40 np0005548788.localdomain sudo[244244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awxhtbdbsmxuapmkseiakmugxfckxshr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014639.876184-2642-271194281852883/AnsiballZ_podman_container_exec.py
Dec 06 09:50:40 np0005548788.localdomain sudo[244244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:40 np0005548788.localdomain python3.9[244246]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11663 DF PROTO=TCP SPT=33300 DPT=9105 SEQ=896202297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5EFF7F00000000001030307) 
Dec 06 09:50:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:50:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b72aa0b501df4a0ecf9e2c06f7d460b982575e1ef57771c6fbb48ab40682d902-merged.mount: Deactivated successfully.
Dec 06 09:50:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b72aa0b501df4a0ecf9e2c06f7d460b982575e1ef57771c6fbb48ab40682d902-merged.mount: Deactivated successfully.
Dec 06 09:50:41 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:41 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:50:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:50:41 np0005548788.localdomain systemd[1]: Started libpod-conmon-6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.scope.
Dec 06 09:50:41 np0005548788.localdomain podman[244247]: 2025-12-06 09:50:41.715268642 +0000 UTC m=+1.219334814 container exec 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:50:41 np0005548788.localdomain podman[244247]: 2025-12-06 09:50:41.745008064 +0000 UTC m=+1.249074236 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec 06 09:50:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:42 np0005548788.localdomain podman[240078]: time="2025-12-06T09:50:42Z" level=error msg="Getting root fs size for \"7c5b6999572159d1f5c9c48354900660d66f1be920144c07dc38cc9116a87486\": getting diffsize of layer \"a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966: replacing mount point \"/var/lib/containers/storage/overlay/a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966/merged\": device or resource busy"
Dec 06 09:50:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-433650af555efa19b8507439c2ca3ea6c59dd70b79e1a09f2868d9a23cd1c65b-merged.mount: Deactivated successfully.
Dec 06 09:50:42 np0005548788.localdomain sudo[244244]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:42 np0005548788.localdomain podman[244259]: 2025-12-06 09:50:42.787436988 +0000 UTC m=+1.087612637 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:50:42 np0005548788.localdomain podman[244259]: 2025-12-06 09:50:42.869170239 +0000 UTC m=+1.169345868 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:50:43 np0005548788.localdomain sudo[244405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzgsfagcdtsgmosajaqgqkmyqidutwpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014642.9194894-2650-87328646271371/AnsiballZ_podman_container_exec.py
Dec 06 09:50:43 np0005548788.localdomain sudo[244405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:43 np0005548788.localdomain python3.9[244407]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13491 DF PROTO=TCP SPT=46904 DPT=9100 SEQ=2044465080 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F003200000000001030307) 
Dec 06 09:50:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6697f986b9c629bb25e894d1903f9f626e3727e4d7efcca4b461d4a3e304ed0e-merged.mount: Deactivated successfully.
Dec 06 09:50:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:45 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:50:45 np0005548788.localdomain systemd[1]: libpod-conmon-6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.scope: Deactivated successfully.
Dec 06 09:50:45 np0005548788.localdomain systemd[1]: Started libpod-conmon-6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.scope.
Dec 06 09:50:45 np0005548788.localdomain podman[244408]: 2025-12-06 09:50:45.653278742 +0000 UTC m=+2.201176595 container exec 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:50:45 np0005548788.localdomain podman[244408]: 2025-12-06 09:50:45.689760779 +0000 UTC m=+2.237658652 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:50:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:50:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:50:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13493 DF PROTO=TCP SPT=46904 DPT=9100 SEQ=2044465080 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F00F300000000001030307) 
Dec 06 09:50:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:50:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:50:47.410 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:50:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:50:47.410 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:50:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:50:47.411 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:50:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:48 np0005548788.localdomain sudo[244405]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:48 np0005548788.localdomain sudo[244542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adtomlyoyonwlvgsvpybnpnomuljaeog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014648.4982626-2658-1734111894590/AnsiballZ_file.py
Dec 06 09:50:48 np0005548788.localdomain sudo[244542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:48 np0005548788.localdomain python3.9[244544]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:49 np0005548788.localdomain sudo[244542]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53037 DF PROTO=TCP SPT=57200 DPT=9102 SEQ=1475014878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F01A250000000001030307) 
Dec 06 09:50:49 np0005548788.localdomain sudo[244652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khzlkaiotbctuopdnzhtoqwayymujbix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014649.2161121-2667-9934589970922/AnsiballZ_podman_container_info.py
Dec 06 09:50:49 np0005548788.localdomain sudo[244652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:49 np0005548788.localdomain systemd[1]: libpod-conmon-6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.scope: Deactivated successfully.
Dec 06 09:50:49 np0005548788.localdomain python3.9[244654]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 06 09:50:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:50:50 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:50 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53039 DF PROTO=TCP SPT=57200 DPT=9102 SEQ=1475014878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F026300000000001030307) 
Dec 06 09:50:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:50:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:50:53 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:53 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:53 np0005548788.localdomain sudo[244652]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:54 np0005548788.localdomain sudo[244774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oowwktrduirmdlhzphkttioibfdnjucu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014654.5039263-2675-236494850465117/AnsiballZ_podman_container_exec.py
Dec 06 09:50:54 np0005548788.localdomain sudo[244774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:50:54 np0005548788.localdomain systemd[1]: tmp-crun.sHiJmr.mount: Deactivated successfully.
Dec 06 09:50:54 np0005548788.localdomain podman[244776]: 2025-12-06 09:50:54.98690197 +0000 UTC m=+0.116369950 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:50:55 np0005548788.localdomain podman[244776]: 2025-12-06 09:50:55.021724654 +0000 UTC m=+0.151192654 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:50:55 np0005548788.localdomain podman[244776]: unhealthy
Dec 06 09:50:55 np0005548788.localdomain python3.9[244777]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.182 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.230 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.231 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.231 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.252 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.252 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.252 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 09:50:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:56.280 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53040 DF PROTO=TCP SPT=57200 DPT=9102 SEQ=1475014878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F035F00000000001030307) 
Dec 06 09:50:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:50:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:50:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cb7df739b31e25ca2bf30085ecbc7a2f1ac88d5541406f15567163c59dcda437-merged.mount: Deactivated successfully.
Dec 06 09:50:56 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:50:56 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Failed with result 'exit-code'.
Dec 06 09:50:56 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:56 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:56 np0005548788.localdomain podman[244804]: 2025-12-06 09:50:56.978517953 +0000 UTC m=+0.184224451 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:50:57 np0005548788.localdomain podman[244805]: 2025-12-06 09:50:57.019032146 +0000 UTC m=+0.220963144 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 09:50:57 np0005548788.localdomain podman[244805]: 2025-12-06 09:50:57.035648073 +0000 UTC m=+0.237579051 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Dec 06 09:50:57 np0005548788.localdomain podman[244804]: 2025-12-06 09:50:57.062820495 +0000 UTC m=+0.268526993 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:50:57 np0005548788.localdomain podman[244804]: unhealthy
Dec 06 09:50:57 np0005548788.localdomain systemd[1]: Started libpod-conmon-2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.scope.
Dec 06 09:50:57 np0005548788.localdomain podman[244794]: 2025-12-06 09:50:57.16362727 +0000 UTC m=+2.084008442 container exec 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 06 09:50:57 np0005548788.localdomain podman[244794]: 2025-12-06 09:50:57.198671822 +0000 UTC m=+2.119052984 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:50:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.241 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.268 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.269 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.269 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.269 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.270 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:50:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.697 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.952 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.955 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13015MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.956 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:50:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:58.956 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.094 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.094 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:50:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13495 DF PROTO=TCP SPT=46904 DPT=9100 SEQ=2044465080 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F03FF00000000001030307) 
Dec 06 09:50:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.160 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.179 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.179 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.194 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.212 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,HW_CPU_X86_SHA,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NODE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:50:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.289 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:50:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:50:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:50:59 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:50:59 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:59 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:50:59 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:50:59 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:59 np0005548788.localdomain sudo[244774]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.791 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.799 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.821 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.824 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:50:59 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:50:59.824 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:50:59 np0005548788.localdomain sudo[245025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxjkzsoepjrwofqozcikzvufktchrsat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014659.6670418-2683-178233193598710/AnsiballZ_podman_container_exec.py
Dec 06 09:50:59 np0005548788.localdomain sudo[245025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:00 np0005548788.localdomain python3.9[245027]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:51:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:00.758 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:00.760 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:00.760 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:51:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:01.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:01.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:01.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:01.183 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:01.183 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:51:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:51:01 np0005548788.localdomain systemd[1]: libpod-conmon-2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.scope: Deactivated successfully.
Dec 06 09:51:01 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:01 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:01 np0005548788.localdomain podman[244904]: 2025-12-06 09:51:01.970465773 +0000 UTC m=+2.511042759 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 09:51:02 np0005548788.localdomain podman[244904]: 2025-12-06 09:51:02.008875321 +0000 UTC m=+2.549452337 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 09:51:02 np0005548788.localdomain systemd[1]: Started libpod-conmon-2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.scope.
Dec 06 09:51:02 np0005548788.localdomain podman[245028]: 2025-12-06 09:51:02.036058982 +0000 UTC m=+1.799871775 container exec 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:51:02 np0005548788.localdomain podman[245028]: 2025-12-06 09:51:02.069708128 +0000 UTC m=+1.833521001 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:51:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:51:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:03 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:51:03 np0005548788.localdomain sudo[245025]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:03 np0005548788.localdomain podman[245064]: 2025-12-06 09:51:03.192376466 +0000 UTC m=+0.295955913 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:51:03 np0005548788.localdomain podman[245064]: 2025-12-06 09:51:03.231589159 +0000 UTC m=+0.335168576 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:51:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:04 np0005548788.localdomain sudo[245196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xempsaiccksbusevixdtueufvohytrup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014663.332602-2691-111144741647451/AnsiballZ_file.py
Dec 06 09:51:04 np0005548788.localdomain sudo[245196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:04 np0005548788.localdomain python3.9[245198]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:04 np0005548788.localdomain sudo[245196]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:04 np0005548788.localdomain auditd[728]: Audit daemon rotating log files
Dec 06 09:51:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53041 DF PROTO=TCP SPT=57200 DPT=9102 SEQ=1475014878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F055F00000000001030307) 
Dec 06 09:51:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41628 DF PROTO=TCP SPT=39846 DPT=9101 SEQ=1816263762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0562E0000000001030307) 
Dec 06 09:51:04 np0005548788.localdomain sudo[245306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xudhzldpqvnmjcdjaytunhqasmaisqao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014664.63325-2700-258811745077022/AnsiballZ_podman_container_info.py
Dec 06 09:51:04 np0005548788.localdomain sudo[245306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:05 np0005548788.localdomain python3.9[245308]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 06 09:51:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-55120ee4b4cc047c306789233609c6fd6d29f8fc75a55d7a3544aaf3d7f6ad35-merged.mount: Deactivated successfully.
Dec 06 09:51:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-55120ee4b4cc047c306789233609c6fd6d29f8fc75a55d7a3544aaf3d7f6ad35-merged.mount: Deactivated successfully.
Dec 06 09:51:05 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:51:05 np0005548788.localdomain systemd[1]: libpod-conmon-2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.scope: Deactivated successfully.
Dec 06 09:51:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 06 09:51:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 06 09:51:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:51:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41630 DF PROTO=TCP SPT=39846 DPT=9101 SEQ=1816263762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F062310000000001030307) 
Dec 06 09:51:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:51:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b72aa0b501df4a0ecf9e2c06f7d460b982575e1ef57771c6fbb48ab40682d902-merged.mount: Deactivated successfully.
Dec 06 09:51:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b72aa0b501df4a0ecf9e2c06f7d460b982575e1ef57771c6fbb48ab40682d902-merged.mount: Deactivated successfully.
Dec 06 09:51:09 np0005548788.localdomain sudo[245306]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:09 np0005548788.localdomain sudo[245427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bywkmhdjsbxeycsmmmnzkkbviwgdsjwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014669.5755372-2708-113780604957739/AnsiballZ_podman_container_exec.py
Dec 06 09:51:09 np0005548788.localdomain sudo[245427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548788.localdomain python3.9[245429]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:51:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548788.localdomain systemd[1]: Started libpod-conmon-315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.scope.
Dec 06 09:51:10 np0005548788.localdomain podman[245430]: 2025-12-06 09:51:10.243420788 +0000 UTC m=+0.136493638 container exec 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:51:10 np0005548788.localdomain podman[245430]: 2025-12-06 09:51:10.273575984 +0000 UTC m=+0.166648864 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:51:10 np0005548788.localdomain sudo[245427]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:51:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:11 np0005548788.localdomain sudo[245566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcrbbrjmqtocjpubuzhxchwktcdynoew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014671.1737409-2716-103688093032031/AnsiballZ_podman_container_exec.py
Dec 06 09:51:11 np0005548788.localdomain sudo[245566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 06 09:51:11 np0005548788.localdomain systemd[1]: libpod-conmon-315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.scope: Deactivated successfully.
Dec 06 09:51:11 np0005548788.localdomain python3.9[245568]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:51:11 np0005548788.localdomain systemd[1]: Started libpod-conmon-315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.scope.
Dec 06 09:51:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:51:11 np0005548788.localdomain podman[245569]: 2025-12-06 09:51:11.844954345 +0000 UTC m=+0.132360016 container exec 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:51:11 np0005548788.localdomain podman[245569]: 2025-12-06 09:51:11.876926728 +0000 UTC m=+0.164332439 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:51:11 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41631 DF PROTO=TCP SPT=39846 DPT=9101 SEQ=1816263762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F071F10000000001030307) 
Dec 06 09:51:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:51:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6697f986b9c629bb25e894d1903f9f626e3727e4d7efcca4b461d4a3e304ed0e-merged.mount: Deactivated successfully.
Dec 06 09:51:12 np0005548788.localdomain sudo[245566]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:12 np0005548788.localdomain podman[245583]: 2025-12-06 09:51:12.564602907 +0000 UTC m=+0.719134597 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 09:51:12 np0005548788.localdomain podman[245583]: 2025-12-06 09:51:12.634798403 +0000 UTC m=+0.789330103 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 06 09:51:12 np0005548788.localdomain sudo[245722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unuxfxrrtwnnplckviuqllibsvzyxjpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014672.700752-2724-139891387553846/AnsiballZ_file.py
Dec 06 09:51:12 np0005548788.localdomain sudo[245722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:13 np0005548788.localdomain python3.9[245724]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:13 np0005548788.localdomain sudo[245722]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:51:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:51:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7053 DF PROTO=TCP SPT=53534 DPT=9100 SEQ=2728563586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0784F0000000001030307) 
Dec 06 09:51:13 np0005548788.localdomain sudo[245832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eztdiicrnewqknaqjktwrseoyzsuldev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014673.3920267-2733-207376892485653/AnsiballZ_podman_container_info.py
Dec 06 09:51:13 np0005548788.localdomain sudo[245832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:13 np0005548788.localdomain systemd[1]: libpod-conmon-315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.scope: Deactivated successfully.
Dec 06 09:51:13 np0005548788.localdomain python3.9[245834]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 06 09:51:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:14 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:51:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:51:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:51:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91-merged.mount: Deactivated successfully.
Dec 06 09:51:16 np0005548788.localdomain sudo[245832]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:16 np0005548788.localdomain podman[245849]: 2025-12-06 09:51:16.21177181 +0000 UTC m=+0.115285437 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:51:16 np0005548788.localdomain podman[245849]: 2025-12-06 09:51:16.264568682 +0000 UTC m=+0.168082259 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 06 09:51:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7055 DF PROTO=TCP SPT=53534 DPT=9100 SEQ=2728563586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F084700000000001030307) 
Dec 06 09:51:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:51:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:51:17 np0005548788.localdomain sudo[245980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upwzlfevyjktioxbeldpmegdekcbirxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014676.9732547-2741-274406221947821/AnsiballZ_podman_container_exec.py
Dec 06 09:51:17 np0005548788.localdomain sudo[245980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:51:17 np0005548788.localdomain python3.9[245982]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:51:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:19 np0005548788.localdomain sudo[245995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:51:19 np0005548788.localdomain sudo[245995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:51:19 np0005548788.localdomain sudo[245995]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:19 np0005548788.localdomain sudo[246013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:51:19 np0005548788.localdomain sudo[246013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:51:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:19 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:51:19 np0005548788.localdomain systemd[1]: Started libpod-conmon-b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.scope.
Dec 06 09:51:19 np0005548788.localdomain podman[245983]: 2025-12-06 09:51:19.338932227 +0000 UTC m=+1.854289711 container exec b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:51:19 np0005548788.localdomain podman[245983]: 2025-12-06 09:51:19.369767455 +0000 UTC m=+1.885124908 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:51:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48524 DF PROTO=TCP SPT=41990 DPT=9102 SEQ=2907473014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F08F540000000001030307) 
Dec 06 09:51:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:51:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:21 np0005548788.localdomain sudo[245980]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:21 np0005548788.localdomain sudo[246169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xreneqrhgampqjkjgtdyzzoeoarlscyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014681.558065-2749-85329123250369/AnsiballZ_podman_container_exec.py
Dec 06 09:51:21 np0005548788.localdomain sudo[246169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:22 np0005548788.localdomain python3.9[246171]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:51:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48526 DF PROTO=TCP SPT=41990 DPT=9102 SEQ=2907473014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F09B700000000001030307) 
Dec 06 09:51:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:51:22 np0005548788.localdomain systemd[1]: libpod-conmon-b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.scope: Deactivated successfully.
Dec 06 09:51:22 np0005548788.localdomain systemd[1]: Started libpod-conmon-b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.scope.
Dec 06 09:51:22 np0005548788.localdomain podman[246172]: 2025-12-06 09:51:22.867693485 +0000 UTC m=+0.799540205 container exec b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:51:22 np0005548788.localdomain podman[246172]: 2025-12-06 09:51:22.90065344 +0000 UTC m=+0.832500130 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:51:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:23 np0005548788.localdomain sudo[246013]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:23 np0005548788.localdomain sudo[246220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:51:23 np0005548788.localdomain sudo[246220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:51:23 np0005548788.localdomain sudo[246220]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:25 np0005548788.localdomain systemd[1]: libpod-conmon-b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.scope: Deactivated successfully.
Dec 06 09:51:25 np0005548788.localdomain sudo[246169]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:25 np0005548788.localdomain sudo[246345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtkhsduhfndrigcqyzbczxstqedqtnpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014685.4437244-2757-129927353829971/AnsiballZ_file.py
Dec 06 09:51:25 np0005548788.localdomain sudo[246345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:25 np0005548788.localdomain python3.9[246347]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:25 np0005548788.localdomain sudo[246345]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48527 DF PROTO=TCP SPT=41990 DPT=9102 SEQ=2907473014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0AB300000000001030307) 
Dec 06 09:51:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:26 np0005548788.localdomain sudo[246455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgfykykwqvmtrvjmdhnssayslynqgxsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014686.4396358-2766-244526995449262/AnsiballZ_podman_container_info.py
Dec 06 09:51:26 np0005548788.localdomain sudo[246455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:26 np0005548788.localdomain python3.9[246457]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 06 09:51:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:51:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-55120ee4b4cc047c306789233609c6fd6d29f8fc75a55d7a3544aaf3d7f6ad35-merged.mount: Deactivated successfully.
Dec 06 09:51:28 np0005548788.localdomain podman[246468]: 2025-12-06 09:51:28.281401607 +0000 UTC m=+1.298490821 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible)
Dec 06 09:51:28 np0005548788.localdomain podman[246468]: 2025-12-06 09:51:28.291514386 +0000 UTC m=+1.308603570 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 09:51:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7057 DF PROTO=TCP SPT=53534 DPT=9100 SEQ=2728563586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0B3F10000000001030307) 
Dec 06 09:51:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:51:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 06 09:51:29 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:29 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:51:29 np0005548788.localdomain sudo[246455]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:29 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:29 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:51:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:51:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 06 09:51:29 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:29 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:30 np0005548788.localdomain podman[246545]: 2025-12-06 09:51:30.032680558 +0000 UTC m=+0.109533644 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:51:30 np0005548788.localdomain podman[246545]: 2025-12-06 09:51:30.068725441 +0000 UTC m=+0.145578547 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:51:30 np0005548788.localdomain podman[246545]: unhealthy
Dec 06 09:51:30 np0005548788.localdomain podman[246546]: 2025-12-06 09:51:30.110334918 +0000 UTC m=+0.184488869 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git)
Dec 06 09:51:30 np0005548788.localdomain podman[246546]: 2025-12-06 09:51:30.143962435 +0000 UTC m=+0.218116386 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Dec 06 09:51:30 np0005548788.localdomain sudo[246637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvtwurgjcgrnwimmlfywrpkrckjgyluc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014689.4699442-2774-222753744795526/AnsiballZ_podman_container_exec.py
Dec 06 09:51:30 np0005548788.localdomain sudo[246637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:30 np0005548788.localdomain python3.9[246639]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:51:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:51:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:51:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:32 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:51:32 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:51:32 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:51:32 np0005548788.localdomain systemd[1]: Started libpod-conmon-b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.scope.
Dec 06 09:51:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Dec 06 09:51:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Dec 06 09:51:32 np0005548788.localdomain podman[246640]: 2025-12-06 09:51:32.350844325 +0000 UTC m=+1.868461604 container exec b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 09:51:32 np0005548788.localdomain podman[246640]: 2025-12-06 09:51:32.359540208 +0000 UTC m=+1.877157427 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64)
Dec 06 09:51:32 np0005548788.localdomain sudo[246637]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:33 np0005548788.localdomain sudo[246776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luttrdiabsxueokhnidrymbtmnnyiakp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014692.7619085-2782-166966420675354/AnsiballZ_podman_container_exec.py
Dec 06 09:51:33 np0005548788.localdomain sudo[246776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:51:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:51:33 np0005548788.localdomain systemd[1]: libpod-conmon-b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.scope: Deactivated successfully.
Dec 06 09:51:33 np0005548788.localdomain podman[246779]: 2025-12-06 09:51:33.270660633 +0000 UTC m=+0.088367739 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible)
Dec 06 09:51:33 np0005548788.localdomain podman[246779]: 2025-12-06 09:51:33.281807043 +0000 UTC m=+0.099514169 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:51:33 np0005548788.localdomain python3.9[246778]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:51:34 np0005548788.localdomain systemd[1]: tmp-crun.MeWN7Y.mount: Deactivated successfully.
Dec 06 09:51:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Dec 06 09:51:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1415 DF PROTO=TCP SPT=39374 DPT=9101 SEQ=1764078799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0CB5E0000000001030307) 
Dec 06 09:51:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8095 DF PROTO=TCP SPT=35798 DPT=9882 SEQ=629987170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0CBF00000000001030307) 
Dec 06 09:51:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:35 np0005548788.localdomain sshd[246807]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:51:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:51:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:36 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:36 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:36 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:51:36 np0005548788.localdomain systemd[1]: Started libpod-conmon-b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.scope.
Dec 06 09:51:36 np0005548788.localdomain podman[246796]: 2025-12-06 09:51:36.09343963 +0000 UTC m=+2.752115247 container exec b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, architecture=x86_64)
Dec 06 09:51:36 np0005548788.localdomain podman[246809]: 2025-12-06 09:51:36.121306046 +0000 UTC m=+0.174469426 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:51:36 np0005548788.localdomain podman[246809]: 2025-12-06 09:51:36.15392937 +0000 UTC m=+0.207092710 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:51:36 np0005548788.localdomain podman[246796]: 2025-12-06 09:51:36.177790401 +0000 UTC m=+2.836465988 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350)
Dec 06 09:51:36 np0005548788.localdomain sshd[246807]: Received disconnect from 148.227.3.232 port 59776:11: Bye Bye [preauth]
Dec 06 09:51:36 np0005548788.localdomain sshd[246807]: Disconnected from authenticating user root 148.227.3.232 port 59776 [preauth]
Dec 06 09:51:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1417 DF PROTO=TCP SPT=39374 DPT=9101 SEQ=1764078799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0D7700000000001030307) 
Dec 06 09:51:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:38 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:51:38 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:38 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:38 np0005548788.localdomain sudo[246776]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:38 np0005548788.localdomain sudo[246959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smbdudqbvzqlvuamzsxcwszlngpfmomx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014698.6639903-2790-64677586359811/AnsiballZ_file.py
Dec 06 09:51:38 np0005548788.localdomain sudo[246959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:39 np0005548788.localdomain python3.9[246961]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:39 np0005548788.localdomain sudo[246959]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:39 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:39 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:39 np0005548788.localdomain systemd[1]: libpod-conmon-b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.scope: Deactivated successfully.
Dec 06 09:51:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51735 DF PROTO=TCP SPT=33510 DPT=9105 SEQ=3229338825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0E1F00000000001030307) 
Dec 06 09:51:40 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a2e885810f243f9dc065bb26a649bea50c2f3286c2facda521bb1a0b66a38d91-merged.mount: Deactivated successfully.
Dec 06 09:51:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:51:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:51:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:51:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:51:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:51:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:51:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3b38e1b3ae4684200fe2ade3176809882413b165be2193c39c12f1ac0f693972-merged.mount: Deactivated successfully.
Dec 06 09:51:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14640 DF PROTO=TCP SPT=44014 DPT=9100 SEQ=2857095558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0ED7F0000000001030307) 
Dec 06 09:51:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:51:44 np0005548788.localdomain podman[246979]: 2025-12-06 09:51:44.394703426 +0000 UTC m=+0.096077540 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 09:51:44 np0005548788.localdomain podman[246979]: 2025-12-06 09:51:44.427394034 +0000 UTC m=+0.128768108 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:51:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:51:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:45 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:51:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14642 DF PROTO=TCP SPT=44014 DPT=9100 SEQ=2857095558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F0F9710000000001030307) 
Dec 06 09:51:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:51:47.411 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:51:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:51:47.412 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:51:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:51:47.412 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:51:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:51:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b131d40d9f7e4c77f700140905b9f02aae8796a802130306ce053e341f06500e-merged.mount: Deactivated successfully.
Dec 06 09:51:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:51:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:51:49 np0005548788.localdomain systemd[1]: tmp-crun.LmDR9P.mount: Deactivated successfully.
Dec 06 09:51:49 np0005548788.localdomain podman[246998]: 2025-12-06 09:51:49.407300535 +0000 UTC m=+0.107030595 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:51:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4622 DF PROTO=TCP SPT=47176 DPT=9102 SEQ=2509426486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F104840000000001030307) 
Dec 06 09:51:49 np0005548788.localdomain podman[246998]: 2025-12-06 09:51:49.450748511 +0000 UTC m=+0.150478571 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec 06 09:51:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:51:50 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:51:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:51:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4624 DF PROTO=TCP SPT=47176 DPT=9102 SEQ=2509426486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F110700000000001030307) 
Dec 06 09:51:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-1e604deea57dbda554a168861cff1238f93b8c6c69c863c43aed37d9d99c5fed-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:52 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:51:56 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:56.177 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4625 DF PROTO=TCP SPT=47176 DPT=9102 SEQ=2509426486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F120300000000001030307) 
Dec 06 09:51:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9f8522de440bc74dcb16bf54418e2bbfa2c88457c079b686ca75647399637b8a-merged.mount: Deactivated successfully.
Dec 06 09:51:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:57.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:57.182 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:51:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:57.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:51:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:51:57.208 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:51:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:58 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:58 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:51:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:59 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14644 DF PROTO=TCP SPT=44014 DPT=9100 SEQ=2857095558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F129F00000000001030307) 
Dec 06 09:51:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:51:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:59 np0005548788.localdomain systemd[1]: tmp-crun.2FzvzM.mount: Deactivated successfully.
Dec 06 09:51:59 np0005548788.localdomain podman[247023]: 2025-12-06 09:51:59.656417522 +0000 UTC m=+0.094795621 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:51:59 np0005548788.localdomain podman[247023]: 2025-12-06 09:51:59.667484899 +0000 UTC m=+0.105863008 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 09:51:59 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:59 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:59 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:59 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.183 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.209 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.210 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.210 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.210 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.211 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:52:00 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:00 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.715 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.919 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.920 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13102MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.921 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.921 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.978 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:52:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:00.979 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:52:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:01.000 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:52:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:01.470 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:52:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:01.478 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:52:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:01.506 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:52:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:01.509 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:52:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:01.510 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:52:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:52:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:52:02 np0005548788.localdomain podman[247087]: 2025-12-06 09:52:02.38859016 +0000 UTC m=+0.093007034 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64)
Dec 06 09:52:02 np0005548788.localdomain podman[247087]: 2025-12-06 09:52:02.401562188 +0000 UTC m=+0.105979062 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 09:52:02 np0005548788.localdomain podman[247086]: 2025-12-06 09:52:02.370508512 +0000 UTC m=+0.082364700 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:52:02 np0005548788.localdomain podman[247086]: 2025-12-06 09:52:02.453743217 +0000 UTC m=+0.165599355 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:52:02 np0005548788.localdomain podman[247086]: unhealthy
Dec 06 09:52:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:02.510 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:02.511 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:02.511 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:52:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3b38e1b3ae4684200fe2ade3176809882413b165be2193c39c12f1ac0f693972-merged.mount: Deactivated successfully.
Dec 06 09:52:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:03.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:03.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:03 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:52:03 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:52:03 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:52:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3b38e1b3ae4684200fe2ade3176809882413b165be2193c39c12f1ac0f693972-merged.mount: Deactivated successfully.
Dec 06 09:52:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:52:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:52:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4626 DF PROTO=TCP SPT=47176 DPT=9102 SEQ=2509426486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F13FF00000000001030307) 
Dec 06 09:52:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23426 DF PROTO=TCP SPT=52718 DPT=9101 SEQ=919948797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1408E0000000001030307) 
Dec 06 09:52:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:05 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:05 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:05 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-07b91d8553238d978835abe46e21a46235f67e708d84225f3bfb3e48b6b851c7-merged.mount: Deactivated successfully.
Dec 06 09:52:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:52:06 np0005548788.localdomain podman[247127]: 2025-12-06 09:52:06.275089087 +0000 UTC m=+0.104127853 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 09:52:06 np0005548788.localdomain podman[247127]: 2025-12-06 09:52:06.290783961 +0000 UTC m=+0.119822747 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:52:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:52:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b131d40d9f7e4c77f700140905b9f02aae8796a802130306ce053e341f06500e-merged.mount: Deactivated successfully.
Dec 06 09:52:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b131d40d9f7e4c77f700140905b9f02aae8796a802130306ce053e341f06500e-merged.mount: Deactivated successfully.
Dec 06 09:52:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:52:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23428 DF PROTO=TCP SPT=52718 DPT=9101 SEQ=919948797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F14CB00000000001030307) 
Dec 06 09:52:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:52:08 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:52:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:52:08 np0005548788.localdomain podman[247146]: 2025-12-06 09:52:08.79326334 +0000 UTC m=+0.117271827 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:52:08 np0005548788.localdomain podman[247146]: 2025-12-06 09:52:08.831595785 +0000 UTC m=+0.155604192 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:52:09 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:10 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:10 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:52:10 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55324 DF PROTO=TCP SPT=44116 DPT=9105 SEQ=4235590202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F157F10000000001030307) 
Dec 06 09:52:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:12 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:12 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:13 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:13 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:13 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:13 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:13 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10301 DF PROTO=TCP SPT=57776 DPT=9100 SEQ=3291861304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F162B00000000001030307) 
Dec 06 09:52:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:52:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:52:15 np0005548788.localdomain podman[247169]: 2025-12-06 09:52:15.945126494 +0000 UTC m=+0.099029213 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:52:15 np0005548788.localdomain podman[247169]: 2025-12-06 09:52:15.98253702 +0000 UTC m=+0.136439769 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 09:52:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5d687d9df5c689c1a5e0b342e3438942cdf880bd73e43f7e4784d4fa7fbfaa52-merged.mount: Deactivated successfully.
Dec 06 09:52:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:16 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10303 DF PROTO=TCP SPT=57776 DPT=9100 SEQ=3291861304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F16EB00000000001030307) 
Dec 06 09:52:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9f8522de440bc74dcb16bf54418e2bbfa2c88457c079b686ca75647399637b8a-merged.mount: Deactivated successfully.
Dec 06 09:52:16 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:52:16 np0005548788.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:16 np0005548788.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:16 np0005548788.localdomain podman[240078]: time="2025-12-06T09:52:16Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7/merged: invalid argument"
Dec 06 09:52:16 np0005548788.localdomain podman[240078]: time="2025-12-06T09:52:16Z" level=error msg="Getting root fs size for \"e1d97f740d6a527f38f1c38f998a6190aed624bb2e344bc38d54370a831e8bbf\": getting diffsize of layer \"cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7\" and its parent \"d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c\": creating overlay mount to /var/lib/containers/storage/overlay/cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/Q4JFRVGDZGIO2PJAKMI7B5CHKQ:/var/lib/containers/storage/overlay/l/MA7UDNJTZNFDVWMCIVVA55RYY5:/var/lib/containers/storage/overlay/l/H7YOTL3SFJRONSRX22VQHLOVZ2:/var/lib/containers/storage/overlay/l/J3YUH3ZFFKJKK3VHQQB2HTHNU7,upperdir=/var/lib/containers/storage/overlay/cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7/diff,workdir=/var/lib/containers/storage/overlay/cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7/work,nodev,metacopy=on\": no such file or directory"
Dec 06 09:52:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c957b2bd9594b3144e0926de62e77156dbabeefc8e2acf756f315a98b85b5f52-merged.mount: Deactivated successfully.
Dec 06 09:52:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29034 DF PROTO=TCP SPT=33760 DPT=9102 SEQ=121088017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F179B50000000001030307) 
Dec 06 09:52:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:52:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:20 np0005548788.localdomain systemd[1]: tmp-crun.CFOg9h.mount: Deactivated successfully.
Dec 06 09:52:20 np0005548788.localdomain podman[247188]: 2025-12-06 09:52:20.273358874 +0000 UTC m=+0.090890087 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:52:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:20 np0005548788.localdomain podman[247188]: 2025-12-06 09:52:20.342611501 +0000 UTC m=+0.160142674 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 09:52:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:21 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:52:21 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29036 DF PROTO=TCP SPT=33760 DPT=9102 SEQ=121088017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F185B00000000001030307) 
Dec 06 09:52:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:22 np0005548788.localdomain sshd[247213]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:23 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:23 np0005548788.localdomain sudo[247215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:52:23 np0005548788.localdomain sudo[247215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:52:23 np0005548788.localdomain sudo[247215]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:23 np0005548788.localdomain sudo[247233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:52:23 np0005548788.localdomain sudo[247233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:52:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-07b91d8553238d978835abe46e21a46235f67e708d84225f3bfb3e48b6b851c7-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:25 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:26 np0005548788.localdomain sudo[247233]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:26 np0005548788.localdomain sshd[247213]: Received disconnect from 45.78.219.195 port 57520:11: Bye Bye [preauth]
Dec 06 09:52:26 np0005548788.localdomain sshd[247213]: Disconnected from authenticating user root 45.78.219.195 port 57520 [preauth]
Dec 06 09:52:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29037 DF PROTO=TCP SPT=33760 DPT=9102 SEQ=121088017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F195700000000001030307) 
Dec 06 09:52:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:27 np0005548788.localdomain sudo[247283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:52:27 np0005548788.localdomain sudo[247283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:52:27 np0005548788.localdomain sudo[247283]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:52:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:52:28 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10305 DF PROTO=TCP SPT=57776 DPT=9100 SEQ=3291861304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F19DF00000000001030307) 
Dec 06 09:52:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df-merged.mount: Deactivated successfully.
Dec 06 09:52:30 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:30 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:52:30 np0005548788.localdomain podman[247301]: 2025-12-06 09:52:30.171321744 +0000 UTC m=+0.102923565 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute)
Dec 06 09:52:30 np0005548788.localdomain podman[247301]: 2025-12-06 09:52:30.211527908 +0000 UTC m=+0.143129759 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:52:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:32 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:32 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:52:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:52:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:52:34 np0005548788.localdomain podman[247320]: 2025-12-06 09:52:34.213121842 +0000 UTC m=+0.082856614 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:52:34 np0005548788.localdomain podman[247320]: 2025-12-06 09:52:34.23370023 +0000 UTC m=+0.103435012 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:52:34 np0005548788.localdomain podman[247320]: unhealthy
Dec 06 09:52:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34346 DF PROTO=TCP SPT=46144 DPT=9101 SEQ=1567719217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1B5BE0000000001030307) 
Dec 06 09:52:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59190 DF PROTO=TCP SPT=56940 DPT=9882 SEQ=1906470182 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1B5F00000000001030307) 
Dec 06 09:52:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:34 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:35 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:52:35 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:52:35 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:35 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:35 np0005548788.localdomain podman[247321]: 2025-12-06 09:52:35.298786004 +0000 UTC m=+1.165086628 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Dec 06 09:52:35 np0005548788.localdomain podman[247321]: 2025-12-06 09:52:35.311837573 +0000 UTC m=+1.178138137 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:52:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:36 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:36 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:36 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:52:36 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:37 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34348 DF PROTO=TCP SPT=46144 DPT=9101 SEQ=1567719217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1C1B00000000001030307) 
Dec 06 09:52:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:52:38 np0005548788.localdomain systemd[1]: tmp-crun.f38D2R.mount: Deactivated successfully.
Dec 06 09:52:38 np0005548788.localdomain podman[247361]: 2025-12-06 09:52:38.778381083 +0000 UTC m=+0.107574242 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 09:52:38 np0005548788.localdomain podman[247361]: 2025-12-06 09:52:38.79072615 +0000 UTC m=+0.119919319 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 06 09:52:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:52:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5d687d9df5c689c1a5e0b342e3438942cdf880bd73e43f7e4784d4fa7fbfaa52-merged.mount: Deactivated successfully.
Dec 06 09:52:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:40 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61253 DF PROTO=TCP SPT=50024 DPT=9105 SEQ=2818832455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1CBF00000000001030307) 
Dec 06 09:52:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d-merged.mount: Deactivated successfully.
Dec 06 09:52:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:52:40 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:52:40 np0005548788.localdomain podman[247380]: 2025-12-06 09:52:40.844780097 +0000 UTC m=+0.079932473 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:52:40 np0005548788.localdomain podman[247380]: 2025-12-06 09:52:40.882583465 +0000 UTC m=+0.117735791 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:52:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:43 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27002 DF PROTO=TCP SPT=48478 DPT=9100 SEQ=3434844840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1D7E40000000001030307) 
Dec 06 09:52:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:43 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:52:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:45 np0005548788.localdomain sudo[247493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qciwlstitsgjrlsuagnjvehxbnjtszjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014765.7026248-3035-10787355553638/AnsiballZ_file.py
Dec 06 09:52:45 np0005548788.localdomain sudo[247493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:46 np0005548788.localdomain python3.9[247495]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:46 np0005548788.localdomain sudo[247493]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:46 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27004 DF PROTO=TCP SPT=48478 DPT=9100 SEQ=3434844840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1E3F00000000001030307) 
Dec 06 09:52:47 np0005548788.localdomain sudo[247603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgwwslwtqhzqlsqqdkfzuosfkymdtmnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014766.6011708-3062-138654983280671/AnsiballZ_stat.py
Dec 06 09:52:47 np0005548788.localdomain sudo[247603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:52:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548788.localdomain podman[247606]: 2025-12-06 09:52:47.169585506 +0000 UTC m=+0.106944342 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:52:47 np0005548788.localdomain podman[247606]: 2025-12-06 09:52:47.17479061 +0000 UTC m=+0.112149466 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:52:47 np0005548788.localdomain python3.9[247605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:47 np0005548788.localdomain sudo[247603]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:52:47.412 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:52:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:52:47.413 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:52:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:52:47.413 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:52:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:52:47 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:47 np0005548788.localdomain sudo[247705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfshufkhpdbuqyhmwoncbopgcpqagctt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014766.6011708-3062-138654983280671/AnsiballZ_copy.py
Dec 06 09:52:47 np0005548788.localdomain sudo[247705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:47 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:47 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:47 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:47 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:47 np0005548788.localdomain python3.9[247707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014766.6011708-3062-138654983280671/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:47 np0005548788.localdomain sudo[247705]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:48 np0005548788.localdomain sudo[247815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjiagkcodcmaehnkwonjzwjswyzzmzbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014768.3024375-3110-238293230097358/AnsiballZ_file.py
Dec 06 09:52:48 np0005548788.localdomain sudo[247815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:48 np0005548788.localdomain python3.9[247817]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:48 np0005548788.localdomain sudo[247815]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:49 np0005548788.localdomain sudo[247925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-albccuovfoebuauytkfnuziraqznfhvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014769.0502706-3134-94328874193480/AnsiballZ_stat.py
Dec 06 09:52:49 np0005548788.localdomain sudo[247925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10028 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=1956773802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1EEE40000000001030307) 
Dec 06 09:52:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c957b2bd9594b3144e0926de62e77156dbabeefc8e2acf756f315a98b85b5f52-merged.mount: Deactivated successfully.
Dec 06 09:52:49 np0005548788.localdomain python3.9[247927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c957b2bd9594b3144e0926de62e77156dbabeefc8e2acf756f315a98b85b5f52-merged.mount: Deactivated successfully.
Dec 06 09:52:49 np0005548788.localdomain sudo[247925]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:49 np0005548788.localdomain sudo[247982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcxapovvgfacdvzhanfvghlptuvenioz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014769.0502706-3134-94328874193480/AnsiballZ_file.py
Dec 06 09:52:49 np0005548788.localdomain sudo[247982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:50 np0005548788.localdomain python3.9[247984]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:50 np0005548788.localdomain sudo[247982]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:50 np0005548788.localdomain sudo[248092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgecgshixjamhsijsrcwhdcgepkqpgtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014770.3161275-3170-162912553033035/AnsiballZ_stat.py
Dec 06 09:52:50 np0005548788.localdomain sudo[248092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:50 np0005548788.localdomain python3.9[248094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:50 np0005548788.localdomain sudo[248092]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:51 np0005548788.localdomain sudo[248149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lypwcsdcjyvaccqfkushzyacoppqfohn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014770.3161275-3170-162912553033035/AnsiballZ_file.py
Dec 06 09:52:51 np0005548788.localdomain sudo[248149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:51 np0005548788.localdomain python3.9[248151]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.4cu0f0px recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:51 np0005548788.localdomain sudo[248149]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7-merged.mount: Deactivated successfully.
Dec 06 09:52:51 np0005548788.localdomain sudo[248259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owtfpkpjunihzlgopmreftoyjgntwvfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014771.525414-3206-18921630970974/AnsiballZ_stat.py
Dec 06 09:52:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:52:51 np0005548788.localdomain sudo[248259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:51 np0005548788.localdomain systemd[1]: tmp-crun.stHCK7.mount: Deactivated successfully.
Dec 06 09:52:51 np0005548788.localdomain podman[248261]: 2025-12-06 09:52:51.97198496 +0000 UTC m=+0.114512240 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:52:52 np0005548788.localdomain python3.9[248262]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:52 np0005548788.localdomain podman[248261]: 2025-12-06 09:52:52.053642626 +0000 UTC m=+0.196169886 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 09:52:52 np0005548788.localdomain sudo[248259]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:52 np0005548788.localdomain sudo[248342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awhsbjnhvdqaqatggftkryzvplwyusjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014771.525414-3206-18921630970974/AnsiballZ_file.py
Dec 06 09:52:52 np0005548788.localdomain sudo[248342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:52 np0005548788.localdomain python3.9[248344]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:52 np0005548788.localdomain sudo[248342]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10030 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=1956773802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F1FAF00000000001030307) 
Dec 06 09:52:53 np0005548788.localdomain sudo[248452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntsdhnuantafgmuaqaetsnegdvfebgwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014772.7843504-3245-138160460184643/AnsiballZ_command.py
Dec 06 09:52:53 np0005548788.localdomain sudo[248452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:53 np0005548788.localdomain python3.9[248454]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:52:53 np0005548788.localdomain sudo[248452]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:54 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:54 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:54 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:52:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:54 np0005548788.localdomain sudo[248563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaqablwsrubrrizwmhfgptrbdpcbepgh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014774.2944262-3269-96014981082184/AnsiballZ_edpm_nftables_from_files.py
Dec 06 09:52:54 np0005548788.localdomain sudo[248563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:55 np0005548788.localdomain python3[248565]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:52:55 np0005548788.localdomain sudo[248563]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:55 np0005548788.localdomain podman[240078]: time="2025-12-06T09:52:55Z" level=error msg="Getting root fs size for \"e62048566cd1c7d3a1c2840ab6c3c5542bcbdd8474a644a29f7b75ba551ea199\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Dec 06 09:52:55 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:55 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:55 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:55 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10031 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=1956773802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F20AB00000000001030307) 
Dec 06 09:52:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:56 np0005548788.localdomain sudo[248673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqlhlqwnfphznoodtqxzcmyzbxzxsjsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014775.236335-3293-127328874029887/AnsiballZ_stat.py
Dec 06 09:52:56 np0005548788.localdomain sudo[248673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:56 np0005548788.localdomain python3.9[248675]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:56 np0005548788.localdomain sudo[248673]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:57.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:57.182 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:52:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:57.182 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:52:57 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:52:57.197 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:52:57 np0005548788.localdomain sudo[248730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odcdvmnjwkponlxbpqqlqcxjhnzllyer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014775.236335-3293-127328874029887/AnsiballZ_file.py
Dec 06 09:52:57 np0005548788.localdomain sudo[248730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:57 np0005548788.localdomain sshd[248733]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:57 np0005548788.localdomain python3.9[248732]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:57 np0005548788.localdomain sudo[248730]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:58 np0005548788.localdomain sudo[248842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkjmracwsthdizmzfihflbjdepgqcvqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014777.6812077-3329-135075804102044/AnsiballZ_stat.py
Dec 06 09:52:58 np0005548788.localdomain sudo[248842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:58 np0005548788.localdomain python3.9[248844]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:58 np0005548788.localdomain sudo[248842]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6ffe2488499d51ab122e14390d43a8a544352d677e189fc91feb36bff22402df-merged.mount: Deactivated successfully.
Dec 06 09:52:58 np0005548788.localdomain sudo[248899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykoffieesdlllqgyfbqtraypcpjwkpqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014777.6812077-3329-135075804102044/AnsiballZ_file.py
Dec 06 09:52:58 np0005548788.localdomain sudo[248899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:58 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:58 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:58 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:58 np0005548788.localdomain python3.9[248901]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:58 np0005548788.localdomain sudo[248899]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:58 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27006 DF PROTO=TCP SPT=48478 DPT=9100 SEQ=3434844840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F213F00000000001030307) 
Dec 06 09:52:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:59 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:59 np0005548788.localdomain sudo[249009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkrpyagvvppcqmhcdckxtchnncttqfjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014779.0144584-3365-176153715166086/AnsiballZ_stat.py
Dec 06 09:52:59 np0005548788.localdomain sudo[249009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:59 np0005548788.localdomain python3.9[249011]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:59 np0005548788.localdomain sudo[249009]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:59 np0005548788.localdomain sudo[249066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtovchzttotfmpvsmvhmrllibqyohyqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014779.0144584-3365-176153715166086/AnsiballZ_file.py
Dec 06 09:52:59 np0005548788.localdomain sudo[249066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:00 np0005548788.localdomain python3.9[249068]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:00 np0005548788.localdomain sudo[249066]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:00.190 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:00 np0005548788.localdomain sudo[249176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgxewfiywypdejoopzntmdgaiotfbocj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014780.3461132-3401-190937874511888/AnsiballZ_stat.py
Dec 06 09:53:00 np0005548788.localdomain sudo[249176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:00 np0005548788.localdomain python3.9[249178]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:00 np0005548788.localdomain sudo[249176]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:01 np0005548788.localdomain sudo[249233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmwfydkfqagebvoyrgxvkkdpfamjcfqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014780.3461132-3401-190937874511888/AnsiballZ_file.py
Dec 06 09:53:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:01.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:01.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:01 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:01.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:53:01 np0005548788.localdomain sudo[249233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:01 np0005548788.localdomain python3.9[249235]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:01 np0005548788.localdomain sudo[249233]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b0d334edc17c9aa8f0b180da1f8a718e4ac8b472875066b766a2e2e11ebb80c3-merged.mount: Deactivated successfully.
Dec 06 09:53:02 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:53:02 np0005548788.localdomain sudo[249343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fotlodubuubplvvedosdvaztzjvxchix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014781.7248662-3437-64635611596658/AnsiballZ_stat.py
Dec 06 09:53:02 np0005548788.localdomain sudo[249343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.211 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.212 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.212 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.212 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.213 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:53:02 np0005548788.localdomain python3.9[249345]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:02 np0005548788.localdomain sudo[249343]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.678 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:53:02 np0005548788.localdomain sudo[249455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcjrmewovavntvwfmocjddvuchianzdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014781.7248662-3437-64635611596658/AnsiballZ_copy.py
Dec 06 09:53:02 np0005548788.localdomain sudo[249455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.855 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.857 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=13039MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.858 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.859 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:53:02 np0005548788.localdomain python3.9[249457]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014781.7248662-3437-64635611596658/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:02 np0005548788.localdomain sudo[249455]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.946 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.946 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:53:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:02.973 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:53:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:53:03 np0005548788.localdomain podman[249477]: 2025-12-06 09:53:03.203904584 +0000 UTC m=+0.108421619 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 06 09:53:03 np0005548788.localdomain podman[249477]: 2025-12-06 09:53:03.245520542 +0000 UTC m=+0.150037567 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:53:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:03.422 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:53:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:03.429 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:53:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:53:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:03.456 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:53:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:03.459 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:53:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:03.460 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:53:03 np0005548788.localdomain sshd[248733]: Received disconnect from 45.78.194.186 port 47260:11: Bye Bye [preauth]
Dec 06 09:53:03 np0005548788.localdomain sshd[248733]: Disconnected from authenticating user root 45.78.194.186 port 47260 [preauth]
Dec 06 09:53:03 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:53:03 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:53:03 np0005548788.localdomain sudo[249606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iynepyqxsqxoerrgidrkpejyxucetscx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014783.2753782-3482-133276607277149/AnsiballZ_file.py
Dec 06 09:53:03 np0005548788.localdomain sudo[249606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:03 np0005548788.localdomain python3.9[249608]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:03 np0005548788.localdomain sudo[249606]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:04 np0005548788.localdomain sudo[249716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfwvmjsyayvxfpesxpfggqmfopumkmyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014784.047604-3506-215561750769197/AnsiballZ_command.py
Dec 06 09:53:04 np0005548788.localdomain sudo[249716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:04.460 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:04 np0005548788.localdomain python3.9[249718]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:53:04 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:04 np0005548788.localdomain sudo[249716]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45952 DF PROTO=TCP SPT=40838 DPT=9101 SEQ=3909095480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F22AEE0000000001030307) 
Dec 06 09:53:05 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62832 DF PROTO=TCP SPT=51152 DPT=9882 SEQ=2816698620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F22BF00000000001030307) 
Dec 06 09:53:05 np0005548788.localdomain sudo[249829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgfjqbbmpfyhtqetuqpdnblduioizkec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014784.7878327-3530-143874106835533/AnsiballZ_blockinfile.py
Dec 06 09:53:05 np0005548788.localdomain sudo[249829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:05.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:53:05 np0005548788.localdomain python3.9[249831]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:05 np0005548788.localdomain podman[249832]: 2025-12-06 09:53:05.387649316 +0000 UTC m=+0.059609805 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:53:05 np0005548788.localdomain sudo[249829]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:05 np0005548788.localdomain podman[249832]: 2025-12-06 09:53:05.401699198 +0000 UTC m=+0.073659717 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:53:05 np0005548788.localdomain podman[249832]: unhealthy
Dec 06 09:53:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:05 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:05 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:53:05 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:53:05 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:53:05 np0005548788.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:53:06 np0005548788.localdomain sudo[249960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfmiqklzodukizqlufabdtsjwrpnonyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014785.785349-3557-186571017376319/AnsiballZ_command.py
Dec 06 09:53:06 np0005548788.localdomain sudo[249960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:06 np0005548788.localdomain python3.9[249962]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:53:06 np0005548788.localdomain sudo[249960]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:06 np0005548788.localdomain sudo[250071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnhljzuppsdlkclypikkffmsbjldcjeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014786.5268936-3581-145905080993101/AnsiballZ_stat.py
Dec 06 09:53:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:53:06 np0005548788.localdomain sudo[250071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:53:06 np0005548788.localdomain podman[250073]: 2025-12-06 09:53:06.990649856 +0000 UTC m=+0.095674207 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, 
url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 09:53:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-75558001a48f3327b2401ddbe77f60b90335541a0f5b86a4118ca1d021773542-merged.mount: Deactivated successfully.
Dec 06 09:53:07 np0005548788.localdomain podman[250073]: 2025-12-06 09:53:07.029191827 +0000 UTC m=+0.134216198 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, container_name=openstack_network_exporter, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public)
Dec 06 09:53:07 np0005548788.localdomain python3.9[250074]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:53:07 np0005548788.localdomain sudo[250071]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:07 np0005548788.localdomain podman[240078]: time="2025-12-06T09:53:07Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\""
Dec 06 09:53:07 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:47:33 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1"
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.489 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:53:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45954 DF PROTO=TCP SPT=40838 DPT=9101 SEQ=3909095480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F236F00000000001030307) 
Dec 06 09:53:08 np0005548788.localdomain sudo[250203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqxouykhfregyuvnkgbztcmzriculpif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014787.8636367-3605-22668222079922/AnsiballZ_command.py
Dec 06 09:53:08 np0005548788.localdomain sudo[250203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:08 np0005548788.localdomain python3.9[250205]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:53:08 np0005548788.localdomain sudo[250203]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-39242897018fbece3ba082861febd3dccb972b1a8576e686a582ad627dc9df9d-merged.mount: Deactivated successfully.
Dec 06 09:53:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:53:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:53:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:53:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:53:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:53:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:53:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:53:08 np0005548788.localdomain sudo[250319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qebhjlmvvidogxkwwhvcwhvmasbatwku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014788.5961735-3629-118073881106041/AnsiballZ_file.py
Dec 06 09:53:08 np0005548788.localdomain sudo[250319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:08 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:53:09 np0005548788.localdomain python3.9[250321]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:09 np0005548788.localdomain sudo[250319]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:09 np0005548788.localdomain sshd[230233]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:53:09 np0005548788.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Dec 06 09:53:09 np0005548788.localdomain systemd[1]: session-56.scope: Consumed 1min 34.413s CPU time.
Dec 06 09:53:09 np0005548788.localdomain systemd-logind[765]: Session 56 logged out. Waiting for processes to exit.
Dec 06 09:53:09 np0005548788.localdomain systemd-logind[765]: Removed session 56.
Dec 06 09:53:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:53:11 np0005548788.localdomain systemd[1]: tmp-crun.I5sdGm.mount: Deactivated successfully.
Dec 06 09:53:11 np0005548788.localdomain podman[250339]: 2025-12-06 09:53:11.277720703 +0000 UTC m=+0.101478370 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Dec 06 09:53:11 np0005548788.localdomain podman[250339]: 2025-12-06 09:53:11.29193341 +0000 UTC m=+0.115691097 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 09:53:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:12 np0005548788.localdomain sshd[250358]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:53:12 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:53:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:53:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:14 np0005548788.localdomain podman[250359]: 2025-12-06 09:53:14.051358125 +0000 UTC m=+0.113394475 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:53:14 np0005548788.localdomain podman[250359]: 2025-12-06 09:53:14.062632239 +0000 UTC m=+0.124668519 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:53:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:14 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:53:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:16 np0005548788.localdomain sshd[250380]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:53:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:16 np0005548788.localdomain sshd[250380]: Accepted publickey for zuul from 192.168.122.30 port 39890 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:53:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:16 np0005548788.localdomain systemd-logind[765]: New session 57 of user zuul.
Dec 06 09:53:16 np0005548788.localdomain systemd[1]: Started Session 57 of User zuul.
Dec 06 09:53:16 np0005548788.localdomain sshd[250380]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:53:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:17 np0005548788.localdomain sudo[250491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igjaaqlakdfhxsgvvbtawdwdreahiazw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014796.8062372-26-60362354005851/AnsiballZ_file.py
Dec 06 09:53:17 np0005548788.localdomain sudo[250491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:17 np0005548788.localdomain python3.9[250493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:17 np0005548788.localdomain sudo[250491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:53:18 np0005548788.localdomain podman[250565]: 2025-12-06 09:53:18.020377575 +0000 UTC m=+0.093338304 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 06 09:53:18 np0005548788.localdomain podman[250565]: 2025-12-06 09:53:18.02849479 +0000 UTC m=+0.101455579 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:53:18 np0005548788.localdomain sudo[250618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqmdgjqpjfxzitibyqfrmeqawbbabtuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014797.773769-26-83831682506044/AnsiballZ_file.py
Dec 06 09:53:18 np0005548788.localdomain sudo[250618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:18 np0005548788.localdomain python3.9[250620]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:18 np0005548788.localdomain sudo[250618]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:18 np0005548788.localdomain sudo[250728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtismbbwzjfiyqljoffyemuqaymwymej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014798.4271815-26-188475284265089/AnsiballZ_file.py
Dec 06 09:53:18 np0005548788.localdomain sudo[250728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:18 np0005548788.localdomain python3.9[250730]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:18 np0005548788.localdomain sudo[250728]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45989 DF PROTO=TCP SPT=53510 DPT=9102 SEQ=1935502135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F264140000000001030307) 
Dec 06 09:53:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:19 np0005548788.localdomain python3.9[250838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-332d257016d793510f24c0e1bad47d346cdd257645bff71a8f039983cfc7a0b7-merged.mount: Deactivated successfully.
Dec 06 09:53:20 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:53:20 np0005548788.localdomain python3.9[250924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014799.1479158-104-161030250661266/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45990 DF PROTO=TCP SPT=53510 DPT=9102 SEQ=1935502135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F268300000000001030307) 
Dec 06 09:53:21 np0005548788.localdomain python3.9[251032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:21 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10033 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=1956773802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F26BF10000000001030307) 
Dec 06 09:53:21 np0005548788.localdomain python3.9[251118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014800.624324-149-147413863320543/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45991 DF PROTO=TCP SPT=53510 DPT=9102 SEQ=1935502135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F270300000000001030307) 
Dec 06 09:53:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:23 np0005548788.localdomain python3.9[251226]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29040 DF PROTO=TCP SPT=33760 DPT=9102 SEQ=121088017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F273F00000000001030307) 
Dec 06 09:53:23 np0005548788.localdomain python3.9[251312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014801.9078772-149-26942796898004/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:24 np0005548788.localdomain python3.9[251420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:25 np0005548788.localdomain python3.9[251506]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014803.9257412-149-17725777042074/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=1ebc629ffcdf3bb28ec78d56bc2d671366cb24e5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:53:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:25 np0005548788.localdomain podman[251507]: 2025-12-06 09:53:25.143781684 +0000 UTC m=+0.068306499 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:53:25 np0005548788.localdomain podman[251507]: 2025-12-06 09:53:25.226316497 +0000 UTC m=+0.150841312 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 09:53:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:25 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:53:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45992 DF PROTO=TCP SPT=53510 DPT=9102 SEQ=1935502135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F27FF00000000001030307) 
Dec 06 09:53:26 np0005548788.localdomain python3.9[251638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:27 np0005548788.localdomain python3.9[251724]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014806.0940115-323-154174191947055/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=b14e40a972b9e05d0e95a7e875b3201eda2c4b6d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:27 np0005548788.localdomain sudo[251794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:53:27 np0005548788.localdomain sudo[251794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:27 np0005548788.localdomain sudo[251794]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:27 np0005548788.localdomain sudo[251814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:53:27 np0005548788.localdomain sudo[251814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:27 np0005548788.localdomain python3.9[251868]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:53:28 np0005548788.localdomain sudo[251990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcbwyfcizutgxosibbkitvzykgisixpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014808.279857-395-89699491609753/AnsiballZ_file.py
Dec 06 09:53:28 np0005548788.localdomain sudo[251990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:28 np0005548788.localdomain python3.9[251992]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:28 np0005548788.localdomain sudo[251990]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:29 np0005548788.localdomain sudo[252100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xiubknwzawnehazarawnhuptuaxfjcoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014809.1211183-419-217152425305843/AnsiballZ_stat.py
Dec 06 09:53:29 np0005548788.localdomain sudo[252100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:29 np0005548788.localdomain python3.9[252102]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:29 np0005548788.localdomain sudo[252100]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:29 np0005548788.localdomain sudo[252157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnszsqftsxhumjhijibfyqanopmwkubp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014809.1211183-419-217152425305843/AnsiballZ_file.py
Dec 06 09:53:29 np0005548788.localdomain sudo[252157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:30 np0005548788.localdomain python3.9[252159]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:30 np0005548788.localdomain sudo[252157]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b0d334edc17c9aa8f0b180da1f8a718e4ac8b472875066b766a2e2e11ebb80c3-merged.mount: Deactivated successfully.
Dec 06 09:53:30 np0005548788.localdomain sudo[252267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjmlilzrarylhpuxpcqjnknagzhafryg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014810.3758302-419-195613166810003/AnsiballZ_stat.py
Dec 06 09:53:30 np0005548788.localdomain sudo[252267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:30 np0005548788.localdomain python3.9[252269]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:30 np0005548788.localdomain sudo[252267]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:30 np0005548788.localdomain sudo[251814]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:31 np0005548788.localdomain sudo[252289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:53:31 np0005548788.localdomain sudo[252289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:31 np0005548788.localdomain sudo[252289]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:31 np0005548788.localdomain sudo[252327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:53:31 np0005548788.localdomain sudo[252327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:31 np0005548788.localdomain sudo[252368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaclkqujfinfkogwablsmctduqrflcfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014810.3758302-419-195613166810003/AnsiballZ_file.py
Dec 06 09:53:31 np0005548788.localdomain sudo[252368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:31 np0005548788.localdomain python3.9[252371]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:31 np0005548788.localdomain sudo[252368]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:31 np0005548788.localdomain sudo[252492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlooadmvdpsrnsjvqwlpueomcgnqltfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014811.643894-488-84648178484385/AnsiballZ_file.py
Dec 06 09:53:31 np0005548788.localdomain sudo[252492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:53:32 np0005548788.localdomain python3.9[252494]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:32 np0005548788.localdomain sudo[252492]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:53:32 np0005548788.localdomain sudo[252327]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:32 np0005548788.localdomain sudo[252621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcaenvacyfoippitedbtrvhsejfdcrkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014812.3509715-512-60143071654472/AnsiballZ_stat.py
Dec 06 09:53:32 np0005548788.localdomain sudo[252621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:32 np0005548788.localdomain python3.9[252623]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:32 np0005548788.localdomain sudo[252621]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:33 np0005548788.localdomain sudo[252642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:53:33 np0005548788.localdomain sudo[252642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:33 np0005548788.localdomain sudo[252642]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:33 np0005548788.localdomain sudo[252696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jswcvijhcgxzbvinrlowekhutwadsmgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014812.3509715-512-60143071654472/AnsiballZ_file.py
Dec 06 09:53:33 np0005548788.localdomain sudo[252696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:33 np0005548788.localdomain python3.9[252698]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:33 np0005548788.localdomain sudo[252696]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:34 np0005548788.localdomain sudo[252806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkqiqjqjvajbqpmiwejajqmmfdlzucjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014813.6421492-548-162352069511321/AnsiballZ_stat.py
Dec 06 09:53:34 np0005548788.localdomain sudo[252806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:53:34 np0005548788.localdomain podman[252809]: 2025-12-06 09:53:34.2180758 +0000 UTC m=+0.174996947 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:53:34 np0005548788.localdomain podman[252809]: 2025-12-06 09:53:34.236039654 +0000 UTC m=+0.192960821 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Dec 06 09:53:34 np0005548788.localdomain python3.9[252808]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:34 np0005548788.localdomain sudo[252806]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:34 np0005548788.localdomain sudo[252882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvtvmjzsgwsabhmsexwnengbcgyfhlig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014813.6421492-548-162352069511321/AnsiballZ_file.py
Dec 06 09:53:34 np0005548788.localdomain sudo[252882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45993 DF PROTO=TCP SPT=53510 DPT=9102 SEQ=1935502135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F29FF00000000001030307) 
Dec 06 09:53:34 np0005548788.localdomain python3.9[252884]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:34 np0005548788.localdomain sudo[252882]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:35 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:53:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:53:36 np0005548788.localdomain podman[252956]: 2025-12-06 09:53:36.270163253 +0000 UTC m=+0.098771085 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:53:36 np0005548788.localdomain podman[252956]: 2025-12-06 09:53:36.284680834 +0000 UTC m=+0.113288686 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:53:36 np0005548788.localdomain podman[252956]: unhealthy
Dec 06 09:53:36 np0005548788.localdomain sudo[253015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzptnmcorgotmdjnzggwjudfymguhafo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014815.0378184-584-261197143971374/AnsiballZ_systemd.py
Dec 06 09:53:36 np0005548788.localdomain sudo[253015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:36 np0005548788.localdomain python3.9[253017]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:53:36 np0005548788.localdomain systemd-rc-local-generator[253044]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:53:36 np0005548788.localdomain systemd-sysv-generator[253047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:47:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 140641 "" "Go-http-client/1.1"
Dec 06 09:53:36 np0005548788.localdomain podman_exporter[240298]: ts=2025-12-06T09:53:36.950Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 06 09:53:36 np0005548788.localdomain podman_exporter[240298]: ts=2025-12-06T09:53:36.951Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 06 09:53:36 np0005548788.localdomain podman_exporter[240298]: ts=2025-12-06T09:53:36.952Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Dec 06 09:53:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:53:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-75558001a48f3327b2401ddbe77f60b90335541a0f5b86a4118ca1d021773542-merged.mount: Deactivated successfully.
Dec 06 09:53:37 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:53:37 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Failed with result 'exit-code'.
Dec 06 09:53:37 np0005548788.localdomain sudo[253015]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:37 np0005548788.localdomain sudo[253163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txhazfqgkyicwgiknjiqcnxpawoolobh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014817.344257-608-139183641042015/AnsiballZ_stat.py
Dec 06 09:53:37 np0005548788.localdomain sudo[253163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:37 np0005548788.localdomain python3.9[253165]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:37 np0005548788.localdomain sudo[253163]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:38 np0005548788.localdomain sudo[253220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvbsmfiexvzkbnyakrpywbwfpmfasoyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014817.344257-608-139183641042015/AnsiballZ_file.py
Dec 06 09:53:38 np0005548788.localdomain sudo[253220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:53:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:53:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:53:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:53:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:53:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:53:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:53:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:53:39 np0005548788.localdomain python3.9[253222]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:39 np0005548788.localdomain sudo[253220]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:53:39 np0005548788.localdomain podman[253242]: 2025-12-06 09:53:39.2768745 +0000 UTC m=+0.097352172 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Dec 06 09:53:39 np0005548788.localdomain podman[253242]: 2025-12-06 09:53:39.315083018 +0000 UTC m=+0.135560710 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter)
Dec 06 09:53:39 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:53:39 np0005548788.localdomain sudo[253354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrwvgsyoumvzdoqgxcpmmenxvulgvndd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014819.2971978-644-135966747464994/AnsiballZ_stat.py
Dec 06 09:53:39 np0005548788.localdomain sudo[253354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:39 np0005548788.localdomain python3.9[253356]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:39 np0005548788.localdomain sudo[253354]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:40 np0005548788.localdomain sudo[253411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmzycznalbrvhffrjmjbhuhkfnmpddza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014819.2971978-644-135966747464994/AnsiballZ_file.py
Dec 06 09:53:40 np0005548788.localdomain sudo[253411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:40 np0005548788.localdomain python3.9[253413]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:40 np0005548788.localdomain sudo[253411]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:40 np0005548788.localdomain sudo[253521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqcvjlminmhtmqvkxaqgdsqyufbsfqdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014820.5065122-680-239733633972337/AnsiballZ_systemd.py
Dec 06 09:53:40 np0005548788.localdomain sudo[253521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:41 np0005548788.localdomain python3.9[253523]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:53:41 np0005548788.localdomain systemd-rc-local-generator[253548]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:53:41 np0005548788.localdomain systemd-sysv-generator[253553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:53:41 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:53:41 np0005548788.localdomain sudo[253521]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:53:42 np0005548788.localdomain podman[253620]: 2025-12-06 09:53:42.271727087 +0000 UTC m=+0.093359022 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 09:53:42 np0005548788.localdomain podman[253620]: 2025-12-06 09:53:42.317744711 +0000 UTC m=+0.139376676 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 09:53:42 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:53:42 np0005548788.localdomain sudo[253689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msbemxlkrxmlhohvkuxtcrzzohtocckt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014822.092834-710-200998546299057/AnsiballZ_file.py
Dec 06 09:53:42 np0005548788.localdomain sudo[253689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:42 np0005548788.localdomain python3.9[253691]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:42 np0005548788.localdomain sudo[253689]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:43 np0005548788.localdomain sudo[253799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpzbsagyibnninbialjxswcspaedncca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014822.8737493-734-20896623585022/AnsiballZ_stat.py
Dec 06 09:53:43 np0005548788.localdomain sudo[253799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:43 np0005548788.localdomain python3.9[253801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:43 np0005548788.localdomain sudo[253799]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:43 np0005548788.localdomain sudo[253887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjxogpccnscwbzqjygggcnhhcsaglozm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014822.8737493-734-20896623585022/AnsiballZ_copy.py
Dec 06 09:53:43 np0005548788.localdomain sudo[253887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:43 np0005548788.localdomain python3.9[253889]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014822.8737493-734-20896623585022/.source.json _original_basename=.s5tl3bs7 follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:43 np0005548788.localdomain sudo[253887]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:44 np0005548788.localdomain sudo[253997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lggwesoigzgrwrrpiztlnxtkxicvfaqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014824.18006-779-208398783697372/AnsiballZ_file.py
Dec 06 09:53:44 np0005548788.localdomain sudo[253997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:53:44 np0005548788.localdomain podman[253999]: 2025-12-06 09:53:44.62756658 +0000 UTC m=+0.098045354 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:53:44 np0005548788.localdomain podman[253999]: 2025-12-06 09:53:44.638634665 +0000 UTC m=+0.109113499 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:53:44 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:53:44 np0005548788.localdomain python3.9[254000]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:44 np0005548788.localdomain sudo[253997]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:45 np0005548788.localdomain sudo[254128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdllpduiwrjitswlmyboxupejydimlxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014824.9605353-803-126627178995743/AnsiballZ_stat.py
Dec 06 09:53:45 np0005548788.localdomain sudo[254128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:45 np0005548788.localdomain sudo[254128]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:45 np0005548788.localdomain sudo[254216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sonogcbbeucdcrmtvbrckiljeaakkxfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014824.9605353-803-126627178995743/AnsiballZ_copy.py
Dec 06 09:53:45 np0005548788.localdomain sudo[254216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:46 np0005548788.localdomain sudo[254216]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:46 np0005548788.localdomain sudo[254326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxnxtbcsztxajrgzvuafhxtqtrmeqdop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014826.440541-854-233792424455281/AnsiballZ_container_config_data.py
Dec 06 09:53:46 np0005548788.localdomain sudo[254326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:47 np0005548788.localdomain python3.9[254328]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Dec 06 09:53:47 np0005548788.localdomain sudo[254326]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:53:47.414 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:53:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:53:47.416 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:53:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:53:47.417 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:53:47 np0005548788.localdomain sudo[254436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obnkjyoizabhesiubxupmvplraknokfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014827.3813243-881-195878675208275/AnsiballZ_container_config_hash.py
Dec 06 09:53:47 np0005548788.localdomain sudo[254436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:48 np0005548788.localdomain python3.9[254438]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:53:48 np0005548788.localdomain sudo[254436]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64005 DF PROTO=TCP SPT=51176 DPT=9102 SEQ=3178875552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F2D9450000000001030307) 
Dec 06 09:53:49 np0005548788.localdomain podman[240078]: time="2025-12-06T09:53:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:53:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:53:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142286 "" "Go-http-client/1.1"
Dec 06 09:53:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:53:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15861 "" "Go-http-client/1.1"
Dec 06 09:53:49 np0005548788.localdomain sudo[254547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aykaxukgcdaxmvmdncutkcgpixxcbcrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014828.631097-908-4549365981351/AnsiballZ_podman_container_info.py
Dec 06 09:53:49 np0005548788.localdomain sudo[254547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:53:50 np0005548788.localdomain python3.9[254549]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:53:50 np0005548788.localdomain podman[254550]: 2025-12-06 09:53:50.263263292 +0000 UTC m=+0.085497764 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:53:50 np0005548788.localdomain podman[254550]: 2025-12-06 09:53:50.274438559 +0000 UTC m=+0.096673001 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:53:50 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:53:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64006 DF PROTO=TCP SPT=51176 DPT=9102 SEQ=3178875552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F2DD300000000001030307) 
Dec 06 09:53:50 np0005548788.localdomain sudo[254547]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:51 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45994 DF PROTO=TCP SPT=53510 DPT=9102 SEQ=1935502135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F2DFF10000000001030307) 
Dec 06 09:53:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64007 DF PROTO=TCP SPT=51176 DPT=9102 SEQ=3178875552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F2E5300000000001030307) 
Dec 06 09:53:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10034 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=1956773802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F2E9F10000000001030307) 
Dec 06 09:53:54 np0005548788.localdomain sudo[254702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chzlnlnfoeyeconvmwamzrjoxhknocey ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014833.5949752-947-9344296826404/AnsiballZ_edpm_container_manage.py
Dec 06 09:53:54 np0005548788.localdomain sudo[254702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:54 np0005548788.localdomain python3[254704]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:53:54 np0005548788.localdomain podman[254740]: 
Dec 06 09:53:54 np0005548788.localdomain podman[254740]: 2025-12-06 09:53:54.742095168 +0000 UTC m=+0.092736033 container create 5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_sriov_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Dec 06 09:53:54 np0005548788.localdomain podman[254740]: 2025-12-06 09:53:54.696586538 +0000 UTC m=+0.047227443 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 06 09:53:54 np0005548788.localdomain python3[254704]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 06 09:53:54 np0005548788.localdomain sudo[254702]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:55 np0005548788.localdomain sudo[254886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhjfswkxuuiwjtjfoqadktwcakkcmfwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014835.1530037-971-88291908521064/AnsiballZ_stat.py
Dec 06 09:53:55 np0005548788.localdomain sudo[254886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:55 np0005548788.localdomain python3.9[254888]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:53:55 np0005548788.localdomain sudo[254886]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:53:56 np0005548788.localdomain sudo[254998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntzifrevmjymhkehhjwngfcdbugevdyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014835.9149323-998-132125962243623/AnsiballZ_file.py
Dec 06 09:53:56 np0005548788.localdomain sudo[254998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:56 np0005548788.localdomain podman[254999]: 2025-12-06 09:53:56.328090301 +0000 UTC m=+0.144655486 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:53:56 np0005548788.localdomain podman[254999]: 2025-12-06 09:53:56.375650503 +0000 UTC m=+0.192215688 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 09:53:56 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:53:56 np0005548788.localdomain python3.9[255011]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:56 np0005548788.localdomain sudo[254998]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64008 DF PROTO=TCP SPT=51176 DPT=9102 SEQ=3178875552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F2F4F00000000001030307) 
Dec 06 09:53:56 np0005548788.localdomain sudo[255076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvweulfwcigtvryjxmusuhpmiijfjtif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014835.9149323-998-132125962243623/AnsiballZ_stat.py
Dec 06 09:53:56 np0005548788.localdomain sudo[255076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:56 np0005548788.localdomain python3.9[255078]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:53:56 np0005548788.localdomain sudo[255076]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:57 np0005548788.localdomain sudo[255185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfsykxpeskykkzkhxkqjrmnltingknzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014836.920858-998-254314187564800/AnsiballZ_copy.py
Dec 06 09:53:57 np0005548788.localdomain sudo[255185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:57 np0005548788.localdomain python3.9[255187]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014836.920858-998-254314187564800/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:57 np0005548788.localdomain sudo[255185]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:57 np0005548788.localdomain sudo[255240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdmsnjhhmveswlxxyazhnqkocoosygop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014836.920858-998-254314187564800/AnsiballZ_systemd.py
Dec 06 09:53:57 np0005548788.localdomain sudo[255240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:58.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:58.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:53:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:58.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:53:58 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:53:58.203 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:53:58 np0005548788.localdomain python3.9[255242]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:53:58 np0005548788.localdomain systemd-rc-local-generator[255267]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:53:58 np0005548788.localdomain systemd-sysv-generator[255271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548788.localdomain sudo[255240]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:58 np0005548788.localdomain sudo[255331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfunvtlmwixzlxsftkravqysmuwkvbxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014836.920858-998-254314187564800/AnsiballZ_systemd.py
Dec 06 09:53:58 np0005548788.localdomain sudo[255331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:59 np0005548788.localdomain python3.9[255333]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:53:59 np0005548788.localdomain systemd-sysv-generator[255362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:53:59 np0005548788.localdomain systemd-rc-local-generator[255358]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: Starting neutron_sriov_agent container...
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:53:59 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3013571bc4b61308728ca0b56eefbbbf7c1bce4596c17ce45084019e3b1def7a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:53:59 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3013571bc4b61308728ca0b56eefbbbf7c1bce4596c17ce45084019e3b1def7a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:53:59 np0005548788.localdomain podman[255374]: 2025-12-06 09:53:59.83425771 +0000 UTC m=+0.131066745 container init 5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:53:59 np0005548788.localdomain podman[255374]: 2025-12-06 09:53:59.846296815 +0000 UTC m=+0.143105880 container start 5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:53:59 np0005548788.localdomain podman[255374]: neutron_sriov_agent
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + sudo -E kolla_set_configs
Dec 06 09:53:59 np0005548788.localdomain systemd[1]: Started neutron_sriov_agent container.
Dec 06 09:53:59 np0005548788.localdomain sudo[255331]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Validating config file
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Copying service configuration files
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Writing out command to execute
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: ++ cat /run_command
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + CMD=/usr/bin/neutron-sriov-nic-agent
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + ARGS=
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + sudo kolla_copy_cacerts
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + [[ ! -n '' ]]
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + . kolla_extend_start
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + umask 0022
Dec 06 09:53:59 np0005548788.localdomain neutron_sriov_agent[255389]: + exec /usr/bin/neutron-sriov-nic-agent
Dec 06 09:54:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:00.197 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:00 np0005548788.localdomain sudo[255510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdrtbwtbfbalqgyrdbncsrwapmnurpxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014840.1810658-1082-199103617111705/AnsiballZ_systemd.py
Dec 06 09:54:00 np0005548788.localdomain sudo[255510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:00 np0005548788.localdomain python3.9[255513]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:54:00 np0005548788.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Dec 06 09:54:00 np0005548788.localdomain systemd[1]: libpod-5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9.scope: Deactivated successfully.
Dec 06 09:54:00 np0005548788.localdomain systemd[1]: libpod-5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9.scope: Consumed 1.113s CPU time.
Dec 06 09:54:00 np0005548788.localdomain podman[255517]: 2025-12-06 09:54:00.975914042 +0000 UTC m=+0.082088040 container died 5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:54:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9-userdata-shm.mount: Deactivated successfully.
Dec 06 09:54:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3013571bc4b61308728ca0b56eefbbbf7c1bce4596c17ce45084019e3b1def7a-merged.mount: Deactivated successfully.
Dec 06 09:54:01 np0005548788.localdomain podman[255517]: 2025-12-06 09:54:01.015664438 +0000 UTC m=+0.121838426 container cleanup 5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=neutron_sriov_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 09:54:01 np0005548788.localdomain podman[255517]: neutron_sriov_agent
Dec 06 09:54:01 np0005548788.localdomain podman[255544]: 2025-12-06 09:54:01.109402819 +0000 UTC m=+0.059492074 container cleanup 5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=neutron_sriov_agent, managed_by=edpm_ansible, config_id=neutron_sriov_agent, io.buildah.version=1.41.3)
Dec 06 09:54:01 np0005548788.localdomain podman[255544]: neutron_sriov_agent
Dec 06 09:54:01 np0005548788.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Dec 06 09:54:01 np0005548788.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Dec 06 09:54:01 np0005548788.localdomain systemd[1]: Starting neutron_sriov_agent container...
Dec 06 09:54:01 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:54:01 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3013571bc4b61308728ca0b56eefbbbf7c1bce4596c17ce45084019e3b1def7a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:54:01 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3013571bc4b61308728ca0b56eefbbbf7c1bce4596c17ce45084019e3b1def7a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:54:01 np0005548788.localdomain podman[255555]: 2025-12-06 09:54:01.256388086 +0000 UTC m=+0.115630928 container init 5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.build-date=20251125, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + sudo -E kolla_set_configs
Dec 06 09:54:01 np0005548788.localdomain podman[255555]: 2025-12-06 09:54:01.289331854 +0000 UTC m=+0.148574696 container start 5f457c026f62a30b51fa9869aea227b9ed5fdf15452e12b36b76bb31a0864ab9 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6602e32685aab85974a8e6746769774e5739b89b74b92188e5d8c93fce95ad66'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 09:54:01 np0005548788.localdomain podman[255555]: neutron_sriov_agent
Dec 06 09:54:01 np0005548788.localdomain systemd[1]: Started neutron_sriov_agent container.
Dec 06 09:54:01 np0005548788.localdomain sudo[255510]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Validating config file
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Copying service configuration files
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Writing out command to execute
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: ++ cat /run_command
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + CMD=/usr/bin/neutron-sriov-nic-agent
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + ARGS=
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + sudo kolla_copy_cacerts
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + [[ ! -n '' ]]
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + . kolla_extend_start
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + umask 0022
Dec 06 09:54:01 np0005548788.localdomain neutron_sriov_agent[255571]: + exec /usr/bin/neutron-sriov-nic-agent
Dec 06 09:54:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:02.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:02.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:02.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:02 np0005548788.localdomain sshd[250380]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:54:02 np0005548788.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Dec 06 09:54:02 np0005548788.localdomain systemd[1]: session-57.scope: Consumed 25.003s CPU time.
Dec 06 09:54:02 np0005548788.localdomain systemd-logind[765]: Session 57 logged out. Waiting for processes to exit.
Dec 06 09:54:02 np0005548788.localdomain systemd-logind[765]: Removed session 57.
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.163 2 INFO neutron.common.config [-] Logging enabled!
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.164 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.164 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.164 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.164 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.165 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.165 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005548788.localdomain'}
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.165 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-e8d63cbc-1831-4613-8cfe-a3044d4e6038 - - - - - -] RPC agent_id: nic-switch-agent.np0005548788.localdomain
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.170 2 INFO neutron.agent.agent_extensions_manager [None req-e8d63cbc-1831-4613-8cfe-a3044d4e6038 - - - - - -] Loaded agent extensions: ['qos']
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.171 2 INFO neutron.agent.agent_extensions_manager [None req-e8d63cbc-1831-4613-8cfe-a3044d4e6038 - - - - - -] Initializing agent extension 'qos'
Dec 06 09:54:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:03.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:03.182 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.433 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-e8d63cbc-1831-4613-8cfe-a3044d4e6038 - - - - - -] Agent initialized successfully, now running... 
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.434 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-e8d63cbc-1831-4613-8cfe-a3044d4e6038 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Dec 06 09:54:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 09:54:03.434 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-e8d63cbc-1831-4613-8cfe-a3044d4e6038 - - - - - -] Agent out of sync with plugin!
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.206 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.207 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.208 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.208 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.209 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.759 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.550s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.970 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.971 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12962MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.972 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:54:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:04.972 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:54:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64009 DF PROTO=TCP SPT=51176 DPT=9102 SEQ=3178875552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F315F00000000001030307) 
Dec 06 09:54:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:05.089 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:54:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:05.089 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:54:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:05.110 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:54:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:54:05 np0005548788.localdomain podman[255627]: 2025-12-06 09:54:05.2957493 +0000 UTC m=+0.096580360 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 06 09:54:05 np0005548788.localdomain podman[255627]: 2025-12-06 09:54:05.334303268 +0000 UTC m=+0.135134348 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 09:54:05 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:54:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:05.632 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:54:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:05.639 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:54:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:05.653 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:54:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:05.656 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:54:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:05.656 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:54:06 np0005548788.localdomain sshd[255667]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:54:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:06.658 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:54:06.659 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:06 np0005548788.localdomain sshd[255667]: Received disconnect from 148.227.3.232 port 47000:11: Bye Bye [preauth]
Dec 06 09:54:06 np0005548788.localdomain sshd[255667]: Disconnected from authenticating user root 148.227.3.232 port 47000 [preauth]
Dec 06 09:54:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:54:07 np0005548788.localdomain systemd[1]: tmp-crun.vW3vfm.mount: Deactivated successfully.
Dec 06 09:54:07 np0005548788.localdomain podman[255669]: 2025-12-06 09:54:07.255134983 +0000 UTC m=+0.078959085 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:54:07 np0005548788.localdomain podman[255669]: 2025-12-06 09:54:07.263536898 +0000 UTC m=+0.087361050 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:54:07 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:54:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:54:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:54:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:54:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:54:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:54:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:54:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:54:09 np0005548788.localdomain rsyslogd[760]: imjournal: 10407 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 06 09:54:09 np0005548788.localdomain sshd[255693]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:54:09 np0005548788.localdomain sshd[255693]: Accepted publickey for zuul from 192.168.122.30 port 56948 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:54:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:54:09 np0005548788.localdomain systemd-logind[765]: New session 58 of user zuul.
Dec 06 09:54:09 np0005548788.localdomain systemd[1]: Started Session 58 of User zuul.
Dec 06 09:54:09 np0005548788.localdomain sshd[255693]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:54:09 np0005548788.localdomain podman[255696]: 2025-12-06 09:54:09.512303575 +0000 UTC m=+0.091774323 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, vcs-type=git, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 09:54:09 np0005548788.localdomain podman[255696]: 2025-12-06 09:54:09.525711812 +0000 UTC m=+0.105182550 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 09:54:09 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:54:10 np0005548788.localdomain python3.9[255824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:54:11 np0005548788.localdomain sudo[255936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnarabdcdozhyixcaulfoxwaejgmwrji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014851.2561977-65-233436417440300/AnsiballZ_setup.py
Dec 06 09:54:11 np0005548788.localdomain sudo[255936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:11 np0005548788.localdomain python3.9[255938]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:54:12 np0005548788.localdomain sudo[255936]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:12 np0005548788.localdomain sudo[255999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haxxalujqmvfpigirzkwrbcbgnhqbvlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014851.2561977-65-233436417440300/AnsiballZ_dnf.py
Dec 06 09:54:12 np0005548788.localdomain sudo[255999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:54:12 np0005548788.localdomain systemd[1]: tmp-crun.lYQCxC.mount: Deactivated successfully.
Dec 06 09:54:12 np0005548788.localdomain podman[256002]: 2025-12-06 09:54:12.835672422 +0000 UTC m=+0.112205103 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Dec 06 09:54:12 np0005548788.localdomain podman[256002]: 2025-12-06 09:54:12.852609195 +0000 UTC m=+0.129141896 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:54:12 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:54:12 np0005548788.localdomain python3.9[256001]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:54:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:54:15 np0005548788.localdomain podman[256023]: 2025-12-06 09:54:15.305418329 +0000 UTC m=+0.130603311 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:54:15 np0005548788.localdomain podman[256023]: 2025-12-06 09:54:15.318767543 +0000 UTC m=+0.143952515 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:54:15 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:54:16 np0005548788.localdomain sudo[255999]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:16 np0005548788.localdomain sudo[256154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glpkhunwvgyajptbcboyjajfvfxcfiar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014856.4317937-101-189059789272442/AnsiballZ_systemd.py
Dec 06 09:54:16 np0005548788.localdomain sudo[256154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:17 np0005548788.localdomain python3.9[256156]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:54:18 np0005548788.localdomain sudo[256154]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:19 np0005548788.localdomain sudo[256267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnypprhyqfiiwpdjlmcspmrsvkdbgwat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014858.6578023-128-27736674451088/AnsiballZ_file.py
Dec 06 09:54:19 np0005548788.localdomain sudo[256267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:19 np0005548788.localdomain python3.9[256269]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:19 np0005548788.localdomain sudo[256267]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25510 DF PROTO=TCP SPT=41790 DPT=9102 SEQ=2563339886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F34E750000000001030307) 
Dec 06 09:54:19 np0005548788.localdomain podman[240078]: time="2025-12-06T09:54:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:54:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:54:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144242 "" "Go-http-client/1.1"
Dec 06 09:54:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:54:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16306 "" "Go-http-client/1.1"
Dec 06 09:54:19 np0005548788.localdomain sudo[256380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjewkcscqswvstdykczgjvzkuijepzik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014859.4728813-128-28524068004382/AnsiballZ_file.py
Dec 06 09:54:19 np0005548788.localdomain sudo[256380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:19 np0005548788.localdomain python3.9[256382]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:19 np0005548788.localdomain sudo[256380]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:20 np0005548788.localdomain sudo[256490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loevpufuyhixfrviifuuldzhzsglckuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014860.0896358-128-229504552066439/AnsiballZ_file.py
Dec 06 09:54:20 np0005548788.localdomain sudo[256490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:54:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25511 DF PROTO=TCP SPT=41790 DPT=9102 SEQ=2563339886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F352700000000001030307) 
Dec 06 09:54:20 np0005548788.localdomain podman[256493]: 2025-12-06 09:54:20.479412671 +0000 UTC m=+0.072953993 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:54:20 np0005548788.localdomain podman[256493]: 2025-12-06 09:54:20.487488105 +0000 UTC m=+0.081029427 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:54:20 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:54:20 np0005548788.localdomain python3.9[256492]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:20 np0005548788.localdomain sudo[256490]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:21 np0005548788.localdomain sudo[256619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqqyscjbzuifvgxdowlvfuwlsxhnojfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014860.773465-128-76360845645528/AnsiballZ_file.py
Dec 06 09:54:21 np0005548788.localdomain sudo[256619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:21 np0005548788.localdomain python3.9[256621]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:21 np0005548788.localdomain sudo[256619]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:21 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64010 DF PROTO=TCP SPT=51176 DPT=9102 SEQ=3178875552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F355F00000000001030307) 
Dec 06 09:54:21 np0005548788.localdomain sudo[256729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvtkbhrwixuetihffflvdmyahctixdjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014861.494235-128-138860881119541/AnsiballZ_file.py
Dec 06 09:54:21 np0005548788.localdomain sudo[256729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:22 np0005548788.localdomain python3.9[256731]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:22 np0005548788.localdomain sudo[256729]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:22 np0005548788.localdomain sudo[256839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azupnfxrezrktxilfegdlgmpcgtsjhdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014862.1919305-128-252292529251894/AnsiballZ_file.py
Dec 06 09:54:22 np0005548788.localdomain sudo[256839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25512 DF PROTO=TCP SPT=41790 DPT=9102 SEQ=2563339886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F35A700000000001030307) 
Dec 06 09:54:22 np0005548788.localdomain python3.9[256841]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:22 np0005548788.localdomain sudo[256839]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:23 np0005548788.localdomain sudo[256949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuzhscultujrgovarirjruflwtvkvznd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014862.931024-128-227539088475008/AnsiballZ_file.py
Dec 06 09:54:23 np0005548788.localdomain sudo[256949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45995 DF PROTO=TCP SPT=53510 DPT=9102 SEQ=1935502135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F35DF00000000001030307) 
Dec 06 09:54:23 np0005548788.localdomain python3.9[256951]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:23 np0005548788.localdomain sudo[256949]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:24 np0005548788.localdomain sudo[257059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxubqxbfnvhylfjfwhankpslpolnixlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014863.6242137-278-191283950758483/AnsiballZ_stat.py
Dec 06 09:54:24 np0005548788.localdomain sudo[257059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:24 np0005548788.localdomain python3.9[257061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:24 np0005548788.localdomain sudo[257059]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:24 np0005548788.localdomain sudo[257147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkcxuyymxdjbnijubwqdzsplecgrnjrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014863.6242137-278-191283950758483/AnsiballZ_copy.py
Dec 06 09:54:24 np0005548788.localdomain sudo[257147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:25 np0005548788.localdomain python3.9[257149]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014863.6242137-278-191283950758483/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:25 np0005548788.localdomain sudo[257147]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:25 np0005548788.localdomain python3.9[257257]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25513 DF PROTO=TCP SPT=41790 DPT=9102 SEQ=2563339886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F36A300000000001030307) 
Dec 06 09:54:26 np0005548788.localdomain python3.9[257343]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014865.3339956-323-156535765219266/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:54:27 np0005548788.localdomain systemd[1]: tmp-crun.WL0KRy.mount: Deactivated successfully.
Dec 06 09:54:27 np0005548788.localdomain podman[257399]: 2025-12-06 09:54:27.263827149 +0000 UTC m=+0.084040819 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 09:54:27 np0005548788.localdomain podman[257399]: 2025-12-06 09:54:27.306107422 +0000 UTC m=+0.126321032 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller)
Dec 06 09:54:27 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:54:27 np0005548788.localdomain python3.9[257475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:28 np0005548788.localdomain python3.9[257561]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014867.1195729-323-210618144504284/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:29 np0005548788.localdomain python3.9[257669]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:29 np0005548788.localdomain python3.9[257755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014868.818932-323-158851904496681/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=efc6a4b49acb52b810ce79f6faa60a625f482425 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:31 np0005548788.localdomain python3.9[257863]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:31 np0005548788.localdomain python3.9[257949]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014870.628778-497-157128698329531/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=b14e40a972b9e05d0e95a7e875b3201eda2c4b6d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:32 np0005548788.localdomain python3.9[258057]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:32 np0005548788.localdomain python3.9[258143]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014871.889383-542-7169107709721/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:33 np0005548788.localdomain sudo[258177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:54:33 np0005548788.localdomain sudo[258177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:54:33 np0005548788.localdomain sudo[258177]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:33 np0005548788.localdomain sudo[258217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:54:33 np0005548788.localdomain sudo[258217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:54:33 np0005548788.localdomain python3.9[258287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:34 np0005548788.localdomain sudo[258217]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:34 np0005548788.localdomain python3.9[258405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014873.2453797-542-281045643695448/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:34 np0005548788.localdomain sudo[258468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:54:34 np0005548788.localdomain sudo[258468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:54:34 np0005548788.localdomain sudo[258468]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25514 DF PROTO=TCP SPT=41790 DPT=9102 SEQ=2563339886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F389F00000000001030307) 
Dec 06 09:54:34 np0005548788.localdomain python3.9[258531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:35 np0005548788.localdomain python3.9[258586]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:54:35 np0005548788.localdomain systemd[1]: tmp-crun.VAMu2s.mount: Deactivated successfully.
Dec 06 09:54:35 np0005548788.localdomain podman[258587]: 2025-12-06 09:54:35.562473335 +0000 UTC m=+0.109422688 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 09:54:35 np0005548788.localdomain podman[258587]: 2025-12-06 09:54:35.574164029 +0000 UTC m=+0.121113352 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, config_id=edpm)
Dec 06 09:54:35 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:54:36 np0005548788.localdomain python3.9[258712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:36 np0005548788.localdomain python3.9[258798]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014875.6236532-629-182284141891173/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:37 np0005548788.localdomain python3.9[258906]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:54:38 np0005548788.localdomain sudo[259016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otavmritaxxkzwneksxeufoztuqrxoyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014877.6686397-734-207448979038322/AnsiballZ_file.py
Dec 06 09:54:38 np0005548788.localdomain sudo[259016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:54:38 np0005548788.localdomain systemd[1]: tmp-crun.W3M9tm.mount: Deactivated successfully.
Dec 06 09:54:38 np0005548788.localdomain podman[259019]: 2025-12-06 09:54:38.14695613 +0000 UTC m=+0.110256173 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:54:38 np0005548788.localdomain podman[259019]: 2025-12-06 09:54:38.158759889 +0000 UTC m=+0.122059972 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:54:38 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:54:38 np0005548788.localdomain python3.9[259018]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:38 np0005548788.localdomain sudo[259016]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:38 np0005548788.localdomain sudo[259148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebzaxqowkqdtrbozknlteghjlkoilolh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014878.519597-758-157489529442228/AnsiballZ_stat.py
Dec 06 09:54:38 np0005548788.localdomain sudo[259148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:54:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:54:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:54:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:54:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:54:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:54:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:54:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:54:39 np0005548788.localdomain python3.9[259150]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:39 np0005548788.localdomain sudo[259148]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:39 np0005548788.localdomain sudo[259205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwlxteipddsytquatgmrpbwfqywfpqif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014878.519597-758-157489529442228/AnsiballZ_file.py
Dec 06 09:54:39 np0005548788.localdomain sudo[259205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:39 np0005548788.localdomain python3.9[259207]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:39 np0005548788.localdomain sudo[259205]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:40 np0005548788.localdomain sudo[259315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgjwnitvzeebwpysfddtilrqeqvrztlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014879.7159734-758-257088056165213/AnsiballZ_stat.py
Dec 06 09:54:40 np0005548788.localdomain sudo[259315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:54:40 np0005548788.localdomain podman[259318]: 2025-12-06 09:54:40.141754677 +0000 UTC m=+0.093048212 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:54:40 np0005548788.localdomain podman[259318]: 2025-12-06 09:54:40.181708169 +0000 UTC m=+0.133001684 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, vcs-type=git)
Dec 06 09:54:40 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:54:40 np0005548788.localdomain python3.9[259317]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:40 np0005548788.localdomain sudo[259315]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:40 np0005548788.localdomain sudo[259391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlouxbpgmflcduqkjxqqfkwqysdlhsde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014879.7159734-758-257088056165213/AnsiballZ_file.py
Dec 06 09:54:40 np0005548788.localdomain sudo[259391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:40 np0005548788.localdomain python3.9[259393]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:40 np0005548788.localdomain sudo[259391]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:41 np0005548788.localdomain sudo[259501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuoktoxuhibodmjpljgvlorgfwzqdmns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014880.9378934-827-192291966810356/AnsiballZ_file.py
Dec 06 09:54:41 np0005548788.localdomain sudo[259501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:41 np0005548788.localdomain python3.9[259503]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:41 np0005548788.localdomain sudo[259501]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:42 np0005548788.localdomain sudo[259611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uybgibupbkgukliwxamrphzxgmzobmwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014881.7246766-851-62130219374849/AnsiballZ_stat.py
Dec 06 09:54:42 np0005548788.localdomain sudo[259611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:42 np0005548788.localdomain python3.9[259613]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:42 np0005548788.localdomain sudo[259611]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:42 np0005548788.localdomain sudo[259668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rozipqdieifmfflccxiholopibyokafv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014881.7246766-851-62130219374849/AnsiballZ_file.py
Dec 06 09:54:42 np0005548788.localdomain sudo[259668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:42 np0005548788.localdomain python3.9[259670]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:42 np0005548788.localdomain sudo[259668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:54:43 np0005548788.localdomain podman[259742]: 2025-12-06 09:54:43.264304926 +0000 UTC m=+0.089477104 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 09:54:43 np0005548788.localdomain sudo[259795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-domawhcrokzkbzlxmvmrzzentwfapgnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014882.9987657-887-26941539895374/AnsiballZ_stat.py
Dec 06 09:54:43 np0005548788.localdomain sudo[259795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:43 np0005548788.localdomain podman[259742]: 2025-12-06 09:54:43.308618229 +0000 UTC m=+0.133790377 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 09:54:43 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:54:43 np0005548788.localdomain python3.9[259797]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:43 np0005548788.localdomain sudo[259795]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:43 np0005548788.localdomain sudo[259852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbhlzguunlnhaqfqvamfcypajrbqtjuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014882.9987657-887-26941539895374/AnsiballZ_file.py
Dec 06 09:54:43 np0005548788.localdomain sudo[259852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:44 np0005548788.localdomain python3.9[259854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:44 np0005548788.localdomain sudo[259852]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:44 np0005548788.localdomain sudo[259962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrhtpymxwomvjueatxgfanrborccejcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014884.2325296-923-189527003887493/AnsiballZ_systemd.py
Dec 06 09:54:44 np0005548788.localdomain sudo[259962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:44 np0005548788.localdomain python3.9[259964]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:54:44 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:54:44 np0005548788.localdomain systemd-rc-local-generator[259987]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:44 np0005548788.localdomain systemd-sysv-generator[259991]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548788.localdomain sudo[259962]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:45 np0005548788.localdomain sudo[260110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrgctxbcsqfvtyctmtfsodeqlqbcvzvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014885.5174212-947-165375247470045/AnsiballZ_stat.py
Dec 06 09:54:45 np0005548788.localdomain sudo[260110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:54:45 np0005548788.localdomain podman[260113]: 2025-12-06 09:54:45.966162369 +0000 UTC m=+0.093115204 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:54:45 np0005548788.localdomain podman[260113]: 2025-12-06 09:54:45.981580707 +0000 UTC m=+0.108533542 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:54:45 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:54:46 np0005548788.localdomain python3.9[260112]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:46 np0005548788.localdomain sudo[260110]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:46 np0005548788.localdomain sudo[260191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiflwzmtmxoonryjzvmhogitztfjirne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014885.5174212-947-165375247470045/AnsiballZ_file.py
Dec 06 09:54:46 np0005548788.localdomain sudo[260191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:46 np0005548788.localdomain python3.9[260193]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:46 np0005548788.localdomain sudo[260191]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:47 np0005548788.localdomain sudo[260301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsprzdxpitvsmbscglvhgauzchharhnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014886.8975112-983-166559220318380/AnsiballZ_stat.py
Dec 06 09:54:47 np0005548788.localdomain sudo[260301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:47 np0005548788.localdomain python3.9[260303]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:54:47.414 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:54:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:54:47.415 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:54:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:54:47.415 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:54:47 np0005548788.localdomain sudo[260301]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:47 np0005548788.localdomain sudo[260358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njrcqgdschqabbxtlpktlvvzxzlhbjii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014886.8975112-983-166559220318380/AnsiballZ_file.py
Dec 06 09:54:47 np0005548788.localdomain sudo[260358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:47 np0005548788.localdomain python3.9[260360]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:47 np0005548788.localdomain sudo[260358]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:48 np0005548788.localdomain sudo[260468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxkuvcoephoysoemzsslivbexxshzxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014888.1323643-1019-252142372009480/AnsiballZ_systemd.py
Dec 06 09:54:48 np0005548788.localdomain sudo[260468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:48 np0005548788.localdomain python3.9[260470]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:54:48 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:54:48 np0005548788.localdomain systemd-rc-local-generator[260494]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:48 np0005548788.localdomain systemd-sysv-generator[260501]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:48 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:48 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:48 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:48 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:48 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:54:49 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:54:49 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:54:49 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:54:49 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:54:49 np0005548788.localdomain sudo[260468]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54507 DF PROTO=TCP SPT=44094 DPT=9102 SEQ=582304889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F3C3A50000000001030307) 
Dec 06 09:54:49 np0005548788.localdomain podman[240078]: time="2025-12-06T09:54:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:54:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:54:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144242 "" "Go-http-client/1.1"
Dec 06 09:54:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:54:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16312 "" "Go-http-client/1.1"
Dec 06 09:54:50 np0005548788.localdomain sudo[260620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxvqladxdwycyaocgmqhrqvtoauyryzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014889.8099248-1049-200337854958638/AnsiballZ_file.py
Dec 06 09:54:50 np0005548788.localdomain sudo[260620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:50 np0005548788.localdomain sshd[260623]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:54:50 np0005548788.localdomain python3.9[260622]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:50 np0005548788.localdomain sudo[260620]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54508 DF PROTO=TCP SPT=44094 DPT=9102 SEQ=582304889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F3C7B10000000001030307) 
Dec 06 09:54:50 np0005548788.localdomain sudo[260732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahqldzauzpszthopimzjebdxtbwgzbda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014890.5345123-1073-55351592694241/AnsiballZ_stat.py
Dec 06 09:54:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:54:50 np0005548788.localdomain sudo[260732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:50 np0005548788.localdomain systemd[1]: tmp-crun.EjkvxK.mount: Deactivated successfully.
Dec 06 09:54:50 np0005548788.localdomain podman[260734]: 2025-12-06 09:54:50.969589021 +0000 UTC m=+0.104942823 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:54:51 np0005548788.localdomain podman[260734]: 2025-12-06 09:54:51.003273762 +0000 UTC m=+0.138627504 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 09:54:51 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:54:51 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25515 DF PROTO=TCP SPT=41790 DPT=9102 SEQ=2563339886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F3C9F00000000001030307) 
Dec 06 09:54:51 np0005548788.localdomain python3.9[260735]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:51 np0005548788.localdomain sudo[260732]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:51 np0005548788.localdomain sudo[260838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usbbxjosktrirufggngifidzlgjijcdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014890.5345123-1073-55351592694241/AnsiballZ_copy.py
Dec 06 09:54:51 np0005548788.localdomain sudo[260838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:51 np0005548788.localdomain python3.9[260840]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014890.5345123-1073-55351592694241/.source.json _original_basename=.nokqqgxj follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:51 np0005548788.localdomain sudo[260838]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:52 np0005548788.localdomain sudo[260948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjgqhxkinapszipozqoplwfiikqqqwyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014891.8454523-1118-260290054066051/AnsiballZ_file.py
Dec 06 09:54:52 np0005548788.localdomain sudo[260948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:52 np0005548788.localdomain python3.9[260950]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:52 np0005548788.localdomain sudo[260948]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54509 DF PROTO=TCP SPT=44094 DPT=9102 SEQ=582304889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F3CFB00000000001030307) 
Dec 06 09:54:52 np0005548788.localdomain sudo[261058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmrgqqyhjpzejeztlswndqfhgzfldstv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014892.5686073-1142-230207338728452/AnsiballZ_stat.py
Dec 06 09:54:52 np0005548788.localdomain sudo[261058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:53 np0005548788.localdomain sudo[261058]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:53 np0005548788.localdomain sudo[261146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vccbrwdeocnmmsjxjfaqixavzopxkbrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014892.5686073-1142-230207338728452/AnsiballZ_copy.py
Dec 06 09:54:53 np0005548788.localdomain sudo[261146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64011 DF PROTO=TCP SPT=51176 DPT=9102 SEQ=3178875552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F3D3F00000000001030307) 
Dec 06 09:54:53 np0005548788.localdomain sudo[261146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:54 np0005548788.localdomain sudo[261256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsqrluznpihweilhojrhennyvokljvxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014894.2473853-1193-256367722258498/AnsiballZ_container_config_data.py
Dec 06 09:54:54 np0005548788.localdomain sudo[261256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:54 np0005548788.localdomain python3.9[261258]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Dec 06 09:54:54 np0005548788.localdomain sudo[261256]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:55 np0005548788.localdomain sudo[261366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyxrspazwcdfcnwvhzlxlkpgjungdjht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014895.1543753-1220-235624939844817/AnsiballZ_container_config_hash.py
Dec 06 09:54:55 np0005548788.localdomain sudo[261366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:55 np0005548788.localdomain python3.9[261368]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:54:55 np0005548788.localdomain sudo[261366]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:56 np0005548788.localdomain sshd[260623]: Connection closed by 45.78.219.195 port 49906 [preauth]
Dec 06 09:54:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54510 DF PROTO=TCP SPT=44094 DPT=9102 SEQ=582304889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F3DF710000000001030307) 
Dec 06 09:54:56 np0005548788.localdomain sudo[261476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhnhgayqqgxvyorwucyovwsxdlwxfouq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014896.2519317-1247-252331285151995/AnsiballZ_podman_container_info.py
Dec 06 09:54:56 np0005548788.localdomain sudo[261476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:56 np0005548788.localdomain python3.9[261478]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:54:57 np0005548788.localdomain sudo[261476]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:54:58 np0005548788.localdomain systemd[1]: tmp-crun.ampm1z.mount: Deactivated successfully.
Dec 06 09:54:58 np0005548788.localdomain podman[261522]: 2025-12-06 09:54:58.292587367 +0000 UTC m=+0.112959006 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:54:58 np0005548788.localdomain podman[261522]: 2025-12-06 09:54:58.341907642 +0000 UTC m=+0.162279341 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:54:58 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:55:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:00.183 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:00.185 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:55:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:00.186 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:55:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:00.385 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:55:01 np0005548788.localdomain sudo[261637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iirbhkxjwmwmdcbymjrkhvklqchzwwlo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014900.7120655-1286-77833501011473/AnsiballZ_edpm_container_manage.py
Dec 06 09:55:01 np0005548788.localdomain sudo[261637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:01 np0005548788.localdomain python3[261639]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:55:01 np0005548788.localdomain podman[261675]: 
Dec 06 09:55:01 np0005548788.localdomain podman[261675]: 2025-12-06 09:55:01.831815277 +0000 UTC m=+0.104461948 container create a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_id=neutron_dhcp)
Dec 06 09:55:01 np0005548788.localdomain podman[261675]: 2025-12-06 09:55:01.781944074 +0000 UTC m=+0.054590795 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 09:55:01 np0005548788.localdomain python3[261639]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume 
/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 09:55:02 np0005548788.localdomain sudo[261637]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:02.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:02 np0005548788.localdomain sudo[261819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrfyfvfwtxghrfihpmsdnubbkgdnptcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014902.2667625-1310-76221389869111/AnsiballZ_stat.py
Dec 06 09:55:02 np0005548788.localdomain sudo[261819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:02 np0005548788.localdomain python3.9[261821]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:02 np0005548788.localdomain sudo[261819]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:03.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:03.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:03.182 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:55:03 np0005548788.localdomain sudo[261931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrmcvzhqntfmfknimqckzkindpgunqhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014903.1113958-1337-236004675003221/AnsiballZ_file.py
Dec 06 09:55:03 np0005548788.localdomain sudo[261931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:03 np0005548788.localdomain python3.9[261933]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:03 np0005548788.localdomain sudo[261931]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:03 np0005548788.localdomain sudo[261986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaellkxtqetjkmbvjvccjqbfwkyzyrem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014903.1113958-1337-236004675003221/AnsiballZ_stat.py
Dec 06 09:55:03 np0005548788.localdomain sudo[261986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:04 np0005548788.localdomain python3.9[261988]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.176 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:04 np0005548788.localdomain sudo[261986]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.212 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.212 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.213 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.213 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.214 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.679 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:55:04 np0005548788.localdomain sudo[262117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izrhvlkfnkfqnyckcqkjriflrwasaepp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014904.2565167-1337-85368388600096/AnsiballZ_copy.py
Dec 06 09:55:04 np0005548788.localdomain sudo[262117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54511 DF PROTO=TCP SPT=44094 DPT=9102 SEQ=582304889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F3FFF00000000001030307) 
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.923 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.925 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12979MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.925 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.926 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.982 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:55:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:04.982 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:55:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:05.005 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:55:05 np0005548788.localdomain python3.9[262119]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014904.2565167-1337-85368388600096/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:05 np0005548788.localdomain sudo[262117]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:05 np0005548788.localdomain sudo[262192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvssfumuobiqbtulrkgwpcqzwqkknzzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014904.2565167-1337-85368388600096/AnsiballZ_systemd.py
Dec 06 09:55:05 np0005548788.localdomain sudo[262192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:05.503 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:55:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:05.511 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:55:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:05.529 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:55:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:05.532 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:55:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:05.532 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:55:05 np0005548788.localdomain python3.9[262194]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:55:05 np0005548788.localdomain systemd-rc-local-generator[262237]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:55:05 np0005548788.localdomain systemd-sysv-generator[262240]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:55:05 np0005548788.localdomain podman[262198]: 2025-12-06 09:55:05.839587563 +0000 UTC m=+0.107572913 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 09:55:05 np0005548788.localdomain podman[262198]: 2025-12-06 09:55:05.871475709 +0000 UTC m=+0.139461109 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:55:06 np0005548788.localdomain sudo[262192]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:06.534 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:06 np0005548788.localdomain sudo[262303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rupfvhznaygzacewjyzhivdgjtovhjwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014904.2565167-1337-85368388600096/AnsiballZ_systemd.py
Dec 06 09:55:06 np0005548788.localdomain sudo[262303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:06 np0005548788.localdomain python3.9[262305]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:55:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:07.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:55:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:07 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:55:08 np0005548788.localdomain systemd-sysv-generator[262336]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:55:08 np0005548788.localdomain systemd-rc-local-generator[262332]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:55:08.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: tmp-crun.1b4aC8.mount: Deactivated successfully.
Dec 06 09:55:08 np0005548788.localdomain podman[262345]: 2025-12-06 09:55:08.437487004 +0000 UTC m=+0.108311725 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:55:08 np0005548788.localdomain podman[262345]: 2025-12-06 09:55:08.449580611 +0000 UTC m=+0.120405342 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:55:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f83b472a54abe4b775c1740fc46366c397bdf2a4e2bba74e6e2610f6c8578559/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f83b472a54abe4b775c1740fc46366c397bdf2a4e2bba74e6e2610f6c8578559/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:08 np0005548788.localdomain podman[262352]: 2025-12-06 09:55:08.499148964 +0000 UTC m=+0.148168874 container init a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_dhcp, org.label-schema.build-date=20251125)
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + sudo -E kolla_set_configs
Dec 06 09:55:08 np0005548788.localdomain podman[262352]: 2025-12-06 09:55:08.510658262 +0000 UTC m=+0.159678152 container start a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_dhcp, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:55:08 np0005548788.localdomain podman[262352]: neutron_dhcp_agent
Dec 06 09:55:08 np0005548788.localdomain systemd[1]: Started neutron_dhcp_agent container.
Dec 06 09:55:08 np0005548788.localdomain sudo[262303]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Validating config file
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Copying service configuration files
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Writing out command to execute
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: ++ cat /run_command
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + CMD=/usr/bin/neutron-dhcp-agent
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + ARGS=
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + sudo kolla_copy_cacerts
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + [[ ! -n '' ]]
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + . kolla_extend_start
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: Running command: '/usr/bin/neutron-dhcp-agent'
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + umask 0022
Dec 06 09:55:08 np0005548788.localdomain neutron_dhcp_agent[262383]: + exec /usr/bin/neutron-dhcp-agent
Dec 06 09:55:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:55:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:55:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:55:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:55:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:55:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:55:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:55:09 np0005548788.localdomain sudo[262505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywfqndakmdjxldweqoidriimwbjlvfax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014908.7729406-1421-153137957630868/AnsiballZ_systemd.py
Dec 06 09:55:09 np0005548788.localdomain sudo[262505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:09 np0005548788.localdomain python3.9[262507]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:55:09 np0005548788.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Dec 06 09:55:09 np0005548788.localdomain systemd[1]: libpod-a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df.scope: Deactivated successfully.
Dec 06 09:55:09 np0005548788.localdomain systemd[1]: libpod-a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df.scope: Consumed 1.103s CPU time.
Dec 06 09:55:09 np0005548788.localdomain podman[262511]: 2025-12-06 09:55:09.629761911 +0000 UTC m=+0.091011280 container died a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:55:09 np0005548788.localdomain podman[262511]: 2025-12-06 09:55:09.684131439 +0000 UTC m=+0.145380758 container cleanup a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:55:09 np0005548788.localdomain podman[262511]: neutron_dhcp_agent
Dec 06 09:55:09 np0005548788.localdomain podman[262552]: error opening file `/run/crun/a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df/status`: No such file or directory
Dec 06 09:55:09 np0005548788.localdomain podman[262540]: 2025-12-06 09:55:09.780673046 +0000 UTC m=+0.065335602 container cleanup a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:55:09 np0005548788.localdomain podman[262540]: neutron_dhcp_agent
Dec 06 09:55:09 np0005548788.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Dec 06 09:55:09 np0005548788.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Dec 06 09:55:09 np0005548788.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Dec 06 09:55:09 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:55:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f83b472a54abe4b775c1740fc46366c397bdf2a4e2bba74e6e2610f6c8578559/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f83b472a54abe4b775c1740fc46366c397bdf2a4e2bba74e6e2610f6c8578559/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:09 np0005548788.localdomain podman[262554]: 2025-12-06 09:55:09.923678121 +0000 UTC m=+0.107644754 container init a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_dhcp, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:55:09 np0005548788.localdomain podman[262554]: 2025-12-06 09:55:09.933731946 +0000 UTC m=+0.117698579 container start a5467f0cdff889f04bbca63f9bb52e9ac42811f82d97064068be9fb3661059df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'f6679be553031bd59304d98bb09a9d35a1b61ccaf92bf9beb0f84ea0995b7899'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.license=GPLv2, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:55:09 np0005548788.localdomain podman[262554]: neutron_dhcp_agent
Dec 06 09:55:09 np0005548788.localdomain neutron_dhcp_agent[262568]: + sudo -E kolla_set_configs
Dec 06 09:55:09 np0005548788.localdomain systemd[1]: Started neutron_dhcp_agent container.
Dec 06 09:55:09 np0005548788.localdomain sudo[262505]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Validating config file
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Copying service configuration files
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Writing out command to execute
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: ++ cat /run_command
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: + CMD=/usr/bin/neutron-dhcp-agent
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: + ARGS=
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: + sudo kolla_copy_cacerts
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: + [[ ! -n '' ]]
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: + . kolla_extend_start
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: Running command: '/usr/bin/neutron-dhcp-agent'
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: + umask 0022
Dec 06 09:55:10 np0005548788.localdomain neutron_dhcp_agent[262568]: + exec /usr/bin/neutron-dhcp-agent
Dec 06 09:55:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:55:10 np0005548788.localdomain systemd[1]: tmp-crun.AZdrWu.mount: Deactivated successfully.
Dec 06 09:55:10 np0005548788.localdomain podman[262599]: 2025-12-06 09:55:10.526277162 +0000 UTC m=+0.102089337 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 09:55:10 np0005548788.localdomain podman[262599]: 2025-12-06 09:55:10.540727979 +0000 UTC m=+0.116540184 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 09:55:10 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:55:10 np0005548788.localdomain sshd[255693]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:55:10 np0005548788.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Dec 06 09:55:10 np0005548788.localdomain systemd[1]: session-58.scope: Consumed 37.469s CPU time.
Dec 06 09:55:10 np0005548788.localdomain systemd-logind[765]: Session 58 logged out. Waiting for processes to exit.
Dec 06 09:55:10 np0005548788.localdomain systemd-logind[765]: Removed session 58.
Dec 06 09:55:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 09:55:11.309 262572 INFO neutron.common.config [-] Logging enabled!
Dec 06 09:55:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 09:55:11.310 262572 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Dec 06 09:55:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 09:55:11.751 262572 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Dec 06 09:55:12 np0005548788.localdomain sshd[250358]: fatal: Timeout before authentication for 101.47.142.76 port 45466
Dec 06 09:55:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 09:55:12.773 262572 INFO neutron.agent.dhcp.agent [None req-56448651-b485-4592-ab61-4c83ec4f6c61 - - - - - -] All active networks have been fetched through RPC.
Dec 06 09:55:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 09:55:12.773 262572 INFO neutron.agent.dhcp.agent [None req-56448651-b485-4592-ab61-4c83ec4f6c61 - - - - - -] Synchronizing state complete
Dec 06 09:55:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 09:55:12.882 262572 INFO neutron.agent.dhcp.agent [None req-56448651-b485-4592-ab61-4c83ec4f6c61 - - - - - -] DHCP agent started
Dec 06 09:55:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:55:13 np0005548788.localdomain systemd[1]: tmp-crun.sElD51.mount: Deactivated successfully.
Dec 06 09:55:13 np0005548788.localdomain podman[262621]: 2025-12-06 09:55:13.712393777 +0000 UTC m=+0.101152548 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:55:13 np0005548788.localdomain podman[262621]: 2025-12-06 09:55:13.724415081 +0000 UTC m=+0.113173852 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 09:55:13 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:55:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:55:15.579 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:55:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:55:15.580 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 09:55:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:55:15.581 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:55:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:55:16 np0005548788.localdomain podman[262640]: 2025-12-06 09:55:16.278552086 +0000 UTC m=+0.103981523 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:55:16 np0005548788.localdomain podman[262640]: 2025-12-06 09:55:16.31725756 +0000 UTC m=+0.142687027 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:55:16 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:55:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7641 DF PROTO=TCP SPT=47412 DPT=9102 SEQ=1477047818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F438D40000000001030307) 
Dec 06 09:55:19 np0005548788.localdomain podman[240078]: time="2025-12-06T09:55:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:55:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:55:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:55:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:55:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16751 "" "Go-http-client/1.1"
Dec 06 09:55:20 np0005548788.localdomain sshd[262662]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:55:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7642 DF PROTO=TCP SPT=47412 DPT=9102 SEQ=1477047818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F43CF00000000001030307) 
Dec 06 09:55:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:55:21 np0005548788.localdomain systemd[1]: tmp-crun.jm5cwo.mount: Deactivated successfully.
Dec 06 09:55:21 np0005548788.localdomain podman[262664]: 2025-12-06 09:55:21.263258759 +0000 UTC m=+0.088626297 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:55:21 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54512 DF PROTO=TCP SPT=44094 DPT=9102 SEQ=582304889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F43FF10000000001030307) 
Dec 06 09:55:21 np0005548788.localdomain podman[262664]: 2025-12-06 09:55:21.29361412 +0000 UTC m=+0.118981658 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec 06 09:55:21 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:55:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7643 DF PROTO=TCP SPT=47412 DPT=9102 SEQ=1477047818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F444F00000000001030307) 
Dec 06 09:55:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25516 DF PROTO=TCP SPT=41790 DPT=9102 SEQ=2563339886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F447F10000000001030307) 
Dec 06 09:55:23 np0005548788.localdomain sshd[262680]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:55:23 np0005548788.localdomain sshd[262680]: Accepted publickey for zuul from 192.168.122.30 port 59806 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:55:23 np0005548788.localdomain systemd-logind[765]: New session 59 of user zuul.
Dec 06 09:55:23 np0005548788.localdomain systemd[1]: Started Session 59 of User zuul.
Dec 06 09:55:23 np0005548788.localdomain sshd[262680]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:55:24 np0005548788.localdomain python3.9[262791]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:55:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7644 DF PROTO=TCP SPT=47412 DPT=9102 SEQ=1477047818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F454B00000000001030307) 
Dec 06 09:55:26 np0005548788.localdomain python3.9[262903]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:55:26 np0005548788.localdomain network[262920]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:55:26 np0005548788.localdomain network[262921]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:55:26 np0005548788.localdomain network[262922]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:55:27 np0005548788.localdomain sshd[262662]: Received disconnect from 45.78.194.186 port 39790:11: Bye Bye [preauth]
Dec 06 09:55:27 np0005548788.localdomain sshd[262662]: Disconnected from authenticating user root 45.78.194.186 port 39790 [preauth]
Dec 06 09:55:28 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:55:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:55:28 np0005548788.localdomain podman[262980]: 2025-12-06 09:55:28.494726042 +0000 UTC m=+0.104976064 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:55:28 np0005548788.localdomain podman[262980]: 2025-12-06 09:55:28.568863289 +0000 UTC m=+0.179113321 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:55:28 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:55:33 np0005548788.localdomain sudo[263179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-humovbcfyotruaqmbreoritcrujyopwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014933.1369755-101-99695739064565/AnsiballZ_setup.py
Dec 06 09:55:33 np0005548788.localdomain sudo[263179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:33 np0005548788.localdomain python3.9[263181]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:55:34 np0005548788.localdomain sudo[263179]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:34 np0005548788.localdomain sudo[263242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsggjdjzuumqnxyhtpnybjnpmtnybglr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014933.1369755-101-99695739064565/AnsiballZ_dnf.py
Dec 06 09:55:34 np0005548788.localdomain sudo[263242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:34 np0005548788.localdomain python3.9[263244]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:55:34 np0005548788.localdomain sudo[263246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:55:34 np0005548788.localdomain sudo[263246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:34 np0005548788.localdomain sudo[263246]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:34 np0005548788.localdomain sudo[263264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:55:34 np0005548788.localdomain sudo[263264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7645 DF PROTO=TCP SPT=47412 DPT=9102 SEQ=1477047818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F475F00000000001030307) 
Dec 06 09:55:35 np0005548788.localdomain podman[263354]: 2025-12-06 09:55:35.901815635 +0000 UTC m=+0.090892077 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, release=1763362218, io.openshift.expose-services=, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 09:55:36 np0005548788.localdomain podman[263354]: 2025-12-06 09:55:36.006572621 +0000 UTC m=+0.195649003 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, architecture=x86_64, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, release=1763362218, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7)
Dec 06 09:55:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:55:36 np0005548788.localdomain podman[263402]: 2025-12-06 09:55:36.214634218 +0000 UTC m=+0.098556518 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:55:36 np0005548788.localdomain podman[263402]: 2025-12-06 09:55:36.225128677 +0000 UTC m=+0.109050967 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:55:36 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:55:36 np0005548788.localdomain sudo[263264]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:36 np0005548788.localdomain sudo[263436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:55:36 np0005548788.localdomain sudo[263436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:36 np0005548788.localdomain sudo[263436]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:36 np0005548788.localdomain sudo[263454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:55:36 np0005548788.localdomain sudo[263454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:37 np0005548788.localdomain sudo[263454]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:37 np0005548788.localdomain sudo[263505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:55:37 np0005548788.localdomain sudo[263505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:37 np0005548788.localdomain sudo[263505]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:38 np0005548788.localdomain sudo[263242]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:38 np0005548788.localdomain sudo[263630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwgdjintlawswobfvltrmhejvegyymzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014938.2770605-137-86088277780078/AnsiballZ_stat.py
Dec 06 09:55:38 np0005548788.localdomain sudo[263630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:55:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:55:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:55:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:55:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:55:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:55:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:55:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:55:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:55:38 np0005548788.localdomain podman[263633]: 2025-12-06 09:55:38.87881984 +0000 UTC m=+0.109397998 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:55:38 np0005548788.localdomain podman[263633]: 2025-12-06 09:55:38.918678719 +0000 UTC m=+0.149256917 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:55:38 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:55:39 np0005548788.localdomain python3.9[263632]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:39 np0005548788.localdomain sudo[263630]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:39 np0005548788.localdomain sudo[263761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbxupgwsalwsksfvbytnkjkpolhourvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014939.3654277-167-224386739298401/AnsiballZ_command.py
Dec 06 09:55:39 np0005548788.localdomain sudo[263761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:39 np0005548788.localdomain python3.9[263763]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:55:40 np0005548788.localdomain sudo[263761]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:40 np0005548788.localdomain sudo[263872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swwbdukarrtpzyjkdjoztjjgffldjsml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014940.4059055-197-59199988345633/AnsiballZ_stat.py
Dec 06 09:55:40 np0005548788.localdomain sudo[263872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:55:40 np0005548788.localdomain podman[263875]: 2025-12-06 09:55:40.873486944 +0000 UTC m=+0.090860556 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Dec 06 09:55:40 np0005548788.localdomain podman[263875]: 2025-12-06 09:55:40.886907051 +0000 UTC m=+0.104280663 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 09:55:40 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:55:40 np0005548788.localdomain python3.9[263874]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:41 np0005548788.localdomain sudo[263872]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:42 np0005548788.localdomain sudo[264002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnbcklagusrvguyjmvvnzfedwirakqyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014941.4700685-230-59791754327701/AnsiballZ_lineinfile.py
Dec 06 09:55:42 np0005548788.localdomain sudo[264002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:42 np0005548788.localdomain python3.9[264004]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:42 np0005548788.localdomain sudo[264002]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:43 np0005548788.localdomain sudo[264112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wckpuycbqpeqzvjcstjbssjjrbtvgdgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014942.6281948-257-79248708307609/AnsiballZ_systemd_service.py
Dec 06 09:55:43 np0005548788.localdomain sudo[264112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:43 np0005548788.localdomain python3.9[264114]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:55:43 np0005548788.localdomain sudo[264112]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:55:44 np0005548788.localdomain sudo[264235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzntrohgulfpjyjglturvpfwxzftrdgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014943.9101214-281-75585063833978/AnsiballZ_systemd_service.py
Dec 06 09:55:44 np0005548788.localdomain podman[264201]: 2025-12-06 09:55:44.267312054 +0000 UTC m=+0.090071877 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:55:44 np0005548788.localdomain sudo[264235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:44 np0005548788.localdomain podman[264201]: 2025-12-06 09:55:44.280287741 +0000 UTC m=+0.103047554 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 06 09:55:44 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:55:44 np0005548788.localdomain python3.9[264245]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:55:45 np0005548788.localdomain sudo[264235]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:46 np0005548788.localdomain sudo[264355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swxrhtgihxodoecbhtjqpfwxwbeeffqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014946.1146853-314-117256255263899/AnsiballZ_service_facts.py
Dec 06 09:55:46 np0005548788.localdomain sudo[264355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:55:46 np0005548788.localdomain systemd[1]: tmp-crun.vzhxGK.mount: Deactivated successfully.
Dec 06 09:55:46 np0005548788.localdomain podman[264358]: 2025-12-06 09:55:46.572425682 +0000 UTC m=+0.104589663 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:55:46 np0005548788.localdomain podman[264358]: 2025-12-06 09:55:46.58223798 +0000 UTC m=+0.114401971 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:55:46 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:55:46 np0005548788.localdomain python3.9[264357]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:55:46 np0005548788.localdomain network[264398]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:55:46 np0005548788.localdomain network[264399]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:55:46 np0005548788.localdomain network[264400]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:55:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:55:47.415 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:55:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:55:47.417 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:55:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:55:47.417 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:55:48 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:55:48 np0005548788.localdomain sshd[264448]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:55:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46360 DF PROTO=TCP SPT=54502 DPT=9102 SEQ=185377270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F4AE050000000001030307) 
Dec 06 09:55:49 np0005548788.localdomain podman[240078]: time="2025-12-06T09:55:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:55:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:55:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:55:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:55:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16743 "" "Go-http-client/1.1"
Dec 06 09:55:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46361 DF PROTO=TCP SPT=54502 DPT=9102 SEQ=185377270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F4B1F10000000001030307) 
Dec 06 09:55:51 np0005548788.localdomain sudo[264355]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:51 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7646 DF PROTO=TCP SPT=47412 DPT=9102 SEQ=1477047818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F4B5F10000000001030307) 
Dec 06 09:55:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:55:52 np0005548788.localdomain systemd[1]: tmp-crun.6KCmNy.mount: Deactivated successfully.
Dec 06 09:55:52 np0005548788.localdomain podman[264543]: 2025-12-06 09:55:52.271120648 +0000 UTC m=+0.092572965 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 09:55:52 np0005548788.localdomain podman[264543]: 2025-12-06 09:55:52.277592131 +0000 UTC m=+0.099044448 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:55:52 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:55:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46362 DF PROTO=TCP SPT=54502 DPT=9102 SEQ=185377270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F4B9F10000000001030307) 
Dec 06 09:55:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54513 DF PROTO=TCP SPT=44094 DPT=9102 SEQ=582304889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F4BDF00000000001030307) 
Dec 06 09:55:54 np0005548788.localdomain sudo[264649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkknbecppgybfwucnclzdegepfqphabj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014954.331012-344-135181544215422/AnsiballZ_file.py
Dec 06 09:55:54 np0005548788.localdomain sudo[264649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:55 np0005548788.localdomain python3.9[264651]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:55:55 np0005548788.localdomain sudo[264649]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:55 np0005548788.localdomain sudo[264759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjwpdxscdjbzzxnpiicpfgxecetcvsfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014955.2656493-368-162733453944297/AnsiballZ_modprobe.py
Dec 06 09:55:55 np0005548788.localdomain sudo[264759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:55 np0005548788.localdomain python3.9[264761]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 06 09:55:55 np0005548788.localdomain sudo[264759]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:56 np0005548788.localdomain sudo[264869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adxldmdtydowkigigtvfiwrlgyeibpej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014956.1771765-392-238601292647785/AnsiballZ_stat.py
Dec 06 09:55:56 np0005548788.localdomain sudo[264869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46363 DF PROTO=TCP SPT=54502 DPT=9102 SEQ=185377270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F4C9B00000000001030307) 
Dec 06 09:55:56 np0005548788.localdomain python3.9[264871]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:56 np0005548788.localdomain sudo[264869]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:56 np0005548788.localdomain sudo[264926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvzjrkaseponzqasfdfbkgnzrjnzqsby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014956.1771765-392-238601292647785/AnsiballZ_file.py
Dec 06 09:55:56 np0005548788.localdomain sudo[264926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:57 np0005548788.localdomain python3.9[264928]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:57 np0005548788.localdomain sudo[264926]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:57 np0005548788.localdomain sshd[264448]: Connection closed by 101.47.142.76 port 39304 [preauth]
Dec 06 09:55:57 np0005548788.localdomain sudo[265037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esredyoyptpxxvrhpoexricdeiwavwjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014957.500657-431-129124728643880/AnsiballZ_lineinfile.py
Dec 06 09:55:57 np0005548788.localdomain sudo[265037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:57 np0005548788.localdomain python3.9[265039]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:58 np0005548788.localdomain sudo[265037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:58 np0005548788.localdomain sudo[265147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqkkzzrssfjsmlegkaqvajhragulwvdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014958.266861-458-68474296737357/AnsiballZ_file.py
Dec 06 09:55:58 np0005548788.localdomain sudo[265147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:58 np0005548788.localdomain python3.9[265149]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:55:58 np0005548788.localdomain sudo[265147]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:55:59 np0005548788.localdomain podman[265204]: 2025-12-06 09:55:59.270813851 +0000 UTC m=+0.086643780 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Dec 06 09:55:59 np0005548788.localdomain podman[265204]: 2025-12-06 09:55:59.343169211 +0000 UTC m=+0.158999170 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller)
Dec 06 09:55:59 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:55:59 np0005548788.localdomain sudo[265282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjymwjnbofiapyeuqqcrbenzhjjfquqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014959.1106284-485-179415038899690/AnsiballZ_stat.py
Dec 06 09:55:59 np0005548788.localdomain sudo[265282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:59 np0005548788.localdomain python3.9[265284]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:59 np0005548788.localdomain sudo[265282]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:00.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:00.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:56:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:00.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:56:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:00.210 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:56:00 np0005548788.localdomain sudo[265394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jelrynilifvmkpmndobgbnkxezsvaimy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014960.0594468-512-68090345555048/AnsiballZ_stat.py
Dec 06 09:56:00 np0005548788.localdomain sudo[265394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:00 np0005548788.localdomain python3.9[265396]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:00 np0005548788.localdomain sudo[265394]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:01 np0005548788.localdomain systemd[1]: Starting dnf makecache...
Dec 06 09:56:01 np0005548788.localdomain sudo[265506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oretrhpsgcnyunqpmzvjfmsvqddspcgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014960.8789163-539-165428263621908/AnsiballZ_command.py
Dec 06 09:56:01 np0005548788.localdomain sudo[265506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:01 np0005548788.localdomain python3.9[265509]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:56:01 np0005548788.localdomain sudo[265506]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:01 np0005548788.localdomain dnf[265508]: Updating Subscription Management repositories.
Dec 06 09:56:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:02.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:02 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:02.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 09:56:02 np0005548788.localdomain sudo[265618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsadueiyutxohyqcoacnyzclrxcmhagz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014961.7673252-569-146340716509922/AnsiballZ_replace.py
Dec 06 09:56:02 np0005548788.localdomain sudo[265618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:02 np0005548788.localdomain python3.9[265620]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:02 np0005548788.localdomain sudo[265618]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:03 np0005548788.localdomain sudo[265728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfgkztbhngozbtxdwfjvcohogbdvrrjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014962.7033083-596-254154406294096/AnsiballZ_lineinfile.py
Dec 06 09:56:03 np0005548788.localdomain sudo[265728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:03 np0005548788.localdomain python3.9[265730]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:03.197 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:03.215 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:03.216 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:03.216 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:56:03 np0005548788.localdomain sudo[265728]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:03 np0005548788.localdomain dnf[265508]: Metadata cache refreshed recently.
Dec 06 09:56:03 np0005548788.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 06 09:56:03 np0005548788.localdomain systemd[1]: Finished dnf makecache.
Dec 06 09:56:03 np0005548788.localdomain systemd[1]: dnf-makecache.service: Consumed 2.195s CPU time.
Dec 06 09:56:03 np0005548788.localdomain sudo[265838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eykmvygjtixcliedkczajtypaztbirpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014963.3369884-596-14272213183553/AnsiballZ_lineinfile.py
Dec 06 09:56:03 np0005548788.localdomain sudo[265838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:03 np0005548788.localdomain python3.9[265840]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:03 np0005548788.localdomain sudo[265838]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.203 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.204 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.204 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.204 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.205 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:56:04 np0005548788.localdomain sudo[265949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tffabugjkunufteubynpgiwhmcrqnuzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014963.9940197-596-21921074228899/AnsiballZ_lineinfile.py
Dec 06 09:56:04 np0005548788.localdomain sudo[265949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:04 np0005548788.localdomain python3.9[265951]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:04 np0005548788.localdomain sudo[265949]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.673 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:56:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46364 DF PROTO=TCP SPT=54502 DPT=9102 SEQ=185377270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F4E9F10000000001030307) 
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.900 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.902 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12841MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.903 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:56:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:04.904 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.015 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.016 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:56:05 np0005548788.localdomain sudo[266080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jixabgdwdoemloluuqnabzxjsjfkasre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014964.6561985-596-32025227878289/AnsiballZ_lineinfile.py
Dec 06 09:56:05 np0005548788.localdomain sudo[266080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.104 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.213 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.213 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:56:05 np0005548788.localdomain python3.9[266082]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.234 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:56:05 np0005548788.localdomain sudo[266080]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.276 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SVM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,HW_CPU_X86_SHA,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NODE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.312 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:56:05 np0005548788.localdomain sudo[266210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxatgnkzsvgtsekwmyunaagtlehjzcyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014965.4177468-683-147212489530547/AnsiballZ_stat.py
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.762 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:56:05 np0005548788.localdomain sudo[266210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.770 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.789 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.791 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.792 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.793 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.793 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 09:56:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:05.814 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 09:56:05 np0005548788.localdomain python3.9[266214]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:05 np0005548788.localdomain sudo[266210]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:06 np0005548788.localdomain sudo[266324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyhgmrddnqcogeueznntdrxaqbftswvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014966.5268595-713-87973351536954/AnsiballZ_file.py
Dec 06 09:56:06 np0005548788.localdomain sudo[266324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:56:06 np0005548788.localdomain podman[266327]: 2025-12-06 09:56:06.94742511 +0000 UTC m=+0.095691903 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:56:06 np0005548788.localdomain podman[266327]: 2025-12-06 09:56:06.957528057 +0000 UTC m=+0.105794910 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 09:56:06 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:56:07 np0005548788.localdomain python3.9[266326]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:07 np0005548788.localdomain sudo[266324]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:07 np0005548788.localdomain sudo[266452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwtdpiqqxsprowpvduqrccoomtgpuxly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014967.2480736-737-131031401995419/AnsiballZ_stat.py
Dec 06 09:56:07 np0005548788.localdomain sudo[266452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:07 np0005548788.localdomain python3.9[266454]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:07 np0005548788.localdomain sudo[266452]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:07.809 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:07.810 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:08 np0005548788.localdomain sudo[266509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-naghicuodeeucnzxfagptwgrqebnbvji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014967.2480736-737-131031401995419/AnsiballZ_file.py
Dec 06 09:56:08 np0005548788.localdomain sudo[266509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:08.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:08.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:08 np0005548788.localdomain python3.9[266511]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:08 np0005548788.localdomain sudo[266509]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:08 np0005548788.localdomain sudo[266619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbrxgsfczcnburtocropyuoarmfuhcqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014968.4287775-737-210844658849234/AnsiballZ_stat.py
Dec 06 09:56:08 np0005548788.localdomain sudo[266619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:56:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:56:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:56:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:56:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:56:08 np0005548788.localdomain python3.9[266621]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:09 np0005548788.localdomain sudo[266619]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:56:09 np0005548788.localdomain sudo[266688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynxwrsxlicmnvnyjbevkiaizopqldefj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014968.4287775-737-210844658849234/AnsiballZ_file.py
Dec 06 09:56:09 np0005548788.localdomain sudo[266688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:09 np0005548788.localdomain podman[266656]: 2025-12-06 09:56:09.273526097 +0000 UTC m=+0.091996578 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:56:09 np0005548788.localdomain podman[266656]: 2025-12-06 09:56:09.314859664 +0000 UTC m=+0.133330105 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:56:09 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:56:09 np0005548788.localdomain python3.9[266697]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:09 np0005548788.localdomain sudo[266688]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:10 np0005548788.localdomain sudo[266810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oomnwslvuihojztaysfitdvvasmazfdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014969.6994233-806-79788100104841/AnsiballZ_file.py
Dec 06 09:56:10 np0005548788.localdomain sudo[266810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:10 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:10.209 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:10 np0005548788.localdomain python3.9[266812]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:10 np0005548788.localdomain sudo[266810]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:10 np0005548788.localdomain sudo[266920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inkyiflrrhkvzfyqlfrjkrezdqrfmlco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014970.41295-830-216148217108796/AnsiballZ_stat.py
Dec 06 09:56:10 np0005548788.localdomain sudo[266920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:10 np0005548788.localdomain python3.9[266922]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:10 np0005548788.localdomain sudo[266920]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:56:11 np0005548788.localdomain sudo[266983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erhuwnuvjlpzhbpwcausrhlctamkskli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014970.41295-830-216148217108796/AnsiballZ_file.py
Dec 06 09:56:11 np0005548788.localdomain sudo[266983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:11 np0005548788.localdomain podman[266965]: 2025-12-06 09:56:11.288485073 +0000 UTC m=+0.108964759 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 09:56:11 np0005548788.localdomain podman[266965]: 2025-12-06 09:56:11.330568364 +0000 UTC m=+0.151048080 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Dec 06 09:56:11 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:56:11 np0005548788.localdomain python3.9[266990]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:11 np0005548788.localdomain sudo[266983]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:11 np0005548788.localdomain sudo[267107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivwlgmttxytchzebmasjaxswsgqydcsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014971.6671262-866-145740289663469/AnsiballZ_stat.py
Dec 06 09:56:11 np0005548788.localdomain sudo[267107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:12 np0005548788.localdomain python3.9[267109]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:12 np0005548788.localdomain sudo[267107]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:12 np0005548788.localdomain sudo[267164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btwvmzdzokenfxxndlgfvyvdesfnibtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014971.6671262-866-145740289663469/AnsiballZ_file.py
Dec 06 09:56:12 np0005548788.localdomain sudo[267164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:12 np0005548788.localdomain python3.9[267166]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:12 np0005548788.localdomain sudo[267164]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:13 np0005548788.localdomain sudo[267274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwaylthqmfhqcdekjazndhsmckyuqmyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014972.9906561-902-190014014961661/AnsiballZ_systemd.py
Dec 06 09:56:13 np0005548788.localdomain sudo[267274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:13 np0005548788.localdomain python3.9[267276]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:56:13 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:56:14 np0005548788.localdomain systemd-sysv-generator[267300]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:56:14 np0005548788.localdomain systemd-rc-local-generator[267294]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: tmp-crun.uZNxmg.mount: Deactivated successfully.
Dec 06 09:56:14 np0005548788.localdomain podman[267314]: 2025-12-06 09:56:14.418794691 +0000 UTC m=+0.081338263 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 09:56:14 np0005548788.localdomain podman[267314]: 2025-12-06 09:56:14.430104515 +0000 UTC m=+0.092648077 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Dec 06 09:56:14 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:56:15 np0005548788.localdomain sudo[267274]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:15 np0005548788.localdomain sudo[267442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udhpbhwkvytskocewdnousymprasoxwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014975.5945206-926-35788997600219/AnsiballZ_stat.py
Dec 06 09:56:15 np0005548788.localdomain sudo[267442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:16 np0005548788.localdomain python3.9[267444]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:16 np0005548788.localdomain sudo[267442]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:16 np0005548788.localdomain sudo[267499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cefblxexzddynpktjquowkcorflfyvrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014975.5945206-926-35788997600219/AnsiballZ_file.py
Dec 06 09:56:16 np0005548788.localdomain sudo[267499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:16 np0005548788.localdomain python3.9[267501]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:16 np0005548788.localdomain sudo[267499]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:17 np0005548788.localdomain sudo[267609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvfzgoqhzkmaeslrmsmrnxmbpvhorobv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014976.8110187-962-46781311149552/AnsiballZ_stat.py
Dec 06 09:56:17 np0005548788.localdomain sudo[267609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:56:17 np0005548788.localdomain systemd[1]: tmp-crun.ROh458.mount: Deactivated successfully.
Dec 06 09:56:17 np0005548788.localdomain podman[267612]: 2025-12-06 09:56:17.215591355 +0000 UTC m=+0.105095408 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:56:17 np0005548788.localdomain podman[267612]: 2025-12-06 09:56:17.22530714 +0000 UTC m=+0.114811113 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:56:17 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:56:17 np0005548788.localdomain python3.9[267611]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:17 np0005548788.localdomain sudo[267609]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:17 np0005548788.localdomain sudo[267688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrbiqqleacndhuvuxopqcwunwhoikgql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014976.8110187-962-46781311149552/AnsiballZ_file.py
Dec 06 09:56:17 np0005548788.localdomain sudo[267688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:17 np0005548788.localdomain python3.9[267690]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:17 np0005548788.localdomain sudo[267688]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:18 np0005548788.localdomain sudo[267798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezxvzkxcfgzvmedrjxmfwyyiwybospws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014978.0814674-998-59197632127441/AnsiballZ_systemd.py
Dec 06 09:56:18 np0005548788.localdomain sudo[267798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:18 np0005548788.localdomain python3.9[267800]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:56:18 np0005548788.localdomain systemd-sysv-generator[267828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:56:18 np0005548788.localdomain systemd-rc-local-generator[267823]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:19 np0005548788.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:56:19 np0005548788.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:56:19 np0005548788.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:56:19 np0005548788.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:56:19 np0005548788.localdomain sudo[267798]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63167 DF PROTO=TCP SPT=45010 DPT=9102 SEQ=2168535139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F523340000000001030307) 
Dec 06 09:56:19 np0005548788.localdomain podman[240078]: time="2025-12-06T09:56:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:56:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:56:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:56:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:56:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16745 "" "Go-http-client/1.1"
Dec 06 09:56:19 np0005548788.localdomain sudo[267950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mylkfqpsjxlgvuwhnnauiwaxkcjxrhtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014979.6693115-1028-635027888200/AnsiballZ_file.py
Dec 06 09:56:19 np0005548788.localdomain sudo[267950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:20 np0005548788.localdomain python3.9[267952]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:20 np0005548788.localdomain sudo[267950]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63168 DF PROTO=TCP SPT=45010 DPT=9102 SEQ=2168535139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F527300000000001030307) 
Dec 06 09:56:20 np0005548788.localdomain sudo[268060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raiznykobhbayqmroliimghdlqykxqra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014980.4768338-1052-73260553907095/AnsiballZ_stat.py
Dec 06 09:56:20 np0005548788.localdomain sudo[268060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:20 np0005548788.localdomain python3.9[268062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:21 np0005548788.localdomain sudo[268060]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:21 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46365 DF PROTO=TCP SPT=54502 DPT=9102 SEQ=185377270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F529F00000000001030307) 
Dec 06 09:56:21 np0005548788.localdomain sudo[268117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-istqieufypxgqvktjlopbvbvgyroxuxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014980.4768338-1052-73260553907095/AnsiballZ_file.py
Dec 06 09:56:21 np0005548788.localdomain sudo[268117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:21 np0005548788.localdomain python3.9[268119]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:21 np0005548788.localdomain sudo[268117]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:22 np0005548788.localdomain sudo[268227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odkudjovfmadwpiblgkpaklrrmkotbtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014981.9273345-1094-44583591532937/AnsiballZ_file.py
Dec 06 09:56:22 np0005548788.localdomain sudo[268227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:22 np0005548788.localdomain python3.9[268229]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:22 np0005548788.localdomain sudo[268227]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63169 DF PROTO=TCP SPT=45010 DPT=9102 SEQ=2168535139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F52F300000000001030307) 
Dec 06 09:56:23 np0005548788.localdomain sudo[268337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qovnwulewndnuwyxxgkcvymckkfmssbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014982.7511516-1118-214251158338029/AnsiballZ_stat.py
Dec 06 09:56:23 np0005548788.localdomain sudo[268337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:56:23 np0005548788.localdomain systemd[1]: tmp-crun.pQM2xK.mount: Deactivated successfully.
Dec 06 09:56:23 np0005548788.localdomain podman[268340]: 2025-12-06 09:56:23.148248731 +0000 UTC m=+0.096636452 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:56:23 np0005548788.localdomain podman[268340]: 2025-12-06 09:56:23.18677176 +0000 UTC m=+0.135159521 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:56:23 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:56:23 np0005548788.localdomain python3.9[268339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:23 np0005548788.localdomain sudo[268337]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:23 np0005548788.localdomain sudo[268411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szkzrskmgduqfovamycxczsbwnoydhsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014982.7511516-1118-214251158338029/AnsiballZ_file.py
Dec 06 09:56:23 np0005548788.localdomain sudo[268411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7647 DF PROTO=TCP SPT=47412 DPT=9102 SEQ=1477047818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F533F00000000001030307) 
Dec 06 09:56:23 np0005548788.localdomain python3.9[268413]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.8bvecf6n recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:23 np0005548788.localdomain sudo[268411]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:24 np0005548788.localdomain sudo[268521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfytnvpsippoziekyvbcbuhrufalpieo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014984.1554422-1154-280009672766943/AnsiballZ_file.py
Dec 06 09:56:24 np0005548788.localdomain sudo[268521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:24 np0005548788.localdomain python3.9[268523]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:24 np0005548788.localdomain sudo[268521]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:25 np0005548788.localdomain sudo[268631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eckdroodsharxcsmclwayebwsfdzlvuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014984.8858294-1178-121727244139451/AnsiballZ_stat.py
Dec 06 09:56:25 np0005548788.localdomain sudo[268631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:25 np0005548788.localdomain sudo[268631]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:25 np0005548788.localdomain sudo[268688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynezdxantvamubvikderkkvgpxezdsql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014984.8858294-1178-121727244139451/AnsiballZ_file.py
Dec 06 09:56:25 np0005548788.localdomain sudo[268688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:25 np0005548788.localdomain sudo[268688]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63170 DF PROTO=TCP SPT=45010 DPT=9102 SEQ=2168535139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F53EF00000000001030307) 
Dec 06 09:56:26 np0005548788.localdomain sudo[268798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcfnfwbpzrurxcixsoeratjunzrgrdtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014986.40993-1220-206397856589916/AnsiballZ_container_config_data.py
Dec 06 09:56:26 np0005548788.localdomain sudo[268798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:27 np0005548788.localdomain python3.9[268800]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 06 09:56:27 np0005548788.localdomain sudo[268798]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:27 np0005548788.localdomain sudo[268908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbmzpwyuaozsdhcfqdanldlxmpkppgdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014987.46304-1247-54607547070574/AnsiballZ_container_config_hash.py
Dec 06 09:56:27 np0005548788.localdomain sudo[268908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:28 np0005548788.localdomain python3.9[268910]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:56:28 np0005548788.localdomain sudo[268908]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:28 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:56:28.830 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:28 np0005548788.localdomain sudo[269018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpegryyixgjdsbmlbqmxzkzyxbujnkun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014988.482671-1274-269583166713508/AnsiballZ_podman_container_info.py
Dec 06 09:56:28 np0005548788.localdomain sudo[269018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:29 np0005548788.localdomain python3.9[269020]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:56:29 np0005548788.localdomain sudo[269018]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:56:30 np0005548788.localdomain podman[269065]: 2025-12-06 09:56:30.262982933 +0000 UTC m=+0.090517991 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:56:30 np0005548788.localdomain podman[269065]: 2025-12-06 09:56:30.334664101 +0000 UTC m=+0.162199129 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 09:56:30 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:56:30 np0005548788.localdomain sshd[269091]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:56:31 np0005548788.localdomain sshd[269091]: Received disconnect from 148.227.3.232 port 37652:11: Bye Bye [preauth]
Dec 06 09:56:31 np0005548788.localdomain sshd[269091]: Disconnected from authenticating user root 148.227.3.232 port 37652 [preauth]
Dec 06 09:56:33 np0005548788.localdomain sudo[269183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smmzvepcpqsqbepkrgawzkmgutduvyig ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014993.068438-1313-74710862413871/AnsiballZ_edpm_container_manage.py
Dec 06 09:56:33 np0005548788.localdomain sudo[269183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:33 np0005548788.localdomain python3[269185]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:56:34 np0005548788.localdomain python3[269185]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7",
                                                                    "Digest": "sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:11:02.031267563Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 249482216,
                                                                    "VirtualSize": 249482216,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:8c448567789503f6c5be645a12473dfc27734872532d528b6ee764c214f9f2f3"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:24.212273596Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:11:01.523582443Z",
                                                                              "created_by": "/bin/sh -c dnf -y install device-mapper-multipath iscsi-initiator-utils && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:11:03.162365736Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 06 09:56:34 np0005548788.localdomain sudo[269183]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:34 np0005548788.localdomain sudo[269354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igiaslmdqspoyqyelflkktzegqscegpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014994.4370217-1337-60224160541944/AnsiballZ_stat.py
Dec 06 09:56:34 np0005548788.localdomain sudo[269354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:34 np0005548788.localdomain python3.9[269356]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63171 DF PROTO=TCP SPT=45010 DPT=9102 SEQ=2168535139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F55FF00000000001030307) 
Dec 06 09:56:34 np0005548788.localdomain sudo[269354]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:35 np0005548788.localdomain sudo[269466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpboutkbdpiucxsympwpyvvdnwvhhkmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014995.570159-1364-180884391069466/AnsiballZ_file.py
Dec 06 09:56:35 np0005548788.localdomain sudo[269466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:36 np0005548788.localdomain python3.9[269468]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:36 np0005548788.localdomain sudo[269466]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:36 np0005548788.localdomain sudo[269521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmmemysmrphxddxfwgqqhrokkkodudch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014995.570159-1364-180884391069466/AnsiballZ_stat.py
Dec 06 09:56:36 np0005548788.localdomain sudo[269521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:36 np0005548788.localdomain python3.9[269523]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:36 np0005548788.localdomain sudo[269521]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:56:37 np0005548788.localdomain sudo[269636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-libppmljapyuelarobtftobclgmrxlgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014996.6501687-1364-109438571519466/AnsiballZ_copy.py
Dec 06 09:56:37 np0005548788.localdomain sudo[269636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:37 np0005548788.localdomain podman[269625]: 2025-12-06 09:56:37.27590315 +0000 UTC m=+0.095008442 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Dec 06 09:56:37 np0005548788.localdomain podman[269625]: 2025-12-06 09:56:37.287522185 +0000 UTC m=+0.106627457 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 09:56:37 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:56:37 np0005548788.localdomain python3.9[269643]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014996.6501687-1364-109438571519466/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:37 np0005548788.localdomain sudo[269636]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:37 np0005548788.localdomain sudo[269704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-begvgbgbuzkpaqpqrfzktjzipmgonvht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014996.6501687-1364-109438571519466/AnsiballZ_systemd.py
Dec 06 09:56:37 np0005548788.localdomain sudo[269704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:38 np0005548788.localdomain sudo[269707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:56:38 np0005548788.localdomain sudo[269707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:56:38 np0005548788.localdomain sudo[269707]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:38 np0005548788.localdomain python3.9[269706]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:56:38 np0005548788.localdomain sudo[269725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:56:38 np0005548788.localdomain sudo[269725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:56:38 np0005548788.localdomain sudo[269725]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:56:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:56:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:56:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:56:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:56:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:56:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:56:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:56:39 np0005548788.localdomain sudo[269704]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:39 np0005548788.localdomain sudo[269804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:56:39 np0005548788.localdomain sudo[269804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:56:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:56:39 np0005548788.localdomain sudo[269804]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:39 np0005548788.localdomain podman[269841]: 2025-12-06 09:56:39.540622802 +0000 UTC m=+0.067814199 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:56:39 np0005548788.localdomain podman[269841]: 2025-12-06 09:56:39.550029107 +0000 UTC m=+0.077220544 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:56:39 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:56:39 np0005548788.localdomain python3.9[269923]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:40 np0005548788.localdomain sudo[270031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flwhzivuoggvnswvisbphsptyexcaajq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015000.237417-1466-50459714337961/AnsiballZ_file.py
Dec 06 09:56:40 np0005548788.localdomain sudo[270031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:40 np0005548788.localdomain python3.9[270033]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:40 np0005548788.localdomain sudo[270031]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:41 np0005548788.localdomain sudo[270141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnshclcnzdjxgyxehkncqpyrcssjzcbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015001.4553537-1502-226800021560692/AnsiballZ_file.py
Dec 06 09:56:41 np0005548788.localdomain sudo[270141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:56:41 np0005548788.localdomain systemd[1]: tmp-crun.pplmqk.mount: Deactivated successfully.
Dec 06 09:56:41 np0005548788.localdomain podman[270144]: 2025-12-06 09:56:41.864390496 +0000 UTC m=+0.109963851 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, release=1755695350, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Dec 06 09:56:41 np0005548788.localdomain podman[270144]: 2025-12-06 09:56:41.883582018 +0000 UTC m=+0.129155393 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64)
Dec 06 09:56:41 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:56:41 np0005548788.localdomain python3.9[270143]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:56:41 np0005548788.localdomain sudo[270141]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:42 np0005548788.localdomain sudo[270269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjlajocighsloylhaqremqhrocftlxly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015002.172168-1526-128008217425263/AnsiballZ_modprobe.py
Dec 06 09:56:42 np0005548788.localdomain sudo[270269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:42 np0005548788.localdomain python3.9[270271]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 06 09:56:42 np0005548788.localdomain sudo[270269]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:43 np0005548788.localdomain sudo[270379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwabpziqgcmmyvxcuefhgkpzazwtuoys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015003.0466828-1550-37321913242785/AnsiballZ_stat.py
Dec 06 09:56:43 np0005548788.localdomain sudo[270379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:43 np0005548788.localdomain python3.9[270381]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:43 np0005548788.localdomain sudo[270379]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:43 np0005548788.localdomain sudo[270436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttjmnmujxwzjmfpdcigqbvuqdylqxjey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015003.0466828-1550-37321913242785/AnsiballZ_file.py
Dec 06 09:56:43 np0005548788.localdomain sudo[270436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:44 np0005548788.localdomain python3.9[270438]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:44 np0005548788.localdomain sudo[270436]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:44 np0005548788.localdomain sudo[270546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exdizomnxwvqppkryednufyxjaqfhuof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015004.325838-1589-142328879976372/AnsiballZ_lineinfile.py
Dec 06 09:56:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:56:44 np0005548788.localdomain sudo[270546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:44 np0005548788.localdomain systemd[1]: tmp-crun.tCRVsm.mount: Deactivated successfully.
Dec 06 09:56:44 np0005548788.localdomain podman[270548]: 2025-12-06 09:56:44.757940836 +0000 UTC m=+0.102843948 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 09:56:44 np0005548788.localdomain podman[270548]: 2025-12-06 09:56:44.770580993 +0000 UTC m=+0.115484085 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:56:44 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:56:44 np0005548788.localdomain python3.9[270549]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:44 np0005548788.localdomain sudo[270546]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:45 np0005548788.localdomain sudo[270674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvkliulqvkmiwqgeigzhwedvdyuaahiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015005.2259176-1616-175595122948905/AnsiballZ_dnf.py
Dec 06 09:56:45 np0005548788.localdomain sudo[270674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:45 np0005548788.localdomain python3.9[270676]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:56:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:56:47.418 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:56:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:56:47.419 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:56:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:56:47.419 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:56:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:56:48 np0005548788.localdomain podman[270679]: 2025-12-06 09:56:48.27347907 +0000 UTC m=+0.098035057 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:56:48 np0005548788.localdomain podman[270679]: 2025-12-06 09:56:48.283756982 +0000 UTC m=+0.108312979 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:56:48 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:56:49 np0005548788.localdomain sudo[270674]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57343 DF PROTO=TCP SPT=56032 DPT=9102 SEQ=1764489722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F598650000000001030307) 
Dec 06 09:56:49 np0005548788.localdomain podman[240078]: time="2025-12-06T09:56:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:56:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:56:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:56:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:56:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16746 "" "Go-http-client/1.1"
Dec 06 09:56:50 np0005548788.localdomain python3.9[270810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:56:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57344 DF PROTO=TCP SPT=56032 DPT=9102 SEQ=1764489722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F59C700000000001030307) 
Dec 06 09:56:51 np0005548788.localdomain sudo[270922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtdxakjqyqgaksesjabjlvxlnedgidhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015010.7620203-1668-71680278483515/AnsiballZ_file.py
Dec 06 09:56:51 np0005548788.localdomain sudo[270922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:51 np0005548788.localdomain python3.9[270924]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:51 np0005548788.localdomain sudo[270922]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:51 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63172 DF PROTO=TCP SPT=45010 DPT=9102 SEQ=2168535139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F59FF10000000001030307) 
Dec 06 09:56:52 np0005548788.localdomain sudo[271032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdiptibdlcspnmyjzofeogomktpyeyir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015011.8171952-1701-22085337610037/AnsiballZ_systemd_service.py
Dec 06 09:56:52 np0005548788.localdomain sudo[271032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:52 np0005548788.localdomain python3.9[271034]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:56:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57345 DF PROTO=TCP SPT=56032 DPT=9102 SEQ=1764489722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F5A4700000000001030307) 
Dec 06 09:56:52 np0005548788.localdomain systemd-rc-local-generator[271055]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:52 np0005548788.localdomain systemd-sysv-generator[271061]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548788.localdomain sudo[271032]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46366 DF PROTO=TCP SPT=54502 DPT=9102 SEQ=185377270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F5A7F00000000001030307) 
Dec 06 09:56:53 np0005548788.localdomain python3.9[271178]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:56:53 np0005548788.localdomain network[271195]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:56:53 np0005548788.localdomain network[271196]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:56:53 np0005548788.localdomain network[271197]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:56:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:56:53 np0005548788.localdomain podman[271203]: 2025-12-06 09:56:53.76580133 +0000 UTC m=+0.103689474 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 09:56:53 np0005548788.localdomain podman[271203]: 2025-12-06 09:56:53.777541109 +0000 UTC m=+0.115429273 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:56:54 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:56:56 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:56:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57346 DF PROTO=TCP SPT=56032 DPT=9102 SEQ=1764489722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F5B4300000000001030307) 
Dec 06 09:56:59 np0005548788.localdomain sudo[271448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trdbhkymkveofmcsblojeacexmznrnmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015019.19282-1758-163574806121242/AnsiballZ_systemd_service.py
Dec 06 09:56:59 np0005548788.localdomain sudo[271448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:59 np0005548788.localdomain python3.9[271450]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:56:59 np0005548788.localdomain sudo[271448]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:00.201 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:00.203 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:57:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:00.203 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:57:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:00.242 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:57:00 np0005548788.localdomain sudo[271559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsmjhzmlwlmqofvkjphcxwrbetgkqimv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015020.1226392-1758-144860362551755/AnsiballZ_systemd_service.py
Dec 06 09:57:00 np0005548788.localdomain sudo[271559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:57:00 np0005548788.localdomain systemd[1]: tmp-crun.GgEk4B.mount: Deactivated successfully.
Dec 06 09:57:00 np0005548788.localdomain podman[271562]: 2025-12-06 09:57:00.547592306 +0000 UTC m=+0.083325614 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 09:57:00 np0005548788.localdomain podman[271562]: 2025-12-06 09:57:00.598506455 +0000 UTC m=+0.134239813 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 06 09:57:00 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:57:00 np0005548788.localdomain python3.9[271561]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:00 np0005548788.localdomain sudo[271559]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:01 np0005548788.localdomain sudo[271695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhiopkiojvuwfathxtjrnaevupjohsav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015020.9283493-1758-236982971027016/AnsiballZ_systemd_service.py
Dec 06 09:57:01 np0005548788.localdomain sudo[271695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:01 np0005548788.localdomain python3.9[271697]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:01 np0005548788.localdomain sudo[271695]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:01 np0005548788.localdomain sudo[271806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgvhxriahbubqtlkbhclvqaqfuqhhrar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015021.71673-1758-149730204207775/AnsiballZ_systemd_service.py
Dec 06 09:57:01 np0005548788.localdomain sudo[271806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:02 np0005548788.localdomain python3.9[271808]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:02 np0005548788.localdomain sudo[271806]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:02 np0005548788.localdomain sudo[271917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awmaoxieloqhyhudsqrssqioqpskdkux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015022.4793844-1758-132384879850453/AnsiballZ_systemd_service.py
Dec 06 09:57:02 np0005548788.localdomain sudo[271917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:03 np0005548788.localdomain python3.9[271919]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:03 np0005548788.localdomain sudo[271917]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:03 np0005548788.localdomain sudo[272028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsxcsrxrfyeyaqvguawrkszhzvlvpaof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015023.2987754-1758-220480075729065/AnsiballZ_systemd_service.py
Dec 06 09:57:03 np0005548788.localdomain sudo[272028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:03 np0005548788.localdomain python3.9[272030]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:04 np0005548788.localdomain sudo[272028]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:04.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:04.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:04.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:04.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:57:04 np0005548788.localdomain sudo[272139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwdvpkmczbneliotxlgtakcrkakfooaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015024.1617677-1758-193978515114370/AnsiballZ_systemd_service.py
Dec 06 09:57:04 np0005548788.localdomain sudo[272139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57347 DF PROTO=TCP SPT=56032 DPT=9102 SEQ=1764489722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F5D3F00000000001030307) 
Dec 06 09:57:04 np0005548788.localdomain python3.9[272141]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:04 np0005548788.localdomain sudo[272139]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.208 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.208 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.209 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.209 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.210 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:57:05 np0005548788.localdomain sudo[272251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyzdyhzrddsjfuhrlejwrtuzcwkrbnqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015024.957006-1758-240992071225598/AnsiballZ_systemd_service.py
Dec 06 09:57:05 np0005548788.localdomain sudo[272251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:05 np0005548788.localdomain python3.9[272253]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:05 np0005548788.localdomain sudo[272251]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.686 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.902 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.904 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12849MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.904 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.905 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.989 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:57:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:05.990 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:57:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:06.007 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:57:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:06.483 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:57:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:06.490 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:57:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:06.513 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:57:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:06.515 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:57:06 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:06.515 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:57:07 np0005548788.localdomain sudo[272405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogdehrsxxvgpdrhzmsgjxuwsylczqdgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015026.9099276-1935-158128308183689/AnsiballZ_file.py
Dec 06 09:57:07 np0005548788.localdomain sudo[272405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:07 np0005548788.localdomain python3.9[272407]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:07 np0005548788.localdomain sudo[272405]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:57:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:07 np0005548788.localdomain sudo[272515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghqabmqftjurawnmnvcupojcivnfvvot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015027.5591528-1935-10234620923588/AnsiballZ_file.py
Dec 06 09:57:07 np0005548788.localdomain sudo[272515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:57:07 np0005548788.localdomain podman[272518]: 2025-12-06 09:57:07.97889262 +0000 UTC m=+0.100149193 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 06 09:57:07 np0005548788.localdomain podman[272518]: 2025-12-06 09:57:07.989978128 +0000 UTC m=+0.111234731 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:57:08 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:57:08 np0005548788.localdomain python3.9[272517]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:08 np0005548788.localdomain sudo[272515]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:08 np0005548788.localdomain sudo[272645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-outhrhaharyisflldfgxxuymepbodoeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015028.2162817-1935-236414576553804/AnsiballZ_file.py
Dec 06 09:57:08 np0005548788.localdomain sudo[272645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:08.510 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:08 np0005548788.localdomain python3.9[272647]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:08 np0005548788.localdomain sudo[272645]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:57:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:57:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:57:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:57:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:57:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:57:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:57:09 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:09.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:09 np0005548788.localdomain sudo[272755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrsoikijesklddnllwrusyzarmjakqgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015028.8980508-1935-202063924823840/AnsiballZ_file.py
Dec 06 09:57:09 np0005548788.localdomain sudo[272755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:09 np0005548788.localdomain python3.9[272757]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:09 np0005548788.localdomain sudo[272755]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:09 np0005548788.localdomain sudo[272865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzncuelyhphdhlztruadykldlysufxyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015029.6872373-1935-25083520793951/AnsiballZ_file.py
Dec 06 09:57:09 np0005548788.localdomain sudo[272865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:57:10 np0005548788.localdomain systemd[1]: tmp-crun.CU36BI.mount: Deactivated successfully.
Dec 06 09:57:10 np0005548788.localdomain podman[272868]: 2025-12-06 09:57:10.086131371 +0000 UTC m=+0.082893611 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:57:10 np0005548788.localdomain podman[272868]: 2025-12-06 09:57:10.11861434 +0000 UTC m=+0.115376560 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:57:10 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:57:10 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:10.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:10 np0005548788.localdomain python3.9[272867]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:10 np0005548788.localdomain sudo[272865]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:10 np0005548788.localdomain sudo[272998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsqkkicwztvhtbwugeqbrpkzfwlpfyqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015030.339625-1935-281428305077815/AnsiballZ_file.py
Dec 06 09:57:10 np0005548788.localdomain sudo[272998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:10 np0005548788.localdomain python3.9[273000]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:10 np0005548788.localdomain sudo[272998]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:11 np0005548788.localdomain sudo[273108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfzvnwteizktqcasnpjoayygdacmpjfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015030.9968574-1935-173389891973625/AnsiballZ_file.py
Dec 06 09:57:11 np0005548788.localdomain sudo[273108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:11 np0005548788.localdomain python3.9[273110]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:11 np0005548788.localdomain sudo[273108]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:11 np0005548788.localdomain sudo[273218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwdpgphxgtabmlwkohketgkzhjxazvrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015031.6850674-1935-39904004456706/AnsiballZ_file.py
Dec 06 09:57:11 np0005548788.localdomain sudo[273218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:57:12 np0005548788.localdomain podman[273221]: 2025-12-06 09:57:12.106838157 +0000 UTC m=+0.097985735 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Dec 06 09:57:12 np0005548788.localdomain podman[273221]: 2025-12-06 09:57:12.125794721 +0000 UTC m=+0.116942349 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, 
container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 09:57:12 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:57:12 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:57:12.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:12 np0005548788.localdomain python3.9[273220]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:12 np0005548788.localdomain sudo[273218]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:12 np0005548788.localdomain sudo[273346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udmhpvzubriuneoqbivrelzdtyauivba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015032.4404185-2106-60013759389334/AnsiballZ_file.py
Dec 06 09:57:12 np0005548788.localdomain sudo[273346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:12 np0005548788.localdomain sshd[273349]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:12 np0005548788.localdomain python3.9[273348]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:13 np0005548788.localdomain sudo[273346]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:13 np0005548788.localdomain sudo[273458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qojdumgeatnymjvicxwvcoxjrvsourhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015033.1371083-2106-274804366390893/AnsiballZ_file.py
Dec 06 09:57:13 np0005548788.localdomain sudo[273458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:13 np0005548788.localdomain python3.9[273460]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:13 np0005548788.localdomain sudo[273458]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:14 np0005548788.localdomain sudo[273568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljvoaapwpehzvwvgruawkzwaofnkxpry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015033.7698493-2106-152207494386963/AnsiballZ_file.py
Dec 06 09:57:14 np0005548788.localdomain sudo[273568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:14 np0005548788.localdomain python3.9[273570]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:14 np0005548788.localdomain sudo[273568]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:14 np0005548788.localdomain sshd[273349]: Received disconnect from 43.163.93.82 port 59878:11:  [preauth]
Dec 06 09:57:14 np0005548788.localdomain sshd[273349]: Disconnected from authenticating user root 43.163.93.82 port 59878 [preauth]
Dec 06 09:57:14 np0005548788.localdomain sudo[273678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdgrwtbroubkrazpvvyvekxtrcppmaba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015034.6357355-2106-5081446705451/AnsiballZ_file.py
Dec 06 09:57:14 np0005548788.localdomain sudo[273678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:57:15 np0005548788.localdomain podman[273681]: 2025-12-06 09:57:15.024874635 +0000 UTC m=+0.091861933 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:57:15 np0005548788.localdomain podman[273681]: 2025-12-06 09:57:15.038891485 +0000 UTC m=+0.105878823 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 06 09:57:15 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:57:15 np0005548788.localdomain python3.9[273680]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:15 np0005548788.localdomain sudo[273678]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:15 np0005548788.localdomain sudo[273806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgpyuxxhotzgsybrnipmwkpekfpmzpme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015035.2684371-2106-28332326307911/AnsiballZ_file.py
Dec 06 09:57:15 np0005548788.localdomain sudo[273806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:15 np0005548788.localdomain python3.9[273808]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:15 np0005548788.localdomain sudo[273806]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:16 np0005548788.localdomain sudo[273916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjlmqgrwvjybsnxnajvyclivdbjpmqdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015035.9272346-2106-73504541266900/AnsiballZ_file.py
Dec 06 09:57:16 np0005548788.localdomain sudo[273916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:16 np0005548788.localdomain python3.9[273918]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:16 np0005548788.localdomain sudo[273916]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:16 np0005548788.localdomain sudo[274026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyjpvrygvbyemaqgbdhgkmyrwgbsjcpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015036.6645503-2106-186140587436768/AnsiballZ_file.py
Dec 06 09:57:16 np0005548788.localdomain sudo[274026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:17 np0005548788.localdomain python3.9[274028]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:17 np0005548788.localdomain sudo[274026]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:17 np0005548788.localdomain sudo[274136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayerldtkqrivqwttynhdobjjxihpzivi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015037.3069916-2106-250425641043807/AnsiballZ_file.py
Dec 06 09:57:17 np0005548788.localdomain sshd[274138]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:17 np0005548788.localdomain sudo[274136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:17 np0005548788.localdomain python3.9[274140]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:17 np0005548788.localdomain sudo[274136]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:18 np0005548788.localdomain sudo[274248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-undxbcpljjsmboqtuixxovenvauorbtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015038.2772615-2280-209087694629940/AnsiballZ_command.py
Dec 06 09:57:18 np0005548788.localdomain sudo[274248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:57:18 np0005548788.localdomain systemd[1]: tmp-crun.ywCo2w.mount: Deactivated successfully.
Dec 06 09:57:18 np0005548788.localdomain podman[274251]: 2025-12-06 09:57:18.727693854 +0000 UTC m=+0.104630733 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:57:18 np0005548788.localdomain podman[274251]: 2025-12-06 09:57:18.765721248 +0000 UTC m=+0.142658077 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:57:18 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:57:18 np0005548788.localdomain python3.9[274250]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:18 np0005548788.localdomain sudo[274248]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48185 DF PROTO=TCP SPT=49086 DPT=9102 SEQ=3466498631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F60D950000000001030307) 
Dec 06 09:57:19 np0005548788.localdomain python3.9[274383]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:57:19 np0005548788.localdomain podman[240078]: time="2025-12-06T09:57:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:57:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:57:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:57:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:57:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16749 "" "Go-http-client/1.1"
Dec 06 09:57:20 np0005548788.localdomain sudo[274491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijwlrisnijjlalecfckzgplqndtaulqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015039.975363-2334-95795748981213/AnsiballZ_systemd_service.py
Dec 06 09:57:20 np0005548788.localdomain sudo[274491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48186 DF PROTO=TCP SPT=49086 DPT=9102 SEQ=3466498631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F611B00000000001030307) 
Dec 06 09:57:20 np0005548788.localdomain python3.9[274493]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 09:57:20 np0005548788.localdomain systemd-sysv-generator[274524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:57:20 np0005548788.localdomain systemd-rc-local-generator[274521]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:21 np0005548788.localdomain sudo[274491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:21 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57348 DF PROTO=TCP SPT=56032 DPT=9102 SEQ=1764489722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F613F00000000001030307) 
Dec 06 09:57:21 np0005548788.localdomain sudo[274637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpmhyhzwnmohsusdainmfxstwillgjuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015041.2602816-2358-83413393355033/AnsiballZ_command.py
Dec 06 09:57:21 np0005548788.localdomain sudo[274637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:21 np0005548788.localdomain python3.9[274639]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:21 np0005548788.localdomain sudo[274637]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:22 np0005548788.localdomain sudo[274748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbsemzpdvnfxzidwhkjplpowzgslxgyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015041.9208434-2358-231587869492388/AnsiballZ_command.py
Dec 06 09:57:22 np0005548788.localdomain sudo[274748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:22 np0005548788.localdomain python3.9[274750]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:22 np0005548788.localdomain sudo[274748]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48187 DF PROTO=TCP SPT=49086 DPT=9102 SEQ=3466498631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F619B00000000001030307) 
Dec 06 09:57:22 np0005548788.localdomain sudo[274859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whrstpshqkwdaahjraiyqqywomiblopk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015042.635376-2358-213753119413798/AnsiballZ_command.py
Dec 06 09:57:22 np0005548788.localdomain sudo[274859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:23 np0005548788.localdomain python3.9[274861]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:23 np0005548788.localdomain sudo[274859]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:23 np0005548788.localdomain sudo[274970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urbiawkfkwrduscppxnonbvrfylphdbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015043.2749949-2358-191385144849413/AnsiballZ_command.py
Dec 06 09:57:23 np0005548788.localdomain sudo[274970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63173 DF PROTO=TCP SPT=45010 DPT=9102 SEQ=2168535139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F61DF00000000001030307) 
Dec 06 09:57:23 np0005548788.localdomain python3.9[274972]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:23 np0005548788.localdomain sudo[274970]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:24 np0005548788.localdomain sudo[275081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnzhbksdubxdnozxhdnekvsknxdxobkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015043.9220634-2358-76057991343004/AnsiballZ_command.py
Dec 06 09:57:24 np0005548788.localdomain sudo[275081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:24 np0005548788.localdomain python3.9[275083]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:57:24 np0005548788.localdomain sudo[275081]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:24 np0005548788.localdomain systemd[1]: tmp-crun.jJQQOf.mount: Deactivated successfully.
Dec 06 09:57:24 np0005548788.localdomain podman[275085]: 2025-12-06 09:57:24.541419198 +0000 UTC m=+0.085190753 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:57:24 np0005548788.localdomain podman[275085]: 2025-12-06 09:57:24.54945175 +0000 UTC m=+0.093223325 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:57:24 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:57:24 np0005548788.localdomain sshd[274138]: Connection closed by 45.78.219.195 port 33992 [preauth]
Dec 06 09:57:24 np0005548788.localdomain sudo[275210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjinchnfnyscquwnniutrrfwzjnzsvct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015044.589087-2358-217915686435908/AnsiballZ_command.py
Dec 06 09:57:24 np0005548788.localdomain sudo[275210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:25 np0005548788.localdomain python3.9[275212]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:26 np0005548788.localdomain sudo[275210]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:26 np0005548788.localdomain sudo[275321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjlxchcywxgduiynmmpmaazqulraxpdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015046.275005-2358-7938768106283/AnsiballZ_command.py
Dec 06 09:57:26 np0005548788.localdomain sudo[275321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48188 DF PROTO=TCP SPT=49086 DPT=9102 SEQ=3466498631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F629700000000001030307) 
Dec 06 09:57:26 np0005548788.localdomain python3.9[275323]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:26 np0005548788.localdomain sudo[275321]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:27 np0005548788.localdomain sudo[275432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-coyzujrtjiqbtfkrnroscjbxzqtdsekp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015046.9379735-2358-88134565697511/AnsiballZ_command.py
Dec 06 09:57:27 np0005548788.localdomain sudo[275432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:27 np0005548788.localdomain python3.9[275434]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:28 np0005548788.localdomain sudo[275432]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:30 np0005548788.localdomain sudo[275543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xofzjenkzykunpfoudsoffcfmhwxzcnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015049.8931952-2565-135951230548520/AnsiballZ_file.py
Dec 06 09:57:30 np0005548788.localdomain sudo[275543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:30 np0005548788.localdomain python3.9[275545]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:30 np0005548788.localdomain sudo[275543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:30 np0005548788.localdomain sudo[275653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sokmeopmdbybgmtcignvrvzyqmxcgdyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015050.6062663-2565-48658246257808/AnsiballZ_file.py
Dec 06 09:57:30 np0005548788.localdomain sudo[275653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:57:31 np0005548788.localdomain podman[275656]: 2025-12-06 09:57:31.040485065 +0000 UTC m=+0.091963487 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 09:57:31 np0005548788.localdomain python3.9[275655]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:31 np0005548788.localdomain podman[275656]: 2025-12-06 09:57:31.139680457 +0000 UTC m=+0.191158879 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:57:31 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:57:31 np0005548788.localdomain sudo[275653]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:31 np0005548788.localdomain sudo[275789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdzutgabhdsvowcpogribagejtysptrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015051.2894275-2565-3354087543183/AnsiballZ_file.py
Dec 06 09:57:31 np0005548788.localdomain sudo[275789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:31 np0005548788.localdomain python3.9[275791]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:31 np0005548788.localdomain sudo[275789]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:32 np0005548788.localdomain sudo[275899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfimmdcwlktinewftlydcwhsjaxjxoys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015052.101296-2631-45806861270764/AnsiballZ_file.py
Dec 06 09:57:32 np0005548788.localdomain sudo[275899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:32 np0005548788.localdomain python3.9[275901]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:32 np0005548788.localdomain sudo[275899]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:33 np0005548788.localdomain sudo[276009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aelutkpbhfgawigzthqrfhxylhsnnlrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015052.8072093-2631-205642553780834/AnsiballZ_file.py
Dec 06 09:57:33 np0005548788.localdomain sudo[276009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:33 np0005548788.localdomain python3.9[276011]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:33 np0005548788.localdomain sudo[276009]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:33 np0005548788.localdomain sudo[276119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqxhvxuiykxmiylszjenexwigspynvac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015053.548174-2631-49155994141898/AnsiballZ_file.py
Dec 06 09:57:33 np0005548788.localdomain sudo[276119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:34 np0005548788.localdomain python3.9[276121]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:34 np0005548788.localdomain sudo[276119]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:34 np0005548788.localdomain sudo[276229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iujlrxqeisruvlmgpjuptochdbwlfnws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015054.2112205-2631-63913124323193/AnsiballZ_file.py
Dec 06 09:57:34 np0005548788.localdomain sudo[276229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:34 np0005548788.localdomain python3.9[276231]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:34 np0005548788.localdomain sudo[276229]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48189 DF PROTO=TCP SPT=49086 DPT=9102 SEQ=3466498631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F649F00000000001030307) 
Dec 06 09:57:35 np0005548788.localdomain sudo[276339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdpddgrtjvzjlrmakwbxmuymdonoxmak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015054.8057384-2631-86369934451639/AnsiballZ_file.py
Dec 06 09:57:35 np0005548788.localdomain sudo[276339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:35 np0005548788.localdomain python3.9[276341]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:35 np0005548788.localdomain sudo[276339]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:35 np0005548788.localdomain sudo[276449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jojhvbsaoumnixtpdtanxadyplngbgoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015055.5311055-2631-63660713896560/AnsiballZ_file.py
Dec 06 09:57:35 np0005548788.localdomain sudo[276449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:36 np0005548788.localdomain python3.9[276451]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:36 np0005548788.localdomain sudo[276449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:36 np0005548788.localdomain sudo[276559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riszeeimhgczuesvjlahehdetofxuvay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015056.174828-2631-42036802231130/AnsiballZ_file.py
Dec 06 09:57:36 np0005548788.localdomain sudo[276559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:36 np0005548788.localdomain python3.9[276561]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:36 np0005548788.localdomain sudo[276559]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:57:38 np0005548788.localdomain podman[276579]: 2025-12-06 09:57:38.263232334 +0000 UTC m=+0.087098044 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:57:38 np0005548788.localdomain podman[276579]: 2025-12-06 09:57:38.301147104 +0000 UTC m=+0.125012854 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:57:38 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:57:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:57:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:57:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:57:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:57:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:57:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:57:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:57:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:57:39 np0005548788.localdomain sudo[276599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:57:39 np0005548788.localdomain sudo[276599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:57:39 np0005548788.localdomain sudo[276599]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:39 np0005548788.localdomain sudo[276617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:57:39 np0005548788.localdomain sudo[276617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:57:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:57:40 np0005548788.localdomain podman[276649]: 2025-12-06 09:57:40.25683233 +0000 UTC m=+0.083492471 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:57:40 np0005548788.localdomain podman[276649]: 2025-12-06 09:57:40.268562757 +0000 UTC m=+0.095222948 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:57:40 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:57:40 np0005548788.localdomain sudo[276617]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:41 np0005548788.localdomain sudo[276691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:57:41 np0005548788.localdomain sudo[276691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:57:41 np0005548788.localdomain sudo[276691]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:57:42 np0005548788.localdomain podman[276763]: 2025-12-06 09:57:42.269357629 +0000 UTC m=+0.096109347 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Dec 06 09:57:42 np0005548788.localdomain podman[276763]: 2025-12-06 09:57:42.284725211 +0000 UTC m=+0.111476959 container exec_died b830c0159c7f576c0a1ac96aaa6236b5913d6ce13dc706a5741c7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 09:57:42 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:57:42 np0005548788.localdomain sudo[276818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnrkcfvzgiiioqycqbnswyuvfmgjrlww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015061.9027658-2956-172249996597/AnsiballZ_getent.py
Dec 06 09:57:42 np0005548788.localdomain sudo[276818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:42 np0005548788.localdomain python3.9[276820]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 06 09:57:42 np0005548788.localdomain sudo[276818]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:42 np0005548788.localdomain sshd[276839]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:43 np0005548788.localdomain sshd[276840]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:43 np0005548788.localdomain sshd[276840]: Accepted publickey for zuul from 192.168.122.30 port 49066 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:57:43 np0005548788.localdomain systemd-logind[765]: New session 60 of user zuul.
Dec 06 09:57:43 np0005548788.localdomain systemd[1]: Started Session 60 of User zuul.
Dec 06 09:57:43 np0005548788.localdomain sshd[276840]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:57:44 np0005548788.localdomain sshd[276843]: Received disconnect from 192.168.122.30 port 49066:11: disconnected by user
Dec 06 09:57:44 np0005548788.localdomain sshd[276843]: Disconnected from user zuul 192.168.122.30 port 49066
Dec 06 09:57:44 np0005548788.localdomain sshd[276840]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:57:44 np0005548788.localdomain systemd-logind[765]: Session 60 logged out. Waiting for processes to exit.
Dec 06 09:57:44 np0005548788.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Dec 06 09:57:44 np0005548788.localdomain systemd-logind[765]: Removed session 60.
Dec 06 09:57:44 np0005548788.localdomain python3.9[276951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:57:45 np0005548788.localdomain podman[277039]: 2025-12-06 09:57:45.294337801 +0000 UTC m=+0.107383769 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 09:57:45 np0005548788.localdomain podman[277039]: 2025-12-06 09:57:45.310676724 +0000 UTC m=+0.123722752 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:57:45 np0005548788.localdomain python3.9[277038]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015064.2932281-3037-141155188047974/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:45 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:57:45 np0005548788.localdomain python3.9[277165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:46 np0005548788.localdomain python3.9[277220]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:47 np0005548788.localdomain python3.9[277328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:57:47.419 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:57:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:57:47.420 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:57:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:57:47.420 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:57:47 np0005548788.localdomain python3.9[277414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015066.61657-3037-86606570187662/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:48 np0005548788.localdomain python3.9[277522]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:48 np0005548788.localdomain python3.9[277608]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015067.7549195-3037-40072015443224/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=975829b10d65b228c43d1745a85328aeeb7df1e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:57:49 np0005548788.localdomain podman[277711]: 2025-12-06 09:57:49.253286277 +0000 UTC m=+0.076907625 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:57:49 np0005548788.localdomain podman[277711]: 2025-12-06 09:57:49.290298778 +0000 UTC m=+0.113920136 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:57:49 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:57:49 np0005548788.localdomain python3.9[277722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9535 DF PROTO=TCP SPT=52022 DPT=9102 SEQ=951275845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F682C40000000001030307) 
Dec 06 09:57:49 np0005548788.localdomain podman[240078]: time="2025-12-06T09:57:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:57:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:57:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:57:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:57:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16742 "" "Go-http-client/1.1"
Dec 06 09:57:49 np0005548788.localdomain sshd[276839]: Received disconnect from 45.78.194.186 port 53798:11: Bye Bye [preauth]
Dec 06 09:57:49 np0005548788.localdomain sshd[276839]: Disconnected from authenticating user root 45.78.194.186 port 53798 [preauth]
Dec 06 09:57:49 np0005548788.localdomain python3.9[277824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015068.9208572-3037-71301104876996/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9536 DF PROTO=TCP SPT=52022 DPT=9102 SEQ=951275845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F686B00000000001030307) 
Dec 06 09:57:50 np0005548788.localdomain python3.9[277932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:51 np0005548788.localdomain python3.9[278018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015070.0856855-3037-122879391125801/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:51 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48190 DF PROTO=TCP SPT=49086 DPT=9102 SEQ=3466498631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F689F10000000001030307) 
Dec 06 09:57:52 np0005548788.localdomain sudo[278126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syhhxqhrwyksxycgoyejzplzpsyifdfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015071.8734124-3286-196429899119194/AnsiballZ_file.py
Dec 06 09:57:52 np0005548788.localdomain sudo[278126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:52 np0005548788.localdomain python3.9[278128]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:52 np0005548788.localdomain sudo[278126]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9537 DF PROTO=TCP SPT=52022 DPT=9102 SEQ=951275845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F68EB00000000001030307) 
Dec 06 09:57:52 np0005548788.localdomain sudo[278236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmboucpvylhaqrsldsgpnrtazyqvkgnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015072.6470454-3310-61894702935567/AnsiballZ_copy.py
Dec 06 09:57:52 np0005548788.localdomain sudo[278236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:53 np0005548788.localdomain python3.9[278238]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:53 np0005548788.localdomain sudo[278236]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57349 DF PROTO=TCP SPT=56032 DPT=9102 SEQ=1764489722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F691F00000000001030307) 
Dec 06 09:57:53 np0005548788.localdomain sudo[278346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxodmvaphlcpymtbsvixyxfrtsajaufz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015073.4326086-3334-191619096287504/AnsiballZ_stat.py
Dec 06 09:57:53 np0005548788.localdomain sudo[278346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:53 np0005548788.localdomain python3.9[278348]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:53 np0005548788.localdomain sudo[278346]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:54 np0005548788.localdomain sudo[278458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxzzvbbefaclzhehtavloxhoevdbxveg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015074.249617-3361-155086517512843/AnsiballZ_file.py
Dec 06 09:57:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:57:54 np0005548788.localdomain sudo[278458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:54 np0005548788.localdomain systemd[1]: tmp-crun.X4WFxf.mount: Deactivated successfully.
Dec 06 09:57:54 np0005548788.localdomain podman[278460]: 2025-12-06 09:57:54.702892366 +0000 UTC m=+0.101843791 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:57:54 np0005548788.localdomain podman[278460]: 2025-12-06 09:57:54.712567167 +0000 UTC m=+0.111518642 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:57:54 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:57:54 np0005548788.localdomain python3.9[278461]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:54 np0005548788.localdomain sudo[278458]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:55 np0005548788.localdomain python3.9[278586]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:56 np0005548788.localdomain python3.9[278696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9538 DF PROTO=TCP SPT=52022 DPT=9102 SEQ=951275845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F69E710000000001030307) 
Dec 06 09:57:56 np0005548788.localdomain python3.9[278751]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:57 np0005548788.localdomain python3.9[278859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:58 np0005548788.localdomain python3.9[278914]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:58 np0005548788.localdomain sudo[279022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayrspubharycgkmhfatgvgijnhisxzso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015078.6766894-3490-119926703934356/AnsiballZ_container_config_data.py
Dec 06 09:57:58 np0005548788.localdomain sudo[279022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:59 np0005548788.localdomain python3.9[279024]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 06 09:57:59 np0005548788.localdomain sudo[279022]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:59 np0005548788.localdomain sudo[279132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaurnbpvszkfqtzaokmoyndoceeegvjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015079.520161-3517-60682472489264/AnsiballZ_container_config_hash.py
Dec 06 09:57:59 np0005548788.localdomain sudo[279132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:00 np0005548788.localdomain python3.9[279134]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:58:00 np0005548788.localdomain sudo[279132]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:00.183 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:00.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:58:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:00.183 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:58:00 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:00.204 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:58:00 np0005548788.localdomain sudo[279242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlmxphcxtjtujxqqyimfliblgfdksciz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765015080.49683-3547-180964714318265/AnsiballZ_edpm_container_manage.py
Dec 06 09:58:00 np0005548788.localdomain sudo[279242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:01 np0005548788.localdomain python3[279244]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:58:01 np0005548788.localdomain python3[279244]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:58:01 np0005548788.localdomain sudo[279242]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:01 np0005548788.localdomain sudo[279417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jslmkpuqvctarhmebebvzyxmgosnkkmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015081.7516181-3571-19648289847371/AnsiballZ_stat.py
Dec 06 09:58:02 np0005548788.localdomain sudo[279417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:58:02 np0005548788.localdomain systemd[1]: tmp-crun.ZWlhVP.mount: Deactivated successfully.
Dec 06 09:58:02 np0005548788.localdomain podman[279420]: 2025-12-06 09:58:02.119119504 +0000 UTC m=+0.097189955 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:58:02 np0005548788.localdomain podman[279420]: 2025-12-06 09:58:02.154631663 +0000 UTC m=+0.132702054 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:58:02 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:58:02 np0005548788.localdomain python3.9[279419]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:02 np0005548788.localdomain sudo[279417]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:03 np0005548788.localdomain sudo[279553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmmotsqvjuvoyiemgbbvqsfsmdvbbjmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015082.8809793-3607-142177062120328/AnsiballZ_container_config_data.py
Dec 06 09:58:03 np0005548788.localdomain sudo[279553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:03 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:03.196 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:03 np0005548788.localdomain python3.9[279555]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 06 09:58:03 np0005548788.localdomain sudo[279553]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:04 np0005548788.localdomain sudo[279663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gonnatjyfipwmzoorimgujsbyfqknhgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015083.8489366-3634-216497581987015/AnsiballZ_container_config_hash.py
Dec 06 09:58:04 np0005548788.localdomain sudo[279663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:04.181 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:04.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:04 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:04.182 229898 DEBUG nova.compute.manager [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:58:04 np0005548788.localdomain python3.9[279665]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:58:04 np0005548788.localdomain sudo[279663]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9539 DF PROTO=TCP SPT=52022 DPT=9102 SEQ=951275845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F6BDF00000000001030307) 
Dec 06 09:58:05 np0005548788.localdomain sudo[279773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agmywtaoqrhymzveqxgxrqjvjqgbenrq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765015084.8080106-3664-60822045145415/AnsiballZ_edpm_container_manage.py
Dec 06 09:58:05 np0005548788.localdomain sudo[279773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:05 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:05.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:05 np0005548788.localdomain python3[279775]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:58:05 np0005548788.localdomain python3[279775]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:58:05 np0005548788.localdomain sudo[279773]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:06 np0005548788.localdomain sudo[279944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmeijgzerbuemppwofmbsqqvprjdauuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015086.0176768-3688-245292824285843/AnsiballZ_stat.py
Dec 06 09:58:06 np0005548788.localdomain sudo[279944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:06 np0005548788.localdomain python3.9[279946]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:06 np0005548788.localdomain sudo[279944]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.208 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.209 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.209 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.210 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.210 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:07 np0005548788.localdomain sudo[280057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skgxsqyblxfoarpzlnxrnzisfugbypbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015086.9928267-3715-142068252578499/AnsiballZ_file.py
Dec 06 09:58:07 np0005548788.localdomain sudo[280057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:07 np0005548788.localdomain python3.9[280059]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:07 np0005548788.localdomain sudo[280057]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.669 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.895 229898 WARNING nova.virt.libvirt.driver [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.898 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12854MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.898 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.899 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.960 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.960 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:58:07 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:07.981 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:08 np0005548788.localdomain sudo[280187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qemtozcklfqzvdwxnqctkmifzfkfvxrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015087.574718-3715-24955865470498/AnsiballZ_copy.py
Dec 06 09:58:08 np0005548788.localdomain sudo[280187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:08 np0005548788.localdomain python3.9[280190]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765015087.574718-3715-24955865470498/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:08 np0005548788.localdomain sudo[280187]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:08.447 229898 DEBUG oslo_concurrency.processutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:08.454 229898 DEBUG nova.compute.provider_tree [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:58:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:08.472 229898 DEBUG nova.scheduler.client.report [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:58:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:08.475 229898 DEBUG nova.compute.resource_tracker [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:58:08 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:08.475 229898 DEBUG oslo_concurrency.lockutils [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:08 np0005548788.localdomain sudo[280264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-befgqekhbvscyuefjdicxnqlalphuygt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015087.574718-3715-24955865470498/AnsiballZ_systemd.py
Dec 06 09:58:08 np0005548788.localdomain sudo[280264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:58:08 np0005548788.localdomain podman[280267]: 2025-12-06 09:58:08.634251342 +0000 UTC m=+0.089756503 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:58:08 np0005548788.localdomain podman[280267]: 2025-12-06 09:58:08.648621601 +0000 UTC m=+0.104126812 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 09:58:08 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:58:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:58:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:58:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:58:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:58:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:58:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:58:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:58:08 np0005548788.localdomain python3.9[280266]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:58:09 np0005548788.localdomain sudo[280264]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:09 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:09.470 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:09 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:09.471 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:10 np0005548788.localdomain python3.9[280395]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:58:11 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:11.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:11 np0005548788.localdomain podman[280504]: 2025-12-06 09:58:11.259051603 +0000 UTC m=+0.082516147 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:58:11 np0005548788.localdomain python3.9[280503]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:11 np0005548788.localdomain podman[280504]: 2025-12-06 09:58:11.266240847 +0000 UTC m=+0.089705391 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:58:11 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:58:12 np0005548788.localdomain python3.9[280634]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:12 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:12.182 229898 DEBUG oslo_service.periodic_task [None req-d6512a20-2769-4845-9af3-eca0b57e22e1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:13 np0005548788.localdomain sudo[280742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agmooxwusmjtrneykhchmaauuqpldpex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015092.6299598-3883-208335156471596/AnsiballZ_podman_container.py
Dec 06 09:58:13 np0005548788.localdomain sudo[280742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:58:13 np0005548788.localdomain podman[280744]: 2025-12-06 09:58:13.289234761 +0000 UTC m=+0.133838709 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, vcs-type=git)
Dec 06 09:58:13 np0005548788.localdomain podman[280744]: 2025-12-06 09:58:13.310814894 +0000 UTC m=+0.155418812 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container)
Dec 06 09:58:13 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:58:13 np0005548788.localdomain python3.9[280745]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 09:58:13 np0005548788.localdomain sudo[280742]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:13 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 119.8 (399 of 333 items), suggesting rotation.
Dec 06 09:58:13 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:58:13 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:58:13 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:58:14 np0005548788.localdomain sudo[280898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdkzrrjokkodtoqloxcnnghklaijpapi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015093.8526845-3907-16473790427530/AnsiballZ_systemd.py
Dec 06 09:58:14 np0005548788.localdomain sudo[280898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:14 np0005548788.localdomain python3.9[280900]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:58:14 np0005548788.localdomain systemd[1]: Stopping nova_compute container...
Dec 06 09:58:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:58:16 np0005548788.localdomain podman[280918]: 2025-12-06 09:58:16.266869486 +0000 UTC m=+0.097096622 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:58:16 np0005548788.localdomain podman[280918]: 2025-12-06 09:58:16.309718184 +0000 UTC m=+0.139945320 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:58:16 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:58:18 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:18.846 229898 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Dec 06 09:58:18 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:18.848 229898 DEBUG oslo_concurrency.lockutils [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:58:18 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:18.848 229898 DEBUG oslo_concurrency.lockutils [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:58:18 np0005548788.localdomain nova_compute[229894]: 2025-12-06 09:58:18.849 229898 DEBUG oslo_concurrency.lockutils [None req-af36983f-eff3-4b94-9b6d-46902d7b9817 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:58:19 np0005548788.localdomain virtqemud[229107]: End of file while reading data: Input/output error
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: libpod-b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0.scope: Deactivated successfully.
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: libpod-b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0.scope: Consumed 18.896s CPU time.
Dec 06 09:58:19 np0005548788.localdomain podman[280904]: 2025-12-06 09:58:19.370830524 +0000 UTC m=+4.816050687 container died b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: tmp-crun.c1fcpQ.mount: Deactivated successfully.
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0-userdata-shm.mount: Deactivated successfully.
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4-merged.mount: Deactivated successfully.
Dec 06 09:58:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63769 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=2601230013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F6F7F50000000001030307) 
Dec 06 09:58:19 np0005548788.localdomain podman[280946]: 2025-12-06 09:58:19.541876105 +0000 UTC m=+0.150296663 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:58:19 np0005548788.localdomain podman[280904]: 2025-12-06 09:58:19.556490231 +0000 UTC m=+5.001710374 container cleanup b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 09:58:19 np0005548788.localdomain podman[280904]: nova_compute
Dec 06 09:58:19 np0005548788.localdomain podman[280946]: 2025-12-06 09:58:19.578655023 +0000 UTC m=+0.187075611 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:58:19 np0005548788.localdomain podman[240078]: time="2025-12-06T09:58:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:58:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:58:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146540 "" "Go-http-client/1.1"
Dec 06 09:58:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:58:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16625 "" "Go-http-client/1.1"
Dec 06 09:58:19 np0005548788.localdomain podman[280989]: error opening file `/run/crun/b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0/status`: No such file or directory
Dec 06 09:58:19 np0005548788.localdomain podman[280974]: 2025-12-06 09:58:19.718928012 +0000 UTC m=+0.128134452 container cleanup b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 09:58:19 np0005548788.localdomain podman[280974]: nova_compute
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: Stopped nova_compute container.
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: Starting nova_compute container...
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:58:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:19 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfecb75d87015eca2c329a0561137a162a86d9e385a301584c2efa6eceb9fbf4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:19 np0005548788.localdomain podman[280991]: 2025-12-06 09:58:19.874939931 +0000 UTC m=+0.118630434 container init b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=nova_compute)
Dec 06 09:58:19 np0005548788.localdomain podman[280991]: 2025-12-06 09:58:19.884853931 +0000 UTC m=+0.128544444 container start b6ba55e28bf1b499f8e0ba52b93ab7983300229d9eb6c6425f7db93175c3f7b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, config_id=edpm)
Dec 06 09:58:19 np0005548788.localdomain podman[280991]: nova_compute
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: + sudo -E kolla_set_configs
Dec 06 09:58:19 np0005548788.localdomain systemd[1]: Started nova_compute container.
Dec 06 09:58:19 np0005548788.localdomain sudo[280898]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Validating config file
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying service configuration files
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /etc/ceph
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Creating directory /etc/ceph
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/ceph
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Writing out command to execute
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: ++ cat /run_command
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: + CMD=nova-compute
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: + ARGS=
Dec 06 09:58:19 np0005548788.localdomain nova_compute[281005]: + sudo kolla_copy_cacerts
Dec 06 09:58:20 np0005548788.localdomain nova_compute[281005]: + [[ ! -n '' ]]
Dec 06 09:58:20 np0005548788.localdomain nova_compute[281005]: + . kolla_extend_start
Dec 06 09:58:20 np0005548788.localdomain nova_compute[281005]: + echo 'Running command: '\''nova-compute'\'''
Dec 06 09:58:20 np0005548788.localdomain nova_compute[281005]: Running command: 'nova-compute'
Dec 06 09:58:20 np0005548788.localdomain nova_compute[281005]: + umask 0022
Dec 06 09:58:20 np0005548788.localdomain nova_compute[281005]: + exec nova-compute
Dec 06 09:58:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63770 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=2601230013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F6FBF00000000001030307) 
Dec 06 09:58:20 np0005548788.localdomain sudo[281125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcnuoxlzgqzxeobiiynpjggvcpjdykzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015100.657119-3934-4686898494230/AnsiballZ_podman_container.py
Dec 06 09:58:20 np0005548788.localdomain sudo[281125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9540 DF PROTO=TCP SPT=52022 DPT=9102 SEQ=951275845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F6FDF00000000001030307) 
Dec 06 09:58:21 np0005548788.localdomain python3.9[281127]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 09:58:21 np0005548788.localdomain systemd[1]: Started libpod-conmon-05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157.scope.
Dec 06 09:58:21 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 09:58:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:21 np0005548788.localdomain podman[281152]: 2025-12-06 09:58:21.573079875 +0000 UTC m=+0.166326334 container init 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:58:21 np0005548788.localdomain podman[281152]: 2025-12-06 09:58:21.585568394 +0000 UTC m=+0.178814833 container start 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute_init, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:58:21 np0005548788.localdomain python3.9[281127]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Applying nova statedir ownership
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/20273498b7380904530133bcb3f720bd45f4f00b810dc4597d81d23acd8f9673
Dec 06 09:58:21 np0005548788.localdomain nova_compute_init[281170]: INFO:nova_statedir:Nova statedir ownership complete
Dec 06 09:58:21 np0005548788.localdomain systemd[1]: libpod-05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157.scope: Deactivated successfully.
Dec 06 09:58:21 np0005548788.localdomain podman[281171]: 2025-12-06 09:58:21.691947675 +0000 UTC m=+0.076694485 container died 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm)
Dec 06 09:58:21 np0005548788.localdomain podman[281182]: 2025-12-06 09:58:21.765063727 +0000 UTC m=+0.069396947 container cleanup 05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:58:21 np0005548788.localdomain systemd[1]: libpod-conmon-05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157.scope: Deactivated successfully.
Dec 06 09:58:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:21.816 281009 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:58:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:21.816 281009 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:58:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:21.817 281009 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:58:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:21.818 281009 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 06 09:58:21 np0005548788.localdomain sudo[281125]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:21.971 281009 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:21.996 281009 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:21.996 281009 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 06 09:58:22 np0005548788.localdomain sshd[262680]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.431 281009 INFO nova.virt.driver [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 06 09:58:22 np0005548788.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Dec 06 09:58:22 np0005548788.localdomain systemd[1]: session-59.scope: Consumed 1min 37.743s CPU time.
Dec 06 09:58:22 np0005548788.localdomain systemd-logind[765]: Session 59 logged out. Waiting for processes to exit.
Dec 06 09:58:22 np0005548788.localdomain systemd-logind[765]: Removed session 59.
Dec 06 09:58:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9de43e7b8a79fe03a5975173706e6157d0155e932b8542b83a4e016bd3ef92fe-merged.mount: Deactivated successfully.
Dec 06 09:58:22 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-05aa29a0cbbc88428becb9cc5883d1eda652edc23ddc2ec1735294af70ff0157-userdata-shm.mount: Deactivated successfully.
Dec 06 09:58:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63771 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=2601230013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F703F00000000001030307) 
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.555 281009 INFO nova.compute.provider_config [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.567 281009 DEBUG oslo_concurrency.lockutils [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.567 281009 DEBUG oslo_concurrency.lockutils [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.568 281009 DEBUG oslo_concurrency.lockutils [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.568 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.568 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.568 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.569 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.569 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.569 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.569 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.569 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.570 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.570 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.570 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.570 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.570 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.570 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.571 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.571 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.571 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.571 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.571 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.572 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] console_host                   = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.572 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.572 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.572 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.572 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.573 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.573 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.573 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.573 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.573 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.574 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.574 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.574 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.574 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.574 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.575 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.575 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.576 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.576 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.576 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] host                           = np0005548788.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.576 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.577 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.577 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.577 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.578 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.578 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.579 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.579 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.580 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.580 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.580 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.580 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.580 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.581 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.581 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.581 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.581 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.581 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.582 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.582 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.582 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.582 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.582 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.582 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.583 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.583 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.583 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.583 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.583 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.584 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.584 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.584 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.584 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.584 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.584 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.585 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.585 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.585 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.585 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.585 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.586 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.586 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.586 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.586 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.586 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.587 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.587 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.587 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.587 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.588 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.588 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.588 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.589 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.589 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.589 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.590 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.590 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.590 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.590 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.591 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.591 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.591 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.591 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.591 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.591 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.592 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.592 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.592 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.592 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.592 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.593 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.593 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.593 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.593 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.593 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.593 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.594 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.594 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.594 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.594 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.594 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.595 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.595 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.595 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.595 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.595 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.596 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.596 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.596 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.596 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.596 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.596 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.597 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.597 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.598 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.599 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.599 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.599 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.599 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.599 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.600 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.600 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.600 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.600 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.600 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.600 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.601 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.601 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.601 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.601 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.601 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.602 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.602 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.602 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.602 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.602 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.603 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.603 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.603 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.603 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.603 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.604 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.604 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.604 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.604 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.604 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.604 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.605 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.605 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.605 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.605 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.605 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.606 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.606 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.606 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.606 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.606 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.607 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.607 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.607 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.607 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.607 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.608 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.608 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.608 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.608 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.608 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.608 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.609 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.609 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.609 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.610 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.610 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.610 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.610 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.610 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.611 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.611 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.611 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.611 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.611 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.611 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.612 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.612 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.612 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.612 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.612 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.613 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.613 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.613 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.613 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.613 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.614 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.614 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.614 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.614 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.614 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.615 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.615 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.615 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.615 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.615 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.615 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.616 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.616 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.616 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.616 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.616 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.617 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.617 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.617 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.617 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.617 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.618 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.618 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.618 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.618 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.618 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.618 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.619 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.619 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.619 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.619 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.619 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.620 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.620 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.620 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.620 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.620 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.621 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.621 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.621 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.621 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.621 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.621 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.622 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.622 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.622 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.622 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.622 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.623 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.623 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.623 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.623 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.623 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.624 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.624 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.624 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.624 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.624 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.625 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.625 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.625 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.625 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.625 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.625 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.626 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.626 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.626 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.626 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.626 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.627 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.627 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.627 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.627 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.627 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.628 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.628 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.628 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.628 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.628 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.628 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.629 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.629 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.629 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.629 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.629 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.630 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.630 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.630 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.630 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.630 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.630 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.631 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.631 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.631 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.631 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.631 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.632 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.632 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.632 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.632 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.632 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.633 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.633 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.633 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.633 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.634 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.634 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.634 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.634 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.634 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.635 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.635 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.635 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.635 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.635 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.636 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.636 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.636 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.636 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.636 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.637 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.637 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.637 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.637 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.638 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.638 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.638 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.639 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.639 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.639 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.639 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.640 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.640 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.640 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.640 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.641 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.641 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.641 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.642 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.642 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.642 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.642 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.643 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.643 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.643 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.644 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.644 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.644 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.644 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.645 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.645 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.645 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.645 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.646 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.646 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.646 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.646 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.647 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.647 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.647 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.647 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.648 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.648 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.648 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.649 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.649 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.649 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.649 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.650 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.650 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.650 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.650 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.651 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.651 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.651 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.651 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.652 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.652 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.652 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.652 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.653 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.653 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.653 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.653 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.654 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.654 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.654 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.654 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.655 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.655 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.655 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.655 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.656 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.656 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.656 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.656 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.657 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.657 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.657 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.657 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.657 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.657 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.658 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.658 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.658 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.658 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.658 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.659 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.659 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.659 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.659 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.659 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.659 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.660 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.660 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.660 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.660 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.660 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.661 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.661 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.661 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.661 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.661 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.661 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.662 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.662 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.662 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.662 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.662 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.663 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.663 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.663 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.663 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.663 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.663 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.664 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.664 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.664 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.664 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.664 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.665 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.665 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.665 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.665 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.665 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.666 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.666 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.666 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.666 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.666 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.667 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.667 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.667 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.667 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.667 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.667 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.668 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.668 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.668 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.668 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.668 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.669 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.669 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.669 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.669 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.669 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.669 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.670 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.670 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.670 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.670 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.670 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.671 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.671 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.671 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.671 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.671 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.671 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.672 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.672 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.672 281009 WARNING oslo_config.cfg [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: and ``live_migration_inbound_addr`` respectively.
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: ).  Its value may be silently ignored in the future.
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.672 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.673 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.673 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.673 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.673 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.673 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.674 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.674 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.674 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.674 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.674 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.675 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.675 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.675 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.676 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.676 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.677 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.677 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.677 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rbd_secret_uuid        = 1939e851-b10c-5c3b-9bb7-8e7f380233e8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.677 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.677 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.677 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.678 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.678 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.678 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.678 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.678 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.679 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.679 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.679 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.679 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.679 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.680 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.680 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.680 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.680 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.680 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.681 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.681 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.681 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.681 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.681 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.682 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.682 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.682 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.682 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.682 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.682 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.683 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.683 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.683 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.683 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.683 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.684 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.684 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.684 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.684 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.684 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.684 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.685 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.685 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.685 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.685 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.685 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.686 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.686 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.686 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.686 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.686 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.687 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.687 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.687 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.687 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.687 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.687 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.688 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.688 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.688 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.688 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.688 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.689 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.689 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.689 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.689 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.689 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.690 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.690 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.690 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.690 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.691 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.691 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.691 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.691 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.691 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.691 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.692 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.692 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.692 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.692 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.692 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.693 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.693 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.693 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.693 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.693 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.694 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.694 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.694 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.694 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.694 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.694 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.695 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.695 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.695 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.695 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.695 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.696 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.696 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.696 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.696 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.696 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.697 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.697 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.697 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.697 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.697 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.698 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.698 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.698 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.698 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.698 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.699 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.699 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.699 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.699 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.699 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.700 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.700 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.700 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.700 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.701 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.702 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.702 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.703 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.703 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.703 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.703 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.703 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.704 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.704 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.704 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.704 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.704 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.705 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.705 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.705 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.705 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.705 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.706 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.706 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.706 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.706 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.706 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.707 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.707 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.707 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.707 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.707 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.708 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.708 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.708 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.708 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.708 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.708 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.709 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.709 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.709 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.709 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.710 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.710 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.710 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.710 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.710 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.711 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.711 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.711 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.711 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.711 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.711 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.712 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.712 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.712 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.712 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.712 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.713 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.713 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.713 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.713 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.714 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.714 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.714 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.714 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.714 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.715 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.715 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.715 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.715 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.715 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.715 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.716 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.716 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.716 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.716 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.716 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.717 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.717 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.717 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.717 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.717 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.717 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.718 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.718 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.718 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.718 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.718 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.719 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.719 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.719 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.719 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.719 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.720 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.720 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.720 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.720 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.721 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.721 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.721 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.721 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.721 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.722 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.722 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.722 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.722 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.722 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.723 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.723 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.723 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.723 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.724 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.724 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.724 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.724 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.725 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.725 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.725 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.725 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.726 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.726 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.726 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.726 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.726 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.727 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.727 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.727 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.727 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.728 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.728 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.728 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.728 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.728 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.729 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.729 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.729 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.729 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.729 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.730 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.730 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.730 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.730 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.730 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.731 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.731 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.731 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.731 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.731 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.732 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.732 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.732 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.732 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.732 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.732 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.733 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.733 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.733 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.733 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.733 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.734 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.734 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.734 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.734 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.734 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.735 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.735 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.735 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.735 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.735 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.736 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.736 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.736 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.736 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.736 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.736 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.737 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.737 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.737 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.737 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.737 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.738 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.738 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.738 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.738 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.738 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.738 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.739 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.739 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.739 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.739 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.739 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.740 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.740 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.740 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.740 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.740 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.741 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.741 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.741 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.741 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.741 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.742 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.742 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.742 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.742 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.743 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.743 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.743 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.743 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.743 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.743 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.744 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.744 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.744 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.744 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.745 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.745 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.745 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.745 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.745 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.746 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.746 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.746 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.746 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.746 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.747 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.747 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.747 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.747 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.747 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.747 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.748 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.748 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.748 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.748 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.748 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.749 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.749 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.749 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.749 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.749 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.749 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.750 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.750 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.750 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.750 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.750 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.751 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.751 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.751 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.751 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.751 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.752 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.752 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.752 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.752 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.752 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.752 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.753 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.753 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.753 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.753 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.753 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.754 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.754 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.754 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.754 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.754 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.754 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.755 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.755 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.755 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.755 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.755 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.756 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.756 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.756 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.756 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.756 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.756 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.757 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.757 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.757 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.757 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.757 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.758 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.758 281009 DEBUG oslo_service.service [None req-e25bf31c-92cf-4d83-8331-018210357a5f - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.759 281009 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.777 281009 INFO nova.virt.node [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Determined node identity 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from /var/lib/nova/compute_id
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.778 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.778 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.779 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.779 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.792 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f551eb9b3d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.795 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f551eb9b3d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.795 281009 INFO nova.virt.libvirt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Connection event '1' reason 'None'
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.799 281009 INFO nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Libvirt host capabilities <capabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <host>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <uuid>74aa0f2e-bd78-406d-a4f0-2263c03ef4c3</uuid>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <arch>x86_64</arch>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model>EPYC-Rome-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <vendor>AMD</vendor>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <microcode version='16777317'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <signature family='23' model='49' stepping='0'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='x2apic'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='tsc-deadline'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='osxsave'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='hypervisor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='tsc_adjust'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='spec-ctrl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='stibp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='arch-capabilities'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='cmp_legacy'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='topoext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='virt-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='lbrv'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='tsc-scale'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='vmcb-clean'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='pause-filter'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='pfthreshold'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='svme-addr-chk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='rdctl-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='skip-l1dfl-vmentry'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='mds-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature name='pschange-mc-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <pages unit='KiB' size='4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <pages unit='KiB' size='2048'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <pages unit='KiB' size='1048576'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <power_management>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <suspend_mem/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <suspend_disk/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <suspend_hybrid/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </power_management>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <iommu support='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <migration_features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <live/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <uri_transports>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <uri_transport>tcp</uri_transport>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <uri_transport>rdma</uri_transport>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </uri_transports>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </migration_features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <topology>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <cells num='1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <cell id='0'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:           <memory unit='KiB'>16116604</memory>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:           <pages unit='KiB' size='4'>4029151</pages>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:           <pages unit='KiB' size='2048'>0</pages>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:           <distances>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <sibling id='0' value='10'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:           </distances>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:           <cpus num='8'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:           </cpus>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         </cell>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </cells>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </topology>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <cache>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </cache>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <secmodel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model>selinux</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <doi>0</doi>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </secmodel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <secmodel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model>dac</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <doi>0</doi>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </secmodel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </host>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <guest>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <os_type>hvm</os_type>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <arch name='i686'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <wordsize>32</wordsize>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <domain type='qemu'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <domain type='kvm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </arch>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <pae/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <nonpae/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <acpi default='on' toggle='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <apic default='on' toggle='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <cpuselection/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <deviceboot/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <externalSnapshot/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </guest>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <guest>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <os_type>hvm</os_type>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <arch name='x86_64'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <wordsize>64</wordsize>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <domain type='qemu'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <domain type='kvm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </arch>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <acpi default='on' toggle='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <apic default='on' toggle='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <cpuselection/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <deviceboot/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <externalSnapshot/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </guest>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: </capabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.805 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.810 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: <domainCapabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <domain>kvm</domain>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <arch>i686</arch>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <vcpu max='240'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <iothreads supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <os supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <enum name='firmware'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <loader supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>rom</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pflash</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='readonly'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>yes</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>no</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='secure'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>no</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </loader>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </os>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>on</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>off</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='maximum' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='maximumMigratable'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>on</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>off</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='host-model' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <vendor>AMD</vendor>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='x2apic'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='stibp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='succor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='lbrv'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='custom' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Dhyana-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Genoa'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='auto-ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='auto-ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-128'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-256'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-512'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='KnightsMill'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512er'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512pf'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='KnightsMill-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512er'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512pf'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tbm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tbm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SierraForest'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cmpccxadd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SierraForest-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cmpccxadd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='athlon'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='athlon-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='core2duo'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='core2duo-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='coreduo'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='coreduo-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='n270'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='n270-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='phenom'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='phenom-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <memoryBacking supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <enum name='sourceType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>file</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>anonymous</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>memfd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </memoryBacking>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <devices>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <disk supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='diskDevice'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>disk</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>cdrom</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>floppy</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>lun</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='bus'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>ide</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>fdc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>scsi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>sata</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-non-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <graphics supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vnc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>egl-headless</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dbus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </graphics>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <video supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='modelType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vga</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>cirrus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>none</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>bochs</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>ramfb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </video>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <hostdev supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='mode'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>subsystem</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='startupPolicy'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>default</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>mandatory</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>requisite</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>optional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='subsysType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pci</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>scsi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='capsType'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='pciBackend'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </hostdev>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <rng supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-non-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>random</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>egd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>builtin</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </rng>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <filesystem supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='driverType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>path</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>handle</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtiofs</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </filesystem>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <tpm supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tpm-tis</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tpm-crb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>emulator</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>external</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendVersion'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>2.0</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </tpm>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <redirdev supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='bus'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </redirdev>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <channel supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pty</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>unix</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </channel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <crypto supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>qemu</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>builtin</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </crypto>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <interface supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>default</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>passt</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </interface>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <panic supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>isa</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>hyperv</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </panic>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <console supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>null</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pty</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dev</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>file</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pipe</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>stdio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>udp</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tcp</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>unix</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>qemu-vdagent</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dbus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </console>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </devices>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <gic supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <vmcoreinfo supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <genid supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <backingStoreInput supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <backup supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <async-teardown supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <ps2 supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <sev supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <sgx supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <hyperv supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='features'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>relaxed</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vapic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>spinlocks</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vpindex</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>runtime</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>synic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>stimer</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>reset</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vendor_id</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>frequencies</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>reenlightenment</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tlbflush</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>ipi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>avic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>emsr_bitmap</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>xmm_input</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <defaults>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <spinlocks>4095</spinlocks>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <stimer_direct>on</stimer_direct>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </defaults>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </hyperv>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <launchSecurity supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='sectype'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tdx</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </launchSecurity>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: </domainCapabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.819 281009 DEBUG nova.virt.libvirt.volume.mount [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.820 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: <domainCapabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <domain>kvm</domain>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <arch>i686</arch>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <vcpu max='1024'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <iothreads supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <os supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <enum name='firmware'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <loader supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>rom</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pflash</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='readonly'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>yes</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>no</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='secure'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>no</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </loader>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </os>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>on</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>off</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='maximum' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='maximumMigratable'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>on</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>off</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='host-model' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <vendor>AMD</vendor>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='x2apic'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='stibp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='succor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='lbrv'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='custom' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Dhyana-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Genoa'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='auto-ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='auto-ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-128'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-256'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-512'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='KnightsMill'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512er'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512pf'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='KnightsMill-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512er'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512pf'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tbm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tbm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SierraForest'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cmpccxadd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SierraForest-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cmpccxadd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='athlon'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='athlon-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='core2duo'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='core2duo-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='coreduo'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='coreduo-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='n270'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='n270-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='phenom'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='phenom-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <memoryBacking supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <enum name='sourceType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>file</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>anonymous</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>memfd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </memoryBacking>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <devices>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <disk supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='diskDevice'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>disk</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>cdrom</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>floppy</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>lun</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='bus'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>fdc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>scsi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>sata</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-non-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <graphics supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vnc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>egl-headless</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dbus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </graphics>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <video supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='modelType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vga</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>cirrus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>none</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>bochs</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>ramfb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </video>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <hostdev supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='mode'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>subsystem</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='startupPolicy'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>default</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>mandatory</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>requisite</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>optional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='subsysType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pci</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>scsi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='capsType'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='pciBackend'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </hostdev>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <rng supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-non-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>random</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>egd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>builtin</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </rng>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <filesystem supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='driverType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>path</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>handle</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtiofs</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </filesystem>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <tpm supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tpm-tis</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tpm-crb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>emulator</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>external</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendVersion'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>2.0</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </tpm>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <redirdev supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='bus'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </redirdev>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <channel supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pty</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>unix</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </channel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <crypto supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>qemu</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>builtin</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </crypto>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <interface supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>default</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>passt</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </interface>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <panic supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>isa</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>hyperv</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </panic>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <console supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>null</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pty</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dev</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>file</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pipe</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>stdio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>udp</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tcp</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>unix</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>qemu-vdagent</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dbus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </console>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </devices>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <gic supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <vmcoreinfo supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <genid supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <backingStoreInput supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <backup supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <async-teardown supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <ps2 supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <sev supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <sgx supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <hyperv supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='features'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>relaxed</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vapic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>spinlocks</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vpindex</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>runtime</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>synic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>stimer</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>reset</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vendor_id</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>frequencies</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>reenlightenment</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tlbflush</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>ipi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>avic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>emsr_bitmap</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>xmm_input</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <defaults>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <spinlocks>4095</spinlocks>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <stimer_direct>on</stimer_direct>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </defaults>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </hyperv>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <launchSecurity supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='sectype'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tdx</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </launchSecurity>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: </domainCapabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.847 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.852 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: <domainCapabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <domain>kvm</domain>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <arch>x86_64</arch>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <vcpu max='240'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <iothreads supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <os supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <enum name='firmware'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <loader supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>rom</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pflash</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='readonly'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>yes</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>no</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='secure'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>no</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </loader>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </os>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>on</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>off</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='maximum' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='maximumMigratable'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>on</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>off</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='host-model' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <vendor>AMD</vendor>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='x2apic'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='stibp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='succor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='lbrv'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='custom' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Dhyana-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Genoa'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='auto-ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='auto-ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-128'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-256'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-512'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='KnightsMill'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512er'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512pf'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='KnightsMill-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512er'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512pf'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tbm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tbm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SierraForest'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cmpccxadd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SierraForest-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cmpccxadd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='athlon'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='athlon-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='core2duo'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='core2duo-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='coreduo'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='coreduo-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='n270'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='n270-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='phenom'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='phenom-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <memoryBacking supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <enum name='sourceType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>file</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>anonymous</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>memfd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </memoryBacking>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <devices>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <disk supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='diskDevice'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>disk</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>cdrom</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>floppy</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>lun</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='bus'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>ide</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>fdc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>scsi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>sata</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-non-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <graphics supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vnc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>egl-headless</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dbus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </graphics>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <video supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='modelType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vga</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>cirrus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>none</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>bochs</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>ramfb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </video>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <hostdev supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='mode'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>subsystem</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='startupPolicy'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>default</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>mandatory</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>requisite</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>optional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='subsysType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pci</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>scsi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='capsType'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='pciBackend'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </hostdev>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <rng supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtio-non-transitional</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>random</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>egd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>builtin</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </rng>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <filesystem supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='driverType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>path</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>handle</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>virtiofs</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </filesystem>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <tpm supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tpm-tis</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tpm-crb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>emulator</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>external</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendVersion'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>2.0</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </tpm>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <redirdev supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='bus'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </redirdev>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <channel supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pty</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>unix</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </channel>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <crypto supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>qemu</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>builtin</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </crypto>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <interface supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='backendType'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>default</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>passt</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </interface>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <panic supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>isa</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>hyperv</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </panic>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <console supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>null</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vc</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pty</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dev</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>file</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pipe</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>stdio</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>udp</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tcp</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>unix</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>qemu-vdagent</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>dbus</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </console>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </devices>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <gic supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <vmcoreinfo supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <genid supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <backingStoreInput supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <backup supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <async-teardown supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <ps2 supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <sev supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <sgx supported='no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <hyperv supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='features'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>relaxed</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vapic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>spinlocks</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vpindex</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>runtime</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>synic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>stimer</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>reset</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>vendor_id</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>frequencies</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>reenlightenment</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tlbflush</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>ipi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>avic</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>emsr_bitmap</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>xmm_input</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <defaults>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <spinlocks>4095</spinlocks>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <stimer_direct>on</stimer_direct>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </defaults>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </hyperv>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <launchSecurity supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='sectype'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>tdx</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </launchSecurity>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </features>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: </domainCapabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.904 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]: <domainCapabilities>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <domain>kvm</domain>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <arch>x86_64</arch>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <vcpu max='1024'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <iothreads supported='yes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <os supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <enum name='firmware'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>efi</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <loader supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>rom</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>pflash</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='readonly'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>yes</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>no</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='secure'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>yes</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>no</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </loader>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   </os>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:   <cpu>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>on</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>off</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='maximum' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <enum name='maximumMigratable'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>on</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <value>off</value>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='host-model' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <vendor>AMD</vendor>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='x2apic'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='stibp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='succor'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='lbrv'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:     <mode name='custom' supported='yes'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Broadwell-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Cooperlake-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Denverton-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Dhyana-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Genoa'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='auto-ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='auto-ibrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amd-psfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='stibp-always-on'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='EPYC-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-128'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-256'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx10-512'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='prefetchiti'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Haswell-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='IvyBridge-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='KnightsMill'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512er'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512pf'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='KnightsMill-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512er'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512pf'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G5'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tbm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fma4'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tbm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xop'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='amx-tile'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-bf16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-fp16'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bitalg'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrc'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fzrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='la57'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='taa-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xfd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SierraForest'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cmpccxadd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='SierraForest-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ifma'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='cmpccxadd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fbsdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='fsrs'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='ibrs-all'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='mcdt-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pbrsb-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='psdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='serialize'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vaes'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='hle'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:22 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='rtm'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512bw'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512cd'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512dq'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512f'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='avx512vl'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='invpcid'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='pcid'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='pku'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v1'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='mpx'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v2'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v3'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='core-capability'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='split-lock-detect'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='Snowridge-v4'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='cldemote'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='erms'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='gfni'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdir64b'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='movdiri'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='xsaves'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='athlon'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='athlon-v1'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='core2duo'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='core2duo-v1'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='coreduo'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='coreduo-v1'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='n270'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='n270-v1'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='ss'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='phenom'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <blockers model='phenom-v1'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnow'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <feature name='3dnowext'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </blockers>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </mode>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:   </cpu>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:   <memoryBacking supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <enum name='sourceType'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <value>file</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <value>anonymous</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <value>memfd</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:   </memoryBacking>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:   <devices>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <disk supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='diskDevice'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>disk</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>cdrom</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>floppy</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>lun</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='bus'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>fdc</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>scsi</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>sata</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtio-transitional</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtio-non-transitional</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <graphics supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>vnc</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>egl-headless</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>dbus</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </graphics>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <video supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='modelType'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>vga</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>cirrus</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>none</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>bochs</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>ramfb</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </video>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <hostdev supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='mode'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>subsystem</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='startupPolicy'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>default</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>mandatory</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>requisite</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>optional</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='subsysType'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>pci</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>scsi</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='capsType'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='pciBackend'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </hostdev>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <rng supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtio</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtio-transitional</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtio-non-transitional</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>random</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>egd</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>builtin</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </rng>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <filesystem supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='driverType'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>path</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>handle</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>virtiofs</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </filesystem>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <tpm supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>tpm-tis</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>tpm-crb</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>emulator</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>external</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='backendVersion'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>2.0</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </tpm>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <redirdev supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='bus'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>usb</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </redirdev>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <channel supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>pty</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>unix</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </channel>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <crypto supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='model'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>qemu</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='backendModel'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>builtin</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </crypto>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <interface supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='backendType'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>default</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>passt</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </interface>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <panic supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='model'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>isa</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>hyperv</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </panic>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <console supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='type'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>null</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>vc</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>pty</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>dev</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>file</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>pipe</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>stdio</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>udp</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>tcp</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>unix</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>qemu-vdagent</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>dbus</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </console>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:   </devices>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:   <features>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <gic supported='no'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <vmcoreinfo supported='yes'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <genid supported='yes'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <backingStoreInput supported='yes'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <backup supported='yes'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <async-teardown supported='yes'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <ps2 supported='yes'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <sev supported='no'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <sgx supported='no'/>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <hyperv supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='features'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>relaxed</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>vapic</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>spinlocks</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>vpindex</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>runtime</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>synic</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>stimer</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>reset</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>vendor_id</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>frequencies</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>reenlightenment</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>tlbflush</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>ipi</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>avic</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>emsr_bitmap</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>xmm_input</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <defaults>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <spinlocks>4095</spinlocks>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <stimer_direct>on</stimer_direct>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </defaults>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </hyperv>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     <launchSecurity supported='yes'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       <enum name='sectype'>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:         <value>tdx</value>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:       </enum>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:     </launchSecurity>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:   </features>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: </domainCapabilities>
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.953 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.954 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.954 281009 INFO nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Secure Boot support detected
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.957 281009 INFO nova.virt.libvirt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:22.969 281009 DEBUG nova.virt.libvirt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.033 281009 INFO nova.virt.node [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Determined node identity 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from /var/lib/nova/compute_id
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.050 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Verified node 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 matches my host np0005548788.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.071 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.176 281009 DEBUG oslo_concurrency.lockutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.177 281009 DEBUG oslo_concurrency.lockutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.177 281009 DEBUG oslo_concurrency.lockutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.178 281009 DEBUG nova.compute.resource_tracker [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.178 281009 DEBUG oslo_concurrency.processutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48191 DF PROTO=TCP SPT=49086 DPT=9102 SEQ=3466498631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F707F00000000001030307) 
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.643 281009 DEBUG oslo_concurrency.processutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.875 281009 WARNING nova.virt.libvirt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.877 281009 DEBUG nova.compute.resource_tracker [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12900MB free_disk=41.83721923828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.877 281009 DEBUG oslo_concurrency.lockutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:23.878 281009 DEBUG oslo_concurrency.lockutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.088 281009 DEBUG nova.compute.resource_tracker [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.088 281009 DEBUG nova.compute.resource_tracker [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.110 281009 DEBUG nova.scheduler.client.report [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.173 281009 DEBUG nova.scheduler.client.report [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.173 281009 DEBUG nova.compute.provider_tree [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.196 281009 DEBUG nova.scheduler.client.report [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.225 281009 DEBUG nova.scheduler.client.report [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE4A,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.243 281009 DEBUG oslo_concurrency.processutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.704 281009 DEBUG oslo_concurrency.processutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.710 281009 DEBUG nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.711 281009 INFO nova.virt.libvirt.host [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] kernel doesn't support AMD SEV
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.712 281009 DEBUG nova.compute.provider_tree [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.713 281009 DEBUG nova.virt.libvirt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.746 281009 DEBUG nova.scheduler.client.report [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.788 281009 DEBUG nova.compute.resource_tracker [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.788 281009 DEBUG oslo_concurrency.lockutils [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.788 281009 DEBUG nova.service [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.830 281009 DEBUG nova.service [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 06 09:58:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:58:24.831 281009 DEBUG nova.servicegroup.drivers.db [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] DB_Driver: join new ServiceGroup member np0005548788.localdomain to the compute group, service = <Service: host=np0005548788.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 06 09:58:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:58:25 np0005548788.localdomain podman[281295]: 2025-12-06 09:58:25.267899209 +0000 UTC m=+0.091316362 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:58:25 np0005548788.localdomain podman[281295]: 2025-12-06 09:58:25.303769278 +0000 UTC m=+0.127186441 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 09:58:25 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:58:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63772 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=2601230013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F713B00000000001030307) 
Dec 06 09:58:30 np0005548788.localdomain sshd[281313]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:58:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:58:33 np0005548788.localdomain podman[281315]: 2025-12-06 09:58:33.297527526 +0000 UTC m=+0.125344844 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:58:33 np0005548788.localdomain podman[281315]: 2025-12-06 09:58:33.340129416 +0000 UTC m=+0.167946684 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:58:33 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:58:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63773 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=2601230013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F733F00000000001030307) 
Dec 06 09:58:35 np0005548788.localdomain sshd[281313]: Connection closed by 101.47.142.76 port 36262 [preauth]
Dec 06 09:58:37 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:58:37.916 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:58:37 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:58:37.918 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 09:58:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:58:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:58:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:58:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:58:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:58:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:58:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:58:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:58:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:58:39 np0005548788.localdomain systemd[1]: tmp-crun.PBhDhe.mount: Deactivated successfully.
Dec 06 09:58:39 np0005548788.localdomain podman[281340]: 2025-12-06 09:58:39.266259187 +0000 UTC m=+0.091949801 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:58:39 np0005548788.localdomain podman[281340]: 2025-12-06 09:58:39.283666111 +0000 UTC m=+0.109356745 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 09:58:39 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:58:41 np0005548788.localdomain sudo[281360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:58:41 np0005548788.localdomain sudo[281360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:58:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:58:41 np0005548788.localdomain sudo[281360]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:41 np0005548788.localdomain sudo[281384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:58:41 np0005548788.localdomain sudo[281384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:58:41 np0005548788.localdomain podman[281378]: 2025-12-06 09:58:41.489321897 +0000 UTC m=+0.092154658 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:58:41 np0005548788.localdomain podman[281378]: 2025-12-06 09:58:41.525008361 +0000 UTC m=+0.127841172 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:58:41 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:58:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:58:41.921 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:58:42 np0005548788.localdomain sudo[281384]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:43 np0005548788.localdomain sudo[281449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:58:43 np0005548788.localdomain sudo[281449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:58:43 np0005548788.localdomain sudo[281449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:58:43 np0005548788.localdomain podman[281467]: 2025-12-06 09:58:43.704978194 +0000 UTC m=+0.093189319 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 09:58:43 np0005548788.localdomain podman[281467]: 2025-12-06 09:58:43.72373189 +0000 UTC m=+0.111942995 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Dec 06 09:58:43 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:58:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:58:47 np0005548788.localdomain podman[281487]: 2025-12-06 09:58:47.273067593 +0000 UTC m=+0.094184262 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:58:47 np0005548788.localdomain podman[281487]: 2025-12-06 09:58:47.283528949 +0000 UTC m=+0.104645588 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:58:47 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:58:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:58:47.421 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:58:47.423 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:58:47.423 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12161 DF PROTO=TCP SPT=45998 DPT=9102 SEQ=2653988779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F76D250000000001030307) 
Dec 06 09:58:49 np0005548788.localdomain podman[240078]: time="2025-12-06T09:58:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:58:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:58:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:58:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:58:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16745 "" "Go-http-client/1.1"
Dec 06 09:58:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:58:50 np0005548788.localdomain podman[281506]: 2025-12-06 09:58:50.255863208 +0000 UTC m=+0.081693411 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:58:50 np0005548788.localdomain podman[281506]: 2025-12-06 09:58:50.268588556 +0000 UTC m=+0.094418819 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:58:50 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:58:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12162 DF PROTO=TCP SPT=45998 DPT=9102 SEQ=2653988779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F771300000000001030307) 
Dec 06 09:58:51 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63774 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=2601230013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F773F00000000001030307) 
Dec 06 09:58:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12163 DF PROTO=TCP SPT=45998 DPT=9102 SEQ=2653988779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F779300000000001030307) 
Dec 06 09:58:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9541 DF PROTO=TCP SPT=52022 DPT=9102 SEQ=951275845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F77BF00000000001030307) 
Dec 06 09:58:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:58:56 np0005548788.localdomain podman[281529]: 2025-12-06 09:58:56.253180281 +0000 UTC m=+0.079893226 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 09:58:56 np0005548788.localdomain podman[281529]: 2025-12-06 09:58:56.284335153 +0000 UTC m=+0.111048078 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:58:56 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:58:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12164 DF PROTO=TCP SPT=45998 DPT=9102 SEQ=2653988779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F788F00000000001030307) 
Dec 06 09:59:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:59:04 np0005548788.localdomain podman[281547]: 2025-12-06 09:59:04.255217817 +0000 UTC m=+0.080146373 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:59:04 np0005548788.localdomain podman[281547]: 2025-12-06 09:59:04.333063037 +0000 UTC m=+0.157991643 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 06 09:59:04 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:59:05 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12165 DF PROTO=TCP SPT=45998 DPT=9102 SEQ=2653988779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F7A9F00000000001030307) 
Dec 06 09:59:05 np0005548788.localdomain sshd[281571]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:59:05 np0005548788.localdomain sshd[281571]: Received disconnect from 148.227.3.232 port 39818:11: Bye Bye [preauth]
Dec 06 09:59:05 np0005548788.localdomain sshd[281571]: Disconnected from authenticating user root 148.227.3.232 port 39818 [preauth]
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 09:59:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:59:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:59:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:59:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:59:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:59:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:59:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:59:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:59:10 np0005548788.localdomain systemd[1]: tmp-crun.LF9wh1.mount: Deactivated successfully.
Dec 06 09:59:10 np0005548788.localdomain podman[281573]: 2025-12-06 09:59:10.266082943 +0000 UTC m=+0.092606202 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 06 09:59:10 np0005548788.localdomain podman[281573]: 2025-12-06 09:59:10.30375928 +0000 UTC m=+0.130282549 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 09:59:10 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:59:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:59:12 np0005548788.localdomain podman[281592]: 2025-12-06 09:59:12.261559038 +0000 UTC m=+0.085209932 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:59:12 np0005548788.localdomain podman[281592]: 2025-12-06 09:59:12.30069128 +0000 UTC m=+0.124342124 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:59:12 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:59:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:13.834 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:13.851 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:59:14 np0005548788.localdomain podman[281615]: 2025-12-06 09:59:14.262862674 +0000 UTC m=+0.087275745 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Dec 06 09:59:14 np0005548788.localdomain podman[281615]: 2025-12-06 09:59:14.305538706 +0000 UTC m=+0.129951737 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Dec 06 09:59:14 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:59:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:59:18 np0005548788.localdomain podman[281636]: 2025-12-06 09:59:18.268164961 +0000 UTC m=+0.094637616 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:59:18 np0005548788.localdomain podman[281636]: 2025-12-06 09:59:18.280371701 +0000 UTC m=+0.106844396 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:59:18 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:59:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7976 DF PROTO=TCP SPT=35292 DPT=9102 SEQ=3466537804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F7E2540000000001030307) 
Dec 06 09:59:19 np0005548788.localdomain podman[240078]: time="2025-12-06T09:59:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:59:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:59:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:59:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:59:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16744 "" "Go-http-client/1.1"
Dec 06 09:59:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7977 DF PROTO=TCP SPT=35292 DPT=9102 SEQ=3466537804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F7E6700000000001030307) 
Dec 06 09:59:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:59:21 np0005548788.localdomain podman[281655]: 2025-12-06 09:59:21.258010076 +0000 UTC m=+0.083875369 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:59:21 np0005548788.localdomain podman[281655]: 2025-12-06 09:59:21.296702794 +0000 UTC m=+0.122568057 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:59:21 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:59:21 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12166 DF PROTO=TCP SPT=45998 DPT=9102 SEQ=2653988779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F7E9F10000000001030307) 
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.008 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.008 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.008 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.024 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.025 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.025 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.026 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.026 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.027 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.027 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.027 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.028 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.051 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.051 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.051 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.052 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.052 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.513 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:59:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7978 DF PROTO=TCP SPT=35292 DPT=9102 SEQ=3466537804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F7EE700000000001030307) 
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.737 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.739 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12888MB free_disk=41.837059020996094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.739 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.740 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.823 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.823 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:59:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:22.855 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:59:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:23.278 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:59:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:23.284 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:59:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:23.303 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:59:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:23.307 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:59:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 09:59:23.308 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63775 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=2601230013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F7F1F00000000001030307) 
Dec 06 09:59:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7979 DF PROTO=TCP SPT=35292 DPT=9102 SEQ=3466537804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F7FE300000000001030307) 
Dec 06 09:59:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:59:27 np0005548788.localdomain podman[281722]: 2025-12-06 09:59:27.255601558 +0000 UTC m=+0.080667039 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Dec 06 09:59:27 np0005548788.localdomain podman[281722]: 2025-12-06 09:59:27.260240973 +0000 UTC m=+0.085306494 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:59:27 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 09:59:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7980 DF PROTO=TCP SPT=35292 DPT=9102 SEQ=3466537804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F81DF00000000001030307) 
Dec 06 09:59:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 09:59:34 np0005548788.localdomain podman[281740]: 2025-12-06 09:59:34.916845817 +0000 UTC m=+0.175343885 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 09:59:34 np0005548788.localdomain podman[281740]: 2025-12-06 09:59:34.986247264 +0000 UTC m=+0.244745332 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:59:35 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 09:59:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:59:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:59:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:59:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:59:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:59:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   09:59:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:59:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 09:59:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 09:59:41 np0005548788.localdomain podman[281765]: 2025-12-06 09:59:41.267319685 +0000 UTC m=+0.092628882 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:59:41 np0005548788.localdomain podman[281765]: 2025-12-06 09:59:41.302929716 +0000 UTC m=+0.128238943 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:59:41 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 09:59:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 09:59:43 np0005548788.localdomain systemd[1]: tmp-crun.8PrTEV.mount: Deactivated successfully.
Dec 06 09:59:43 np0005548788.localdomain podman[281785]: 2025-12-06 09:59:43.255102689 +0000 UTC m=+0.084479108 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:59:43 np0005548788.localdomain podman[281785]: 2025-12-06 09:59:43.267634791 +0000 UTC m=+0.097011190 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:59:43 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 09:59:43 np0005548788.localdomain sudo[281807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:59:43 np0005548788.localdomain sudo[281807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:43 np0005548788.localdomain sudo[281807]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:43 np0005548788.localdomain sudo[281825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:59:43 np0005548788.localdomain sudo[281825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:44 np0005548788.localdomain sudo[281825]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:44 np0005548788.localdomain sudo[281875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:59:44 np0005548788.localdomain sudo[281875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 09:59:44 np0005548788.localdomain sudo[281875]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:44 np0005548788.localdomain sudo[281894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 09:59:44 np0005548788.localdomain sudo[281894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:44 np0005548788.localdomain podman[281893]: 2025-12-06 09:59:44.745317711 +0000 UTC m=+0.093871451 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=edpm)
Dec 06 09:59:44 np0005548788.localdomain podman[281893]: 2025-12-06 09:59:44.761530667 +0000 UTC m=+0.110084437 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git)
Dec 06 09:59:44 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 09:59:45 np0005548788.localdomain sudo[281894]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:45 np0005548788.localdomain sshd[281949]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:59:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:59:47.422 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:59:47.423 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 09:59:47.423 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:48 np0005548788.localdomain sudo[281951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:59:48 np0005548788.localdomain sudo[281951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:48 np0005548788.localdomain sudo[281951]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 09:59:49 np0005548788.localdomain podman[281969]: 2025-12-06 09:59:49.278057764 +0000 UTC m=+0.095281845 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 09:59:49 np0005548788.localdomain podman[281969]: 2025-12-06 09:59:49.319718785 +0000 UTC m=+0.136942836 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd)
Dec 06 09:59:49 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 09:59:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2196 DF PROTO=TCP SPT=55384 DPT=9102 SEQ=1021444787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F857840000000001030307) 
Dec 06 09:59:49 np0005548788.localdomain podman[240078]: time="2025-12-06T09:59:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:59:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:59:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 09:59:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:09:59:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16756 "" "Go-http-client/1.1"
Dec 06 09:59:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2197 DF PROTO=TCP SPT=55384 DPT=9102 SEQ=1021444787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F85B700000000001030307) 
Dec 06 09:59:51 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7981 DF PROTO=TCP SPT=35292 DPT=9102 SEQ=3466537804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F85DF00000000001030307) 
Dec 06 09:59:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 09:59:52 np0005548788.localdomain podman[281988]: 2025-12-06 09:59:52.257776005 +0000 UTC m=+0.083395765 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:59:52 np0005548788.localdomain podman[281988]: 2025-12-06 09:59:52.294707508 +0000 UTC m=+0.120327238 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:59:52 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 09:59:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2198 DF PROTO=TCP SPT=55384 DPT=9102 SEQ=1021444787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F863700000000001030307) 
Dec 06 09:59:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12167 DF PROTO=TCP SPT=45998 DPT=9102 SEQ=2653988779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F867F10000000001030307) 
Dec 06 09:59:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2199 DF PROTO=TCP SPT=55384 DPT=9102 SEQ=1021444787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F873300000000001030307) 
Dec 06 09:59:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 09:59:58 np0005548788.localdomain podman[282011]: 2025-12-06 09:59:58.258764943 +0000 UTC m=+0.081665371 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 09:59:58 np0005548788.localdomain podman[282011]: 2025-12-06 09:59:58.293679882 +0000 UTC m=+0.116580260 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:59:58 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:00:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2200 DF PROTO=TCP SPT=55384 DPT=9102 SEQ=1021444787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F893F00000000001030307) 
Dec 06 10:00:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:00:05 np0005548788.localdomain podman[282028]: 2025-12-06 10:00:05.271064548 +0000 UTC m=+0.095876828 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:00:05 np0005548788.localdomain podman[282028]: 2025-12-06 10:00:05.319728619 +0000 UTC m=+0.144540849 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:00:05 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:00:07 np0005548788.localdomain sshd[282053]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:00:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:00:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:00:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:00:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:00:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:00:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:00:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:00:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 5418 writes, 23K keys, 5418 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5418 writes, 822 syncs, 6.59 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 116 writes, 350 keys, 116 commit groups, 1.0 writes per commit group, ingest: 0.49 MB, 0.00 MB/s
                                                          Interval WAL: 116 writes, 49 syncs, 2.37 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:00:12 np0005548788.localdomain sshd[282053]: Connection closed by 45.78.194.186 port 56440 [preauth]
Dec 06 10:00:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:00:12 np0005548788.localdomain podman[282055]: 2025-12-06 10:00:12.194771626 +0000 UTC m=+0.087902605 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:00:12 np0005548788.localdomain podman[282055]: 2025-12-06 10:00:12.208783445 +0000 UTC m=+0.101914444 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm)
Dec 06 10:00:12 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:00:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:00:13 np0005548788.localdomain podman[282076]: 2025-12-06 10:00:13.70503256 +0000 UTC m=+0.094202108 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:00:13 np0005548788.localdomain podman[282076]: 2025-12-06 10:00:13.74161644 +0000 UTC m=+0.130786028 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:00:13 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:00:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.2 total, 600.0 interval
                                                          Cumulative writes: 5344 writes, 23K keys, 5344 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5344 writes, 666 syncs, 8.02 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4 writes, 18 keys, 4 commit groups, 1.0 writes per commit group, ingest: 0.01 MB, 0.00 MB/s
                                                          Interval WAL: 4 writes, 2 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:00:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:00:15 np0005548788.localdomain podman[282097]: 2025-12-06 10:00:15.270426163 +0000 UTC m=+0.096289701 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=)
Dec 06 10:00:15 np0005548788.localdomain podman[282097]: 2025-12-06 10:00:15.285326549 +0000 UTC m=+0.111190087 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal)
Dec 06 10:00:15 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:00:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20301 DF PROTO=TCP SPT=60210 DPT=9102 SEQ=272254140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F8CCB50000000001030307) 
Dec 06 10:00:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:00:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:00:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:00:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 10:00:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:00:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16756 "" "Go-http-client/1.1"
Dec 06 10:00:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:00:20 np0005548788.localdomain podman[282117]: 2025-12-06 10:00:20.267136419 +0000 UTC m=+0.087017657 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:00:20 np0005548788.localdomain podman[282117]: 2025-12-06 10:00:20.30371899 +0000 UTC m=+0.123600198 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:00:20 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:00:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20302 DF PROTO=TCP SPT=60210 DPT=9102 SEQ=272254140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F8D0B00000000001030307) 
Dec 06 10:00:21 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2201 DF PROTO=TCP SPT=55384 DPT=9102 SEQ=1021444787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F8D3F10000000001030307) 
Dec 06 10:00:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20303 DF PROTO=TCP SPT=60210 DPT=9102 SEQ=272254140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F8D8B00000000001030307) 
Dec 06 10:00:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:00:23 np0005548788.localdomain podman[282136]: 2025-12-06 10:00:23.263050734 +0000 UTC m=+0.088120421 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:00:23 np0005548788.localdomain podman[282136]: 2025-12-06 10:00:23.275866856 +0000 UTC m=+0.100936603 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:00:23 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.302 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.303 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7982 DF PROTO=TCP SPT=35292 DPT=9102 SEQ=3466537804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F8DBF00000000001030307) 
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.473 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.474 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.474 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.659 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.660 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.660 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.661 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.661 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.661 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:23.662 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.070 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.071 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.071 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.072 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.072 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.515 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.772 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.774 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12891MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.775 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.776 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.846 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.847 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:00:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:24.872 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:00:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:25.325 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:00:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:25.332 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:00:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:25.349 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:00:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:25.352 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:00:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:00:25.352 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20304 DF PROTO=TCP SPT=60210 DPT=9102 SEQ=272254140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F8E8700000000001030307) 
Dec 06 10:00:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:00:29 np0005548788.localdomain podman[282203]: 2025-12-06 10:00:29.24392527 +0000 UTC m=+0.070435450 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:00:29 np0005548788.localdomain podman[282203]: 2025-12-06 10:00:29.248798958 +0000 UTC m=+0.075309128 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:00:29 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:00:34 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20305 DF PROTO=TCP SPT=60210 DPT=9102 SEQ=272254140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F907F00000000001030307) 
Dec 06 10:00:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:00:36 np0005548788.localdomain podman[282221]: 2025-12-06 10:00:36.263967376 +0000 UTC m=+0.083308122 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec 06 10:00:36 np0005548788.localdomain podman[282221]: 2025-12-06 10:00:36.328765572 +0000 UTC m=+0.148106278 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Dec 06 10:00:36 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:00:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:00:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:00:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:00:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:00:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:00:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:00:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:00:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:00:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:00:43 np0005548788.localdomain podman[282247]: 2025-12-06 10:00:43.258869194 +0000 UTC m=+0.081683743 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:00:43 np0005548788.localdomain podman[282247]: 2025-12-06 10:00:43.295795536 +0000 UTC m=+0.118610105 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:00:43 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:00:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:00:44 np0005548788.localdomain podman[282265]: 2025-12-06 10:00:44.249847532 +0000 UTC m=+0.077867887 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:00:44 np0005548788.localdomain podman[282265]: 2025-12-06 10:00:44.282979167 +0000 UTC m=+0.110999472 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:00:44 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:00:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:00:46 np0005548788.localdomain systemd[1]: tmp-crun.GwPIAn.mount: Deactivated successfully.
Dec 06 10:00:46 np0005548788.localdomain podman[282288]: 2025-12-06 10:00:46.262364802 +0000 UTC m=+0.089340808 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Dec 06 10:00:46 np0005548788.localdomain podman[282288]: 2025-12-06 10:00:46.275506374 +0000 UTC m=+0.102482400 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 06 10:00:46 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:00:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:00:47.423 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:00:47.424 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:00:47.424 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:48 np0005548788.localdomain sudo[282308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:00:48 np0005548788.localdomain sudo[282308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:00:48 np0005548788.localdomain sudo[282308]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:48 np0005548788.localdomain sudo[282326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:00:48 np0005548788.localdomain sudo[282326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:00:49 np0005548788.localdomain sudo[282326]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54343 DF PROTO=TCP SPT=45732 DPT=9102 SEQ=1120928780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F941E40000000001030307) 
Dec 06 10:00:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:00:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:00:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:00:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 10:00:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:00:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16752 "" "Go-http-client/1.1"
Dec 06 10:00:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54344 DF PROTO=TCP SPT=45732 DPT=9102 SEQ=1120928780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F945F10000000001030307) 
Dec 06 10:00:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20306 DF PROTO=TCP SPT=60210 DPT=9102 SEQ=272254140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F947F00000000001030307) 
Dec 06 10:00:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:00:51 np0005548788.localdomain podman[282375]: 2025-12-06 10:00:51.265396681 +0000 UTC m=+0.087654456 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:00:51 np0005548788.localdomain podman[282375]: 2025-12-06 10:00:51.302350734 +0000 UTC m=+0.124608459 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:00:51 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:00:52 np0005548788.localdomain sudo[282393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:00:52 np0005548788.localdomain sudo[282393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:00:52 np0005548788.localdomain sudo[282393]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:52 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54345 DF PROTO=TCP SPT=45732 DPT=9102 SEQ=1120928780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F94DF00000000001030307) 
Dec 06 10:00:53 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2202 DF PROTO=TCP SPT=55384 DPT=9102 SEQ=1021444787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F951F00000000001030307) 
Dec 06 10:00:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:00:54 np0005548788.localdomain podman[282411]: 2025-12-06 10:00:54.258182621 +0000 UTC m=+0.083613152 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:00:54 np0005548788.localdomain podman[282411]: 2025-12-06 10:00:54.294807073 +0000 UTC m=+0.120237574 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:00:54 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:00:56 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54346 DF PROTO=TCP SPT=45732 DPT=9102 SEQ=1120928780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F95DB00000000001030307) 
Dec 06 10:01:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:01:00 np0005548788.localdomain systemd[1]: tmp-crun.3npOxY.mount: Deactivated successfully.
Dec 06 10:01:00 np0005548788.localdomain podman[282434]: 2025-12-06 10:01:00.267173577 +0000 UTC m=+0.093151194 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:01:00 np0005548788.localdomain podman[282434]: 2025-12-06 10:01:00.298526368 +0000 UTC m=+0.124503975 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:01:00 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:01:01 np0005548788.localdomain CROND[282455]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 10:01:01 np0005548788.localdomain run-parts[282458]: (/etc/cron.hourly) starting 0anacron
Dec 06 10:01:01 np0005548788.localdomain run-parts[282464]: (/etc/cron.hourly) finished 0anacron
Dec 06 10:01:01 np0005548788.localdomain CROND[282454]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 10:01:04 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54347 DF PROTO=TCP SPT=45732 DPT=9102 SEQ=1120928780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F97DF10000000001030307) 
Dec 06 10:01:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:01:07 np0005548788.localdomain podman[282465]: 2025-12-06 10:01:07.24559216 +0000 UTC m=+0.071881444 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:01:07 np0005548788.localdomain podman[282465]: 2025-12-06 10:01:07.313909072 +0000 UTC m=+0.140198376 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:01:07 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:01:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:01:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:01:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:01:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:01:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:01:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:01:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:01:09 np0005548788.localdomain sshd[282489]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:09 np0005548788.localdomain sshd[282489]: Accepted publickey for zuul from 38.102.83.114 port 33430 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:01:09 np0005548788.localdomain systemd-logind[765]: New session 61 of user zuul.
Dec 06 10:01:09 np0005548788.localdomain systemd[1]: Started Session 61 of User zuul.
Dec 06 10:01:09 np0005548788.localdomain sshd[282489]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:01:10 np0005548788.localdomain sudo[282509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptrifjmstxnguhvaguqwiavjbvkxvuki ; /usr/bin/python3
Dec 06 10:01:10 np0005548788.localdomain sudo[282509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:01:10 np0005548788.localdomain python3[282511]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 10:01:11 np0005548788.localdomain subscription-manager[282512]: Unregistered machine with identity: 5e361fc3-fe07-43ea-9bfe-d0f84e52c70c
Dec 06 10:01:12 np0005548788.localdomain sudo[282509]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:12 np0005548788.localdomain sshd[281949]: Connection closed by 45.78.219.195 port 54554 [preauth]
Dec 06 10:01:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:01:13 np0005548788.localdomain systemd[1]: tmp-crun.qFTvcH.mount: Deactivated successfully.
Dec 06 10:01:13 np0005548788.localdomain podman[282514]: 2025-12-06 10:01:13.714789014 +0000 UTC m=+0.096550219 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:01:13 np0005548788.localdomain podman[282514]: 2025-12-06 10:01:13.757945916 +0000 UTC m=+0.139707111 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:01:13 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:01:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:01:15 np0005548788.localdomain podman[282533]: 2025-12-06 10:01:15.26004517 +0000 UTC m=+0.084091997 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:01:15 np0005548788.localdomain podman[282533]: 2025-12-06 10:01:15.271055428 +0000 UTC m=+0.095102305 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:01:15 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:01:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:01:17 np0005548788.localdomain podman[282556]: 2025-12-06 10:01:17.244254942 +0000 UTC m=+0.066244070 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:01:17 np0005548788.localdomain systemd[1]: tmp-crun.2FPxcS.mount: Deactivated successfully.
Dec 06 10:01:17 np0005548788.localdomain podman[282556]: 2025-12-06 10:01:17.281819183 +0000 UTC m=+0.103808301 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64)
Dec 06 10:01:17 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:01:19 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49339 DF PROTO=TCP SPT=43844 DPT=9102 SEQ=1822743941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F9B7150000000001030307) 
Dec 06 10:01:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:01:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:01:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:01:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 10:01:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:01:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16752 "" "Go-http-client/1.1"
Dec 06 10:01:20 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49340 DF PROTO=TCP SPT=43844 DPT=9102 SEQ=1822743941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F9BB300000000001030307) 
Dec 06 10:01:21 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54348 DF PROTO=TCP SPT=45732 DPT=9102 SEQ=1120928780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F9BDF00000000001030307) 
Dec 06 10:01:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:01:22 np0005548788.localdomain podman[282577]: 2025-12-06 10:01:22.258989642 +0000 UTC m=+0.083507660 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:01:22 np0005548788.localdomain podman[282577]: 2025-12-06 10:01:22.27230929 +0000 UTC m=+0.096827338 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:01:22 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:01:22 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49341 DF PROTO=TCP SPT=43844 DPT=9102 SEQ=1822743941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F9C3300000000001030307) 
Dec 06 10:01:23 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20307 DF PROTO=TCP SPT=60210 DPT=9102 SEQ=272254140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F9C5F00000000001030307) 
Dec 06 10:01:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:23.353 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:23.354 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:23.354 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:23.355 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:23.355 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:01:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:24.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:24.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:01:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:24.007 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:01:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:24.035 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:01:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:24.036 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:25.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:25.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:01:25 np0005548788.localdomain podman[282594]: 2025-12-06 10:01:25.254561055 +0000 UTC m=+0.075942047 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:01:25 np0005548788.localdomain podman[282594]: 2025-12-06 10:01:25.288509266 +0000 UTC m=+0.109890338 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:01:25 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:01:25 np0005548788.localdomain sshd[282618]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.034 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.035 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.035 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.035 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.036 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.524 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:01:26 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49342 DF PROTO=TCP SPT=43844 DPT=9102 SEQ=1822743941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F9D2F00000000001030307) 
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.746 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.748 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12892MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.748 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.749 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.836 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.837 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:01:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:26.854 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:01:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:27.302 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:01:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:27.310 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:01:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:27.326 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:01:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:27.329 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:01:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:01:27.329 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:30 np0005548788.localdomain sshd[282618]: Received disconnect from 101.47.142.76 port 43888:11: Bye Bye [preauth]
Dec 06 10:01:30 np0005548788.localdomain sshd[282618]: Disconnected from authenticating user root 101.47.142.76 port 43888 [preauth]
Dec 06 10:01:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:01:31 np0005548788.localdomain podman[282664]: 2025-12-06 10:01:31.254327149 +0000 UTC m=+0.082698164 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:01:31 np0005548788.localdomain podman[282664]: 2025-12-06 10:01:31.284541185 +0000 UTC m=+0.112912180 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:01:31 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:01:33 np0005548788.localdomain sshd[282684]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:33 np0005548788.localdomain sshd[282684]: Received disconnect from 148.227.3.232 port 50286:11: Bye Bye [preauth]
Dec 06 10:01:33 np0005548788.localdomain sshd[282684]: Disconnected from authenticating user root 148.227.3.232 port 50286 [preauth]
Dec 06 10:01:35 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49343 DF PROTO=TCP SPT=43844 DPT=9102 SEQ=1822743941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5F9F3F00000000001030307) 
Dec 06 10:01:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:01:38 np0005548788.localdomain podman[282686]: 2025-12-06 10:01:38.258975206 +0000 UTC m=+0.085849181 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 06 10:01:38 np0005548788.localdomain podman[282686]: 2025-12-06 10:01:38.301714945 +0000 UTC m=+0.128588910 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:01:38 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:01:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:01:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:01:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:01:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:01:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:01:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:01:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:01:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:01:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:01:44 np0005548788.localdomain podman[282711]: 2025-12-06 10:01:44.264468734 +0000 UTC m=+0.090288017 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:01:44 np0005548788.localdomain podman[282711]: 2025-12-06 10:01:44.27704731 +0000 UTC m=+0.102866613 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 06 10:01:44 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:01:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:01:46 np0005548788.localdomain podman[282730]: 2025-12-06 10:01:46.262992066 +0000 UTC m=+0.088682288 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:01:46 np0005548788.localdomain podman[282730]: 2025-12-06 10:01:46.299725622 +0000 UTC m=+0.125415824 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:01:46 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:01:46 np0005548788.localdomain sudo[282753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:01:46 np0005548788.localdomain sudo[282753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:46 np0005548788.localdomain sudo[282753]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:46 np0005548788.localdomain sudo[282771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 10:01:46 np0005548788.localdomain sudo[282771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:47 np0005548788.localdomain podman[282831]: 
Dec 06 10:01:47 np0005548788.localdomain podman[282831]: 2025-12-06 10:01:47.118434651 +0000 UTC m=+0.080571049 container create 3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_colden, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, distribution-scope=public, release=1763362218, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: Started libpod-conmon-3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd.scope.
Dec 06 10:01:47 np0005548788.localdomain podman[282831]: 2025-12-06 10:01:47.085937296 +0000 UTC m=+0.048073724 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:01:47 np0005548788.localdomain podman[282831]: 2025-12-06 10:01:47.204818618 +0000 UTC m=+0.166955026 container init 3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_colden, version=7, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=1763362218, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:01:47 np0005548788.localdomain podman[282831]: 2025-12-06 10:01:47.222981284 +0000 UTC m=+0.185117682 container start 3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_colden, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, release=1763362218, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public)
Dec 06 10:01:47 np0005548788.localdomain sharp_colden[282847]: 167 167
Dec 06 10:01:47 np0005548788.localdomain podman[282831]: 2025-12-06 10:01:47.223334455 +0000 UTC m=+0.185470853 container attach 3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_colden, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, version=7, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, name=rhceph)
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: libpod-3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd.scope: Deactivated successfully.
Dec 06 10:01:47 np0005548788.localdomain podman[282831]: 2025-12-06 10:01:47.227068539 +0000 UTC m=+0.189204947 container died 3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_colden, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-59dda3760aacd3a18032a862a6b0c80be98fd01f35b7e0cb0fd48bc52965fd96-merged.mount: Deactivated successfully.
Dec 06 10:01:47 np0005548788.localdomain podman[282852]: 2025-12-06 10:01:47.350338876 +0000 UTC m=+0.115017395 container remove 3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_colden, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=1763362218, RELEASE=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=)
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: libpod-conmon-3c4fd7b3b166875829feeb272567691657632bcac68e41c65231b7d0928345bd.scope: Deactivated successfully.
Dec 06 10:01:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:01:47.424 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:01:47.425 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:01:47.426 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: tmp-crun.brkAYP.mount: Deactivated successfully.
Dec 06 10:01:47 np0005548788.localdomain podman[282864]: 2025-12-06 10:01:47.442655934 +0000 UTC m=+0.105773772 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Dec 06 10:01:47 np0005548788.localdomain podman[282864]: 2025-12-06 10:01:47.485610979 +0000 UTC m=+0.148728737 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:01:47 np0005548788.localdomain podman[282893]: 
Dec 06 10:01:47 np0005548788.localdomain podman[282893]: 2025-12-06 10:01:47.583836738 +0000 UTC m=+0.084866120 container create e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_shaw, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, ceph=True)
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: Started libpod-conmon-e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd.scope.
Dec 06 10:01:47 np0005548788.localdomain podman[282893]: 2025-12-06 10:01:47.545506175 +0000 UTC m=+0.046535547 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:01:47 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:01:47 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1635beb673a0b9c59680a37264249d97ca4d55e84ce7045864121d0c27ebece/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:47 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1635beb673a0b9c59680a37264249d97ca4d55e84ce7045864121d0c27ebece/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:47 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1635beb673a0b9c59680a37264249d97ca4d55e84ce7045864121d0c27ebece/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:47 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1635beb673a0b9c59680a37264249d97ca4d55e84ce7045864121d0c27ebece/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:47 np0005548788.localdomain podman[282893]: 2025-12-06 10:01:47.663941072 +0000 UTC m=+0.164970424 container init e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_shaw, architecture=x86_64, version=7, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, ceph=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.component=rhceph-container)
Dec 06 10:01:47 np0005548788.localdomain podman[282893]: 2025-12-06 10:01:47.674771824 +0000 UTC m=+0.175801176 container start e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_shaw, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, version=7, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, name=rhceph, ceph=True)
Dec 06 10:01:47 np0005548788.localdomain podman[282893]: 2025-12-06 10:01:47.675579719 +0000 UTC m=+0.176609081 container attach e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_shaw, GIT_CLEAN=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]: [
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:     {
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         "available": false,
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         "ceph_device": false,
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         "lsm_data": {},
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         "lvs": [],
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         "path": "/dev/sr0",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         "rejected_reasons": [
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "Has a FileSystem",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "Insufficient space (<5GB)"
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         ],
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         "sys_api": {
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "actuators": null,
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "device_nodes": "sr0",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "human_readable_size": "482.00 KB",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "id_bus": "ata",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "model": "QEMU DVD-ROM",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "nr_requests": "2",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "partitions": {},
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "path": "/dev/sr0",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "removable": "1",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "rev": "2.5+",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "ro": "0",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "rotational": "1",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "sas_address": "",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "sas_device_handle": "",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "scheduler_mode": "mq-deadline",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "sectors": 0,
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "sectorsize": "2048",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "size": 493568.0,
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "support_discard": "0",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "type": "disk",
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:             "vendor": "QEMU"
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:         }
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]:     }
Dec 06 10:01:48 np0005548788.localdomain stupefied_shaw[282909]: ]
Dec 06 10:01:48 np0005548788.localdomain systemd[1]: libpod-e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd.scope: Deactivated successfully.
Dec 06 10:01:48 np0005548788.localdomain systemd[1]: libpod-e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd.scope: Consumed 1.136s CPU time.
Dec 06 10:01:48 np0005548788.localdomain podman[282893]: 2025-12-06 10:01:48.757823201 +0000 UTC m=+1.258852563 container died e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_shaw, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, distribution-scope=public, GIT_CLEAN=True, release=1763362218, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=)
Dec 06 10:01:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e1635beb673a0b9c59680a37264249d97ca4d55e84ce7045864121d0c27ebece-merged.mount: Deactivated successfully.
Dec 06 10:01:48 np0005548788.localdomain podman[284838]: 2025-12-06 10:01:48.841247577 +0000 UTC m=+0.071238123 container remove e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_shaw, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1763362218, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, name=rhceph, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:01:48 np0005548788.localdomain systemd[1]: libpod-conmon-e948d7a6d9f554c23a7bee94b9f8b55e2771808f391c1849775379a95d0212bd.scope: Deactivated successfully.
Dec 06 10:01:48 np0005548788.localdomain sudo[282771]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:49 np0005548788.localdomain sudo[284851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:01:49 np0005548788.localdomain sudo[284851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:49 np0005548788.localdomain sudo[284851]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:49 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56443 DF PROTO=TCP SPT=36596 DPT=9102 SEQ=2389210300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5FA2C450000000001030307) 
Dec 06 10:01:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:01:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:01:49 np0005548788.localdomain sudo[284869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:01:49 np0005548788.localdomain sshd[284886]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:49 np0005548788.localdomain sudo[284869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:49 np0005548788.localdomain sudo[284869]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:01:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146547 "" "Go-http-client/1.1"
Dec 06 10:01:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:01:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16749 "" "Go-http-client/1.1"
Dec 06 10:01:49 np0005548788.localdomain sshd[284886]: Accepted publickey for tripleo-admin from 192.168.122.11 port 60892 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:01:49 np0005548788.localdomain sudo[284889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:01:49 np0005548788.localdomain systemd-logind[765]: New session 62 of user tripleo-admin.
Dec 06 10:01:49 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 06 10:01:49 np0005548788.localdomain sudo[284889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:49 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 06 10:01:49 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 06 10:01:49 np0005548788.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Queued start job for default target Main User Target.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Created slice User Application Slice.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Reached target Paths.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Reached target Timers.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Starting D-Bus User Message Bus Socket...
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Starting Create User's Volatile Files and Directories...
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Finished Create User's Volatile Files and Directories.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Listening on D-Bus User Message Bus Socket.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Reached target Sockets.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Reached target Basic System.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Reached target Main User Target.
Dec 06 10:01:49 np0005548788.localdomain systemd[284909]: Startup finished in 148ms.
Dec 06 10:01:49 np0005548788.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 06 10:01:50 np0005548788.localdomain systemd[1]: Started Session 62 of User tripleo-admin.
Dec 06 10:01:50 np0005548788.localdomain sshd[284886]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 10:01:50 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56444 DF PROTO=TCP SPT=36596 DPT=9102 SEQ=2389210300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5FA30300000000001030307) 
Dec 06 10:01:50 np0005548788.localdomain sudo[284889]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:50 np0005548788.localdomain sudo[285080]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjqsvolkwomdbxlzplqbjtcqcoqxbioi ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015310.099193-60030-144040143495667/AnsiballZ_blockinfile.py
Dec 06 10:01:50 np0005548788.localdomain sudo[285080]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:01:50 np0005548788.localdomain python3[285082]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 10:01:50 np0005548788.localdomain sudo[285080]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:50 np0005548788.localdomain systemd-journald[47853]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 80.5 (268 of 333 items), suggesting rotation.
Dec 06 10:01:50 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 10:01:50 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:01:50 np0005548788.localdomain sudo[285101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:01:50 np0005548788.localdomain sudo[285101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:50 np0005548788.localdomain sudo[285101]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:51 np0005548788.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:0d:33:99 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49344 DF PROTO=TCP SPT=43844 DPT=9102 SEQ=1822743941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD5FA33F00000000001030307) 
Dec 06 10:01:51 np0005548788.localdomain sudo[285243]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efghpwjisbgfmfaoivqcvaftsmfpobsx ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015310.9857383-60044-171345858446590/AnsiballZ_systemd.py
Dec 06 10:01:51 np0005548788.localdomain sudo[285243]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:01:51 np0005548788.localdomain python3[285245]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 10:01:52 np0005548788.localdomain systemd[1]: Stopping Netfilter Tables...
Dec 06 10:01:52 np0005548788.localdomain systemd[1]: nftables.service: Deactivated successfully.
Dec 06 10:01:52 np0005548788.localdomain systemd[1]: Stopped Netfilter Tables.
Dec 06 10:01:52 np0005548788.localdomain systemd[1]: Starting Netfilter Tables...
Dec 06 10:01:52 np0005548788.localdomain systemd[1]: Finished Netfilter Tables.
Dec 06 10:01:52 np0005548788.localdomain sudo[285243]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:01:53 np0005548788.localdomain podman[285270]: 2025-12-06 10:01:53.274770601 +0000 UTC m=+0.097247750 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:01:53 np0005548788.localdomain podman[285270]: 2025-12-06 10:01:53.316533331 +0000 UTC m=+0.139010450 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:01:53 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:01:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:01:56 np0005548788.localdomain podman[285288]: 2025-12-06 10:01:56.264149816 +0000 UTC m=+0.086752028 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:01:56 np0005548788.localdomain podman[285288]: 2025-12-06 10:01:56.303635156 +0000 UTC m=+0.126237378 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:01:56 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:02:00 np0005548788.localdomain sudo[285311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:00 np0005548788.localdomain sudo[285311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:00 np0005548788.localdomain sudo[285311]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:01 np0005548788.localdomain sudo[285329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:02:02 np0005548788.localdomain sudo[285329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:02 np0005548788.localdomain sudo[285329]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:02 np0005548788.localdomain podman[285346]: 2025-12-06 10:02:02.098029736 +0000 UTC m=+0.088248534 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:02:02 np0005548788.localdomain podman[285346]: 2025-12-06 10:02:02.103144733 +0000 UTC m=+0.093363551 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 06 10:02:02 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:02:03 np0005548788.localdomain sudo[285364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:03 np0005548788.localdomain sudo[285364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:03 np0005548788.localdomain sudo[285364]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:04 np0005548788.localdomain sudo[285382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:04 np0005548788.localdomain sudo[285382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:04 np0005548788.localdomain sudo[285382]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:05 np0005548788.localdomain sudo[285400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:05 np0005548788.localdomain sudo[285400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:05 np0005548788.localdomain sudo[285400]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:07 np0005548788.localdomain sudo[285418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:07 np0005548788.localdomain sudo[285418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:07 np0005548788.localdomain sudo[285418]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:02:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:02:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:02:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:02:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:02:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:02:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:02:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:02:09 np0005548788.localdomain podman[285436]: 2025-12-06 10:02:09.258225528 +0000 UTC m=+0.080842417 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:02:09 np0005548788.localdomain podman[285436]: 2025-12-06 10:02:09.302782073 +0000 UTC m=+0.125398982 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:02:09 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:02:12 np0005548788.localdomain sshd[282492]: Received disconnect from 38.102.83.114 port 33430:11: disconnected by user
Dec 06 10:02:12 np0005548788.localdomain sshd[282492]: Disconnected from user zuul 38.102.83.114 port 33430
Dec 06 10:02:12 np0005548788.localdomain sshd[282489]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:02:12 np0005548788.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Dec 06 10:02:12 np0005548788.localdomain systemd-logind[765]: Session 61 logged out. Waiting for processes to exit.
Dec 06 10:02:12 np0005548788.localdomain systemd-logind[765]: Removed session 61.
Dec 06 10:02:12 np0005548788.localdomain sudo[285459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:02:12 np0005548788.localdomain sudo[285459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:12 np0005548788.localdomain sudo[285459]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:13 np0005548788.localdomain sudo[285477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:02:13 np0005548788.localdomain sudo[285477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:13 np0005548788.localdomain podman[285538]: 
Dec 06 10:02:13 np0005548788.localdomain podman[285538]: 2025-12-06 10:02:13.680690896 +0000 UTC m=+0.081898665 container create 2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hermann, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main)
Dec 06 10:02:13 np0005548788.localdomain systemd[1]: Started libpod-conmon-2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee.scope.
Dec 06 10:02:13 np0005548788.localdomain podman[285538]: 2025-12-06 10:02:13.647742515 +0000 UTC m=+0.048950334 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:02:13 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:02:13 np0005548788.localdomain podman[285538]: 2025-12-06 10:02:13.763327323 +0000 UTC m=+0.164535102 container init 2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hermann, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, release=1763362218, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:02:13 np0005548788.localdomain podman[285538]: 2025-12-06 10:02:13.773398993 +0000 UTC m=+0.174606772 container start 2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hermann, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-type=git, version=7, release=1763362218, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Dec 06 10:02:13 np0005548788.localdomain podman[285538]: 2025-12-06 10:02:13.773613429 +0000 UTC m=+0.174821198 container attach 2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hermann, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:02:13 np0005548788.localdomain objective_hermann[285553]: 167 167
Dec 06 10:02:13 np0005548788.localdomain systemd[1]: libpod-2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee.scope: Deactivated successfully.
Dec 06 10:02:13 np0005548788.localdomain podman[285538]: 2025-12-06 10:02:13.780671947 +0000 UTC m=+0.181879716 container died 2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hermann, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.component=rhceph-container, architecture=x86_64, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:02:13 np0005548788.localdomain podman[285558]: 2025-12-06 10:02:13.89055371 +0000 UTC m=+0.095573415 container remove 2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hermann, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=1763362218, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph)
Dec 06 10:02:13 np0005548788.localdomain systemd[1]: libpod-conmon-2cf0336fc07943dee531a67f006f235213f690dc45ec2d27d6e9d41f5e0342ee.scope: Deactivated successfully.
Dec 06 10:02:13 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:02:14 np0005548788.localdomain systemd-rc-local-generator[285602]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:02:14 np0005548788.localdomain systemd-sysv-generator[285606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: tmp-crun.JjBXDZ.mount: Deactivated successfully.
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-95552c4d3c00eadb41c94c7bc08a745e726a1e37ca3f1c449bca7c35e393e330-merged.mount: Deactivated successfully.
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:02:14 np0005548788.localdomain podman[285612]: 2025-12-06 10:02:14.411639719 +0000 UTC m=+0.087173847 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:02:14 np0005548788.localdomain podman[285612]: 2025-12-06 10:02:14.423483123 +0000 UTC m=+0.099017211 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:02:14 np0005548788.localdomain systemd-sysv-generator[285661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:02:14 np0005548788.localdomain systemd-rc-local-generator[285658]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:02:14 np0005548788.localdomain systemd[1]: Starting Ceph mds.mds.np0005548788.erzujf for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 10:02:15 np0005548788.localdomain podman[285725]: 
Dec 06 10:02:15 np0005548788.localdomain podman[285725]: 2025-12-06 10:02:15.090963147 +0000 UTC m=+0.081197154 container create 381108e5c18c07e1e8f42d0ee892321e8c6383b57f8173e7a2b918b98d7d8c1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mds-mds-np0005548788-erzujf, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, ceph=True, architecture=x86_64)
Dec 06 10:02:15 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ae78d5494f286dea7686b6a0492e2a4f2ef122b6d8c27c1c4e9c768e5c9b66/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:02:15 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ae78d5494f286dea7686b6a0492e2a4f2ef122b6d8c27c1c4e9c768e5c9b66/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:02:15 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ae78d5494f286dea7686b6a0492e2a4f2ef122b6d8c27c1c4e9c768e5c9b66/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:02:15 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ae78d5494f286dea7686b6a0492e2a4f2ef122b6d8c27c1c4e9c768e5c9b66/merged/var/lib/ceph/mds/ceph-mds.np0005548788.erzujf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:02:15 np0005548788.localdomain podman[285725]: 2025-12-06 10:02:15.151779944 +0000 UTC m=+0.142013941 container init 381108e5c18c07e1e8f42d0ee892321e8c6383b57f8173e7a2b918b98d7d8c1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mds-mds-np0005548788-erzujf, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git)
Dec 06 10:02:15 np0005548788.localdomain podman[285725]: 2025-12-06 10:02:15.057483699 +0000 UTC m=+0.047717746 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:02:15 np0005548788.localdomain podman[285725]: 2025-12-06 10:02:15.159495921 +0000 UTC m=+0.149729948 container start 381108e5c18c07e1e8f42d0ee892321e8c6383b57f8173e7a2b918b98d7d8c1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mds-mds-np0005548788-erzujf, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, release=1763362218, version=7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, build-date=2025-11-26T19:44:28Z)
Dec 06 10:02:15 np0005548788.localdomain bash[285725]: 381108e5c18c07e1e8f42d0ee892321e8c6383b57f8173e7a2b918b98d7d8c1c
Dec 06 10:02:15 np0005548788.localdomain systemd[1]: Started Ceph mds.mds.np0005548788.erzujf for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:02:15 np0005548788.localdomain ceph-mds[285743]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 10:02:15 np0005548788.localdomain ceph-mds[285743]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Dec 06 10:02:15 np0005548788.localdomain ceph-mds[285743]: main not setting numa affinity
Dec 06 10:02:15 np0005548788.localdomain ceph-mds[285743]: pidfile_write: ignore empty --pid-file
Dec 06 10:02:15 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mds-mds-np0005548788-erzujf[285739]: starting mds.mds.np0005548788.erzujf at 
Dec 06 10:02:15 np0005548788.localdomain sudo[285477]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:15 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Updating MDS map to version 8 from mon.0
Dec 06 10:02:15 np0005548788.localdomain sudo[285762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:15 np0005548788.localdomain sudo[285762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:15 np0005548788.localdomain sudo[285762]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:15 np0005548788.localdomain sudo[285780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:02:15 np0005548788.localdomain sudo[285780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:15 np0005548788.localdomain sudo[285780]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:15 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Updating MDS map to version 9 from mon.0
Dec 06 10:02:15 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Monitors have assigned me to become a standby.
Dec 06 10:02:15 np0005548788.localdomain sudo[285799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:02:15 np0005548788.localdomain sudo[285799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:02:16 np0005548788.localdomain podman[285859]: 2025-12-06 10:02:16.669415382 +0000 UTC m=+0.090932014 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:02:16 np0005548788.localdomain podman[285859]: 2025-12-06 10:02:16.68762991 +0000 UTC m=+0.109146582 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:02:16 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:02:16 np0005548788.localdomain systemd[1]: tmp-crun.Ikmn8Q.mount: Deactivated successfully.
Dec 06 10:02:16 np0005548788.localdomain podman[285912]: 2025-12-06 10:02:16.877912233 +0000 UTC m=+0.101164867 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-26T19:44:28Z, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.41.4)
Dec 06 10:02:16 np0005548788.localdomain podman[285912]: 2025-12-06 10:02:16.976601403 +0000 UTC m=+0.199854067 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:02:17 np0005548788.localdomain sudo[285799]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:17 np0005548788.localdomain sudo[285996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:17 np0005548788.localdomain sudo[285996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:02:17 np0005548788.localdomain sudo[285996]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:17 np0005548788.localdomain podman[286013]: 2025-12-06 10:02:17.857753748 +0000 UTC m=+0.082981980 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:02:17 np0005548788.localdomain podman[286013]: 2025-12-06 10:02:17.900252932 +0000 UTC m=+0.125481144 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Dec 06 10:02:17 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:02:18 np0005548788.localdomain sudo[286034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:18 np0005548788.localdomain sudo[286034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:18 np0005548788.localdomain sudo[286034]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:02:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:02:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:02:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148695 "" "Go-http-client/1.1"
Dec 06 10:02:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:02:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17227 "" "Go-http-client/1.1"
Dec 06 10:02:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:02:24 np0005548788.localdomain podman[286052]: 2025-12-06 10:02:24.244413033 +0000 UTC m=+0.071867488 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:02:24 np0005548788.localdomain podman[286052]: 2025-12-06 10:02:24.25965578 +0000 UTC m=+0.087110245 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:02:24 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:02:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:24.324 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:24.325 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:24.343 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:24.344 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:24.344 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:24.345 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:02:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:25.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:25.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.030 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.030 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.053 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.053 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.054 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.054 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.054 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.510 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.694 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.695 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12868MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.695 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.696 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.787 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.788 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:02:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:26.813 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:02:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:02:27 np0005548788.localdomain podman[286113]: 2025-12-06 10:02:27.248582421 +0000 UTC m=+0.077663036 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:02:27 np0005548788.localdomain podman[286113]: 2025-12-06 10:02:27.256744722 +0000 UTC m=+0.085825367 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:02:27 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:02:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:27.284 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:02:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:27.293 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:02:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:27.316 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:02:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:27.319 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:02:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:27.320 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:02:28.295 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:31 np0005548788.localdomain sshd[286138]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:02:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:02:32 np0005548788.localdomain podman[286140]: 2025-12-06 10:02:32.260746963 +0000 UTC m=+0.086194137 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:02:32 np0005548788.localdomain podman[286140]: 2025-12-06 10:02:32.266666364 +0000 UTC m=+0.092113528 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:02:32 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:02:33 np0005548788.localdomain sshd[286138]: Received disconnect from 45.78.194.186 port 35464:11: Bye Bye [preauth]
Dec 06 10:02:33 np0005548788.localdomain sshd[286138]: Disconnected from authenticating user root 45.78.194.186 port 35464 [preauth]
Dec 06 10:02:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:02:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:02:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:02:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:02:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:02:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:02:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:02:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:02:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:02:40 np0005548788.localdomain podman[286158]: 2025-12-06 10:02:40.261253028 +0000 UTC m=+0.081735900 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 06 10:02:40 np0005548788.localdomain podman[286158]: 2025-12-06 10:02:40.299335738 +0000 UTC m=+0.119818600 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:02:40 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:02:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:02:45 np0005548788.localdomain podman[286183]: 2025-12-06 10:02:45.273165052 +0000 UTC m=+0.098517545 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:02:45 np0005548788.localdomain podman[286183]: 2025-12-06 10:02:45.288741561 +0000 UTC m=+0.114094054 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible)
Dec 06 10:02:45 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:02:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:02:47 np0005548788.localdomain podman[286203]: 2025-12-06 10:02:47.25429013 +0000 UTC m=+0.073874699 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:02:47 np0005548788.localdomain podman[286203]: 2025-12-06 10:02:47.288015416 +0000 UTC m=+0.107599995 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:02:47 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:02:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:02:47.425 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:02:47.425 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:02:47.426 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:02:48 np0005548788.localdomain podman[286228]: 2025-12-06 10:02:48.243532375 +0000 UTC m=+0.075538531 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible)
Dec 06 10:02:48 np0005548788.localdomain podman[286228]: 2025-12-06 10:02:48.254889233 +0000 UTC m=+0.086895409 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 06 10:02:48 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:02:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:02:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:02:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:02:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148695 "" "Go-http-client/1.1"
Dec 06 10:02:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:02:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17239 "" "Go-http-client/1.1"
Dec 06 10:02:51 np0005548788.localdomain sshd[284924]: Received disconnect from 192.168.122.11 port 60892:11: disconnected by user
Dec 06 10:02:51 np0005548788.localdomain sshd[284924]: Disconnected from user tripleo-admin 192.168.122.11 port 60892
Dec 06 10:02:51 np0005548788.localdomain sshd[284886]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 06 10:02:51 np0005548788.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Dec 06 10:02:51 np0005548788.localdomain systemd[1]: session-62.scope: Consumed 1.380s CPU time.
Dec 06 10:02:51 np0005548788.localdomain systemd-logind[765]: Session 62 logged out. Waiting for processes to exit.
Dec 06 10:02:51 np0005548788.localdomain systemd-logind[765]: Removed session 62.
Dec 06 10:02:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:02:55 np0005548788.localdomain podman[286248]: 2025-12-06 10:02:55.243034145 +0000 UTC m=+0.072371603 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd)
Dec 06 10:02:55 np0005548788.localdomain podman[286248]: 2025-12-06 10:02:55.253910119 +0000 UTC m=+0.083247567 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 06 10:02:55 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:02:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:02:58 np0005548788.localdomain podman[286266]: 2025-12-06 10:02:58.264070693 +0000 UTC m=+0.090331805 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:02:58 np0005548788.localdomain podman[286266]: 2025-12-06 10:02:58.300678357 +0000 UTC m=+0.126939479 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:02:58 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Activating special unit Exit the Session...
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Stopped target Main User Target.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Stopped target Basic System.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Stopped target Paths.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Stopped target Sockets.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Stopped target Timers.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Closed D-Bus User Message Bus Socket.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Stopped Create User's Volatile Files and Directories.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Removed slice User Application Slice.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Reached target Shutdown.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Finished Exit the Session.
Dec 06 10:03:01 np0005548788.localdomain systemd[284909]: Reached target Exit the Session.
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 06 10:03:01 np0005548788.localdomain systemd[1]: user-1003.slice: Consumed 1.742s CPU time.
Dec 06 10:03:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:03:03 np0005548788.localdomain podman[286291]: 2025-12-06 10:03:03.264700741 +0000 UTC m=+0.088561121 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:03:03 np0005548788.localdomain podman[286291]: 2025-12-06 10:03:03.271505599 +0000 UTC m=+0.095365989 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:03:03 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.494 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:03:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:08 np0005548788.localdomain sudo[286309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:08 np0005548788.localdomain sudo[286309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:08 np0005548788.localdomain sudo[286309]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:08 np0005548788.localdomain sudo[286327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:08 np0005548788.localdomain sudo[286327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:08 np0005548788.localdomain sudo[286327]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:08 np0005548788.localdomain sudo[286345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:03:08 np0005548788.localdomain sudo[286345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:03:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:03:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:03:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:03:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:03:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:03:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:03:09 np0005548788.localdomain sudo[286345]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:03:11 np0005548788.localdomain podman[286394]: 2025-12-06 10:03:11.25423619 +0000 UTC m=+0.079187603 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:03:11 np0005548788.localdomain podman[286394]: 2025-12-06 10:03:11.295679851 +0000 UTC m=+0.120631264 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:03:11 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:03:11 np0005548788.localdomain sudo[286419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:11 np0005548788.localdomain sudo[286419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:11 np0005548788.localdomain sudo[286419]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:11 np0005548788.localdomain sudo[286437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:11 np0005548788.localdomain sudo[286437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:11 np0005548788.localdomain sudo[286437]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:03:16 np0005548788.localdomain podman[286455]: 2025-12-06 10:03:16.256403295 +0000 UTC m=+0.082738631 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 10:03:16 np0005548788.localdomain podman[286455]: 2025-12-06 10:03:16.295789994 +0000 UTC m=+0.122125310 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:03:16 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:03:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:03:18 np0005548788.localdomain podman[286475]: 2025-12-06 10:03:18.252888044 +0000 UTC m=+0.079492512 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:03:18 np0005548788.localdomain podman[286475]: 2025-12-06 10:03:18.291615474 +0000 UTC m=+0.118219902 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:03:18 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:03:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:03:18 np0005548788.localdomain podman[286498]: 2025-12-06 10:03:18.427226197 +0000 UTC m=+0.090845230 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Dec 06 10:03:18 np0005548788.localdomain podman[286498]: 2025-12-06 10:03:18.44426365 +0000 UTC m=+0.107882633 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, vcs-type=git, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:03:18 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:03:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:03:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:03:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:03:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148695 "" "Go-http-client/1.1"
Dec 06 10:03:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:03:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17235 "" "Go-http-client/1.1"
Dec 06 10:03:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:22.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:22.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:03:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:22.026 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:03:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:22.028 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:22.029 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:03:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:22.042 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:24.047 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:25.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:25.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:25.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:03:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:26.008 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:03:26 np0005548788.localdomain podman[286519]: 2025-12-06 10:03:26.257816715 +0000 UTC m=+0.085839486 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:03:26 np0005548788.localdomain podman[286519]: 2025-12-06 10:03:26.272696352 +0000 UTC m=+0.100719083 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:03:26 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:03:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:27.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:27.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:03:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:27.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:03:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:27.021 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:03:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:27.022 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:27.023 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:27.023 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.026 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.028 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.028 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.491 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.752 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.754 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12875MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.754 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.754 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.879 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.880 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:03:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:28.948 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.007 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.008 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.031 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.049 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE4A,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.064 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:03:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:03:29 np0005548788.localdomain podman[286563]: 2025-12-06 10:03:29.265038769 +0000 UTC m=+0.087288971 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:03:29 np0005548788.localdomain podman[286563]: 2025-12-06 10:03:29.275692206 +0000 UTC m=+0.097942438 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:03:29 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.534 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.542 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.561 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.562 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:03:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:03:29.562 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.808s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:03:33 np0005548788.localdomain sudo[286607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:33 np0005548788.localdomain sudo[286607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:33 np0005548788.localdomain sudo[286607]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:33 np0005548788.localdomain sudo[286625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 10:03:33 np0005548788.localdomain sudo[286625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:03:33 np0005548788.localdomain sudo[286625]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:33 np0005548788.localdomain podman[286658]: 2025-12-06 10:03:33.694139428 +0000 UTC m=+0.091431568 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:03:33 np0005548788.localdomain podman[286658]: 2025-12-06 10:03:33.72771989 +0000 UTC m=+0.125012080 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 06 10:03:33 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:03:34 np0005548788.localdomain sudo[286682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:34 np0005548788.localdomain sudo[286682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:34 np0005548788.localdomain sudo[286682]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:34 np0005548788.localdomain sudo[286700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:34 np0005548788.localdomain sudo[286700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:34 np0005548788.localdomain sudo[286700]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:36 np0005548788.localdomain sudo[286718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:36 np0005548788.localdomain sudo[286718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:36 np0005548788.localdomain sudo[286718]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:37 np0005548788.localdomain sudo[286736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:37 np0005548788.localdomain sudo[286736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:37 np0005548788.localdomain sudo[286736]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:37 np0005548788.localdomain sudo[286754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:37 np0005548788.localdomain sudo[286754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:38 np0005548788.localdomain podman[286816]: 
Dec 06 10:03:38 np0005548788.localdomain podman[286816]: 2025-12-06 10:03:38.015997455 +0000 UTC m=+0.072756064 container create 9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_pascal, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main)
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: Started libpod-conmon-9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f.scope.
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:03:38 np0005548788.localdomain podman[286816]: 2025-12-06 10:03:37.984001592 +0000 UTC m=+0.040760171 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:03:38 np0005548788.localdomain podman[286816]: 2025-12-06 10:03:38.08975238 +0000 UTC m=+0.146510929 container init 9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_pascal, release=1763362218, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, ceph=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7)
Dec 06 10:03:38 np0005548788.localdomain podman[286816]: 2025-12-06 10:03:38.103344477 +0000 UTC m=+0.160103026 container start 9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_pascal, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Dec 06 10:03:38 np0005548788.localdomain podman[286816]: 2025-12-06 10:03:38.103608355 +0000 UTC m=+0.160366914 container attach 9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_pascal, description=Red Hat Ceph Storage 7, release=1763362218, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, ceph=True, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_BRANCH=main)
Dec 06 10:03:38 np0005548788.localdomain keen_pascal[286831]: 167 167
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: libpod-9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f.scope: Deactivated successfully.
Dec 06 10:03:38 np0005548788.localdomain podman[286816]: 2025-12-06 10:03:38.106404551 +0000 UTC m=+0.163163080 container died 9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_pascal, name=rhceph, GIT_BRANCH=main, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph)
Dec 06 10:03:38 np0005548788.localdomain podman[286836]: 2025-12-06 10:03:38.201728678 +0000 UTC m=+0.085975501 container remove 9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_pascal, release=1763362218, GIT_CLEAN=True, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph)
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: libpod-conmon-9f2a4164e1438964fecf663040fc1569a04b591aa3aa66e5243e0452989b2c8f.scope: Deactivated successfully.
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:03:38 np0005548788.localdomain systemd-sysv-generator[286879]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:03:38 np0005548788.localdomain systemd-rc-local-generator[286875]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e659efadb78e830155cac471f433be63f404486ba5ce553a13ee94bdc36ba474-merged.mount: Deactivated successfully.
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:03:38 np0005548788.localdomain systemd-rc-local-generator[286921]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:03:38 np0005548788.localdomain systemd-sysv-generator[286924]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:03:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:03:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:03:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:03:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:03:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:03:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:03:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:03:38 np0005548788.localdomain systemd[1]: Starting Ceph mgr.np0005548788.yvwbqq for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 10:03:39 np0005548788.localdomain podman[286981]: 
Dec 06 10:03:39 np0005548788.localdomain podman[286981]: 2025-12-06 10:03:39.350928452 +0000 UTC m=+0.071806125 container create f0e5abbd0eccf3653793798ce41e81f92483b6a70b5cb9f2bcd18994fe5c643c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-type=git)
Dec 06 10:03:39 np0005548788.localdomain systemd[1]: tmp-crun.DYUhSm.mount: Deactivated successfully.
Dec 06 10:03:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74f6f21cc0631f50dd0c6b9cee8d262ba9ef919d454d5f62464d69b6a4089240/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:03:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74f6f21cc0631f50dd0c6b9cee8d262ba9ef919d454d5f62464d69b6a4089240/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:03:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74f6f21cc0631f50dd0c6b9cee8d262ba9ef919d454d5f62464d69b6a4089240/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:03:39 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74f6f21cc0631f50dd0c6b9cee8d262ba9ef919d454d5f62464d69b6a4089240/merged/var/lib/ceph/mgr/ceph-np0005548788.yvwbqq supports timestamps until 2038 (0x7fffffff)
Dec 06 10:03:39 np0005548788.localdomain podman[286981]: 2025-12-06 10:03:39.410997767 +0000 UTC m=+0.131875430 container init f0e5abbd0eccf3653793798ce41e81f92483b6a70b5cb9f2bcd18994fe5c643c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1763362218, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, version=7, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main)
Dec 06 10:03:39 np0005548788.localdomain podman[286981]: 2025-12-06 10:03:39.417856707 +0000 UTC m=+0.138734370 container start f0e5abbd0eccf3653793798ce41e81f92483b6a70b5cb9f2bcd18994fe5c643c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4)
Dec 06 10:03:39 np0005548788.localdomain bash[286981]: f0e5abbd0eccf3653793798ce41e81f92483b6a70b5cb9f2bcd18994fe5c643c
Dec 06 10:03:39 np0005548788.localdomain podman[286981]: 2025-12-06 10:03:39.325638845 +0000 UTC m=+0.046516508 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:03:39 np0005548788.localdomain systemd[1]: Started Ceph mgr.np0005548788.yvwbqq for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:03:39 np0005548788.localdomain ceph-mgr[286998]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 10:03:39 np0005548788.localdomain ceph-mgr[286998]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Dec 06 10:03:39 np0005548788.localdomain ceph-mgr[286998]: pidfile_write: ignore empty --pid-file
Dec 06 10:03:39 np0005548788.localdomain sudo[286754]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:39 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'alerts'
Dec 06 10:03:39 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:03:39 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'balancer'
Dec 06 10:03:39 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:39.574+0000 7fde981c3140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:03:39 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:03:39 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'cephadm'
Dec 06 10:03:39 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:39.644+0000 7fde981c3140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:03:40 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'crash'
Dec 06 10:03:40 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:03:40 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'dashboard'
Dec 06 10:03:40 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:40.370+0000 7fde981c3140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:03:40 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'devicehealth'
Dec 06 10:03:40 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:03:40 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 10:03:40 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:40.946+0000 7fde981c3140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:03:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 10:03:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 10:03:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]:   from numpy import show_config as show_numpy_config
Dec 06 10:03:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:41.088+0000 7fde981c3140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'influx'
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:03:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:41.147+0000 7fde981c3140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'insights'
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'iostat'
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'k8sevents'
Dec 06 10:03:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:41.262+0000 7fde981c3140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'localpool'
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'mirroring'
Dec 06 10:03:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'nfs'
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'orchestrator'
Dec 06 10:03:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:42.038+0000 7fde981c3140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 10:03:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:42.186+0000 7fde981c3140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'osd_support'
Dec 06 10:03:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:42.252+0000 7fde981c3140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain podman[287028]: 2025-12-06 10:03:42.269008508 +0000 UTC m=+0.092386708 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:03:42 np0005548788.localdomain podman[287028]: 2025-12-06 10:03:42.311680028 +0000 UTC m=+0.135058228 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 10:03:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:42.311+0000 7fde981c3140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'progress'
Dec 06 10:03:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:42.379+0000 7fde981c3140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'prometheus'
Dec 06 10:03:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:42.438+0000 7fde981c3140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:42.753+0000 7fde981c3140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rbd_support'
Dec 06 10:03:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:42.840+0000 7fde981c3140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'restful'
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rgw'
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rook'
Dec 06 10:03:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:43.184+0000 7fde981c3140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'selftest'
Dec 06 10:03:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:43.609+0000 7fde981c3140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:43.669+0000 7fde981c3140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'snap_schedule'
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'stats'
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'status'
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'telegraf'
Dec 06 10:03:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:43.860+0000 7fde981c3140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'telemetry'
Dec 06 10:03:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:43.920+0000 7fde981c3140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:44.050+0000 7fde981c3140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:44.195+0000 7fde981c3140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'volumes'
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:44.388+0000 7fde981c3140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'zabbix'
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:03:44.445+0000 7fde981c3140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x557b4a2971e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Dec 06 10:03:44 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.103:6800/3108124117
Dec 06 10:03:44 np0005548788.localdomain sudo[287054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:44 np0005548788.localdomain sudo[287054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:44 np0005548788.localdomain sudo[287054]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:44 np0005548788.localdomain sudo[287072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:44 np0005548788.localdomain sudo[287072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:44 np0005548788.localdomain sudo[287072]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:44 np0005548788.localdomain sudo[287090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:03:44 np0005548788.localdomain sudo[287090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:45 np0005548788.localdomain podman[287178]: 2025-12-06 10:03:45.729390474 +0000 UTC m=+0.096528694 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True)
Dec 06 10:03:45 np0005548788.localdomain podman[287178]: 2025-12-06 10:03:45.872117216 +0000 UTC m=+0.239255436 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, vcs-type=git, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:03:46 np0005548788.localdomain sudo[287090]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:46 np0005548788.localdomain sudo[287281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:46 np0005548788.localdomain sudo[287281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:46 np0005548788.localdomain sudo[287281]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:03:46 np0005548788.localdomain podman[287299]: 2025-12-06 10:03:46.981292423 +0000 UTC m=+0.088526490 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:03:47 np0005548788.localdomain podman[287299]: 2025-12-06 10:03:47.019944579 +0000 UTC m=+0.127178656 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:03:47 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:03:47 np0005548788.localdomain sudo[287318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:47 np0005548788.localdomain sudo[287318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:47 np0005548788.localdomain sudo[287318]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:03:47.426 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:03:47.427 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:03:47.427 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:03:48 np0005548788.localdomain sudo[287336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:48 np0005548788.localdomain sudo[287336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:48 np0005548788.localdomain sudo[287336]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:03:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:03:48 np0005548788.localdomain podman[287354]: 2025-12-06 10:03:48.797532387 +0000 UTC m=+0.083884587 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:03:48 np0005548788.localdomain podman[287354]: 2025-12-06 10:03:48.806659668 +0000 UTC m=+0.093011888 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:03:48 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:03:48 np0005548788.localdomain podman[287355]: 2025-12-06 10:03:48.849837223 +0000 UTC m=+0.128536447 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible)
Dec 06 10:03:48 np0005548788.localdomain podman[287355]: 2025-12-06 10:03:48.893708031 +0000 UTC m=+0.172407215 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:03:48 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:03:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:03:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:03:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:03:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150831 "" "Go-http-client/1.1"
Dec 06 10:03:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:03:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17725 "" "Go-http-client/1.1"
Dec 06 10:03:49 np0005548788.localdomain sudo[287397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:03:49 np0005548788.localdomain sudo[287397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:49 np0005548788.localdomain sudo[287397]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:49 np0005548788.localdomain sudo[287415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:03:49 np0005548788.localdomain sudo[287415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:49 np0005548788.localdomain sudo[287415]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:49 np0005548788.localdomain sudo[287433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:03:49 np0005548788.localdomain sudo[287433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:49 np0005548788.localdomain sudo[287433]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:49 np0005548788.localdomain sudo[287451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:49 np0005548788.localdomain sudo[287451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:49 np0005548788.localdomain sudo[287451]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:03:50 np0005548788.localdomain sudo[287469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287469]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:03:50 np0005548788.localdomain sudo[287503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287503]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:03:50 np0005548788.localdomain sudo[287521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287521]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:03:50 np0005548788.localdomain sudo[287539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287539]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:03:50 np0005548788.localdomain sudo[287557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287557]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:03:50 np0005548788.localdomain sudo[287575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287575]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:03:50 np0005548788.localdomain sudo[287593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287593]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:50 np0005548788.localdomain sudo[287611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287611]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:03:50 np0005548788.localdomain sudo[287629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287629]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:03:50 np0005548788.localdomain sudo[287663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287663]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:03:50 np0005548788.localdomain sudo[287681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287681]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:50 np0005548788.localdomain sudo[287699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:03:50 np0005548788.localdomain sudo[287699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:50 np0005548788.localdomain sudo[287699]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:03:51 np0005548788.localdomain sudo[287717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287717]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:03:51 np0005548788.localdomain sudo[287735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287735]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:03:51 np0005548788.localdomain sudo[287753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287753]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:51 np0005548788.localdomain sudo[287771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287771]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:03:51 np0005548788.localdomain sudo[287789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287789]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:03:51 np0005548788.localdomain sudo[287823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287823]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:03:51 np0005548788.localdomain sudo[287841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287841]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:03:51 np0005548788.localdomain sudo[287859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287859]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:03:51 np0005548788.localdomain sudo[287877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287877]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:03:51 np0005548788.localdomain sudo[287895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287895]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 np0005548788.localdomain sudo[287913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:03:51 np0005548788.localdomain sudo[287913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:51 np0005548788.localdomain sudo[287913]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:52 np0005548788.localdomain sudo[287931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:52 np0005548788.localdomain sudo[287931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:52 np0005548788.localdomain sudo[287931]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:52 np0005548788.localdomain sudo[287949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:03:52 np0005548788.localdomain sudo[287949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:52 np0005548788.localdomain sudo[287949]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:52 np0005548788.localdomain sudo[287983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:03:52 np0005548788.localdomain sudo[287983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:52 np0005548788.localdomain sudo[287983]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:52 np0005548788.localdomain sudo[288001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:03:52 np0005548788.localdomain sudo[288001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:52 np0005548788.localdomain sudo[288001]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:52 np0005548788.localdomain sudo[288019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:03:52 np0005548788.localdomain sudo[288019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:52 np0005548788.localdomain sudo[288019]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:52 np0005548788.localdomain sudo[288037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:52 np0005548788.localdomain sudo[288037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:52 np0005548788.localdomain sudo[288037]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:56 np0005548788.localdomain sudo[288055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:56 np0005548788.localdomain sudo[288055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:56 np0005548788.localdomain sudo[288055]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:03:57 np0005548788.localdomain podman[288073]: 2025-12-06 10:03:57.255432976 +0000 UTC m=+0.084073823 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:03:57 np0005548788.localdomain podman[288073]: 2025-12-06 10:03:57.267542828 +0000 UTC m=+0.096183755 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:03:57 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:03:59 np0005548788.localdomain sshd[288093]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:04:00 np0005548788.localdomain podman[288095]: 2025-12-06 10:04:00.240981783 +0000 UTC m=+0.069711212 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:04:00 np0005548788.localdomain sshd[288093]: Received disconnect from 148.227.3.232 port 43364:11: Bye Bye [preauth]
Dec 06 10:04:00 np0005548788.localdomain sshd[288093]: Disconnected from authenticating user root 148.227.3.232 port 43364 [preauth]
Dec 06 10:04:00 np0005548788.localdomain podman[288095]: 2025-12-06 10:04:00.255598491 +0000 UTC m=+0.084327920 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:04:00 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:04:02 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x557b4a2971e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Dec 06 10:04:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:04:04 np0005548788.localdomain podman[288116]: 2025-12-06 10:04:04.254008489 +0000 UTC m=+0.078972996 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:04:04 np0005548788.localdomain podman[288116]: 2025-12-06 10:04:04.2885611 +0000 UTC m=+0.113525647 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 06 10:04:04 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:04:07 np0005548788.localdomain ceph-mds[285743]: mds.beacon.mds.np0005548788.erzujf missed beacon ack from the monitors
Dec 06 10:04:07 np0005548788.localdomain sudo[288134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:07 np0005548788.localdomain sudo[288134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:07 np0005548788.localdomain sudo[288134]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:07 np0005548788.localdomain sudo[288152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:07 np0005548788.localdomain sudo[288152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:07 np0005548788.localdomain podman[288215]: 
Dec 06 10:04:07 np0005548788.localdomain podman[288215]: 2025-12-06 10:04:07.986415898 +0000 UTC m=+0.084067272 container create 5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_hermann, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git)
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: Started libpod-conmon-5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba.scope.
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:08 np0005548788.localdomain podman[288215]: 2025-12-06 10:04:07.952120405 +0000 UTC m=+0.049771829 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:08 np0005548788.localdomain podman[288215]: 2025-12-06 10:04:08.059475861 +0000 UTC m=+0.157127235 container init 5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_hermann, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7)
Dec 06 10:04:08 np0005548788.localdomain podman[288215]: 2025-12-06 10:04:08.070057396 +0000 UTC m=+0.167708770 container start 5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_hermann, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vendor=Red Hat, Inc., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1763362218, vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Dec 06 10:04:08 np0005548788.localdomain podman[288215]: 2025-12-06 10:04:08.070537741 +0000 UTC m=+0.168189115 container attach 5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_hermann, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, release=1763362218, build-date=2025-11-26T19:44:28Z)
Dec 06 10:04:08 np0005548788.localdomain priceless_hermann[288230]: 167 167
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: libpod-5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba.scope: Deactivated successfully.
Dec 06 10:04:08 np0005548788.localdomain podman[288215]: 2025-12-06 10:04:08.07538118 +0000 UTC m=+0.173032614 container died 5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_hermann, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:04:08 np0005548788.localdomain podman[288235]: 2025-12-06 10:04:08.177530866 +0000 UTC m=+0.090176390 container remove 5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_hermann, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2025-11-26T19:44:28Z, release=1763362218, version=7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.41.4, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, vcs-type=git, description=Red Hat Ceph Storage 7)
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: libpod-conmon-5f46f052637299359f8307280ecc82a63485add41ebec19b164630abe9e3f4ba.scope: Deactivated successfully.
Dec 06 10:04:08 np0005548788.localdomain podman[288251]: 
Dec 06 10:04:08 np0005548788.localdomain podman[288251]: 2025-12-06 10:04:08.295714624 +0000 UTC m=+0.080484441 container create 57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_ramanujan, RELEASE=main, GIT_CLEAN=True, ceph=True, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: Started libpod-conmon-57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad.scope.
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68eb797fc99f5300cb8e7dada81f9df018b5bbd1a8c87217e01023344e9d28d4/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:08 np0005548788.localdomain podman[288251]: 2025-12-06 10:04:08.260755481 +0000 UTC m=+0.045525338 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68eb797fc99f5300cb8e7dada81f9df018b5bbd1a8c87217e01023344e9d28d4/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68eb797fc99f5300cb8e7dada81f9df018b5bbd1a8c87217e01023344e9d28d4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68eb797fc99f5300cb8e7dada81f9df018b5bbd1a8c87217e01023344e9d28d4/merged/var/lib/ceph/mon/ceph-np0005548788 supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:08 np0005548788.localdomain podman[288251]: 2025-12-06 10:04:08.368907592 +0000 UTC m=+0.153677419 container init 57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_ramanujan, release=1763362218, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:04:08 np0005548788.localdomain podman[288251]: 2025-12-06 10:04:08.377562488 +0000 UTC m=+0.162332315 container start 57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_ramanujan, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1763362218)
Dec 06 10:04:08 np0005548788.localdomain podman[288251]: 2025-12-06 10:04:08.377900138 +0000 UTC m=+0.162669955 container attach 57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_ramanujan, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: libpod-57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad.scope: Deactivated successfully.
Dec 06 10:04:08 np0005548788.localdomain podman[288251]: 2025-12-06 10:04:08.48283292 +0000 UTC m=+0.267602767 container died 57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_ramanujan, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:04:08 np0005548788.localdomain podman[288292]: 2025-12-06 10:04:08.582633114 +0000 UTC m=+0.086230979 container remove 57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_ramanujan, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: libpod-conmon-57b630fc08925619d9716c33d553f36e7d963b4307cec979c92cf5fdd12d9cad.scope: Deactivated successfully.
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:04:08 np0005548788.localdomain systemd-sysv-generator[288332]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:04:08 np0005548788.localdomain systemd-rc-local-generator[288327]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:04:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:04:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:04:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:04:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:04:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:04:08 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x557b4a296f20 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Dec 06 10:04:08 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d2977d1344401c6fc7496465e7ca9e77c3f67ed17ffae1ed361f8033581dc44f-merged.mount: Deactivated successfully.
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:04:09 np0005548788.localdomain systemd-sysv-generator[288371]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:04:09 np0005548788.localdomain systemd-rc-local-generator[288367]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: Starting Ceph mon.np0005548788 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 10:04:09 np0005548788.localdomain podman[288437]: 
Dec 06 10:04:09 np0005548788.localdomain podman[288437]: 2025-12-06 10:04:09.783116033 +0000 UTC m=+0.086390213 container create 31023b3d2b24e86b720cbbaa8472e97c6b474f4bfb80ea5ff5aaf6796d2126bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548788, CEPH_POINT_RELEASE=, release=1763362218, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public)
Dec 06 10:04:09 np0005548788.localdomain podman[288437]: 2025-12-06 10:04:09.745082986 +0000 UTC m=+0.048357266 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd8ca3e55372ce01e34985d9deb70439c47e87b0479d8be96656e82b6d0781b0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd8ca3e55372ce01e34985d9deb70439c47e87b0479d8be96656e82b6d0781b0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd8ca3e55372ce01e34985d9deb70439c47e87b0479d8be96656e82b6d0781b0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:09 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd8ca3e55372ce01e34985d9deb70439c47e87b0479d8be96656e82b6d0781b0/merged/var/lib/ceph/mon/ceph-np0005548788 supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:09 np0005548788.localdomain podman[288437]: 2025-12-06 10:04:09.851874985 +0000 UTC m=+0.155149165 container init 31023b3d2b24e86b720cbbaa8472e97c6b474f4bfb80ea5ff5aaf6796d2126bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548788, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, RELEASE=main, io.buildah.version=1.41.4, architecture=x86_64, release=1763362218, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7)
Dec 06 10:04:09 np0005548788.localdomain podman[288437]: 2025-12-06 10:04:09.861810119 +0000 UTC m=+0.165084309 container start 31023b3d2b24e86b720cbbaa8472e97c6b474f4bfb80ea5ff5aaf6796d2126bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548788, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, ceph=True, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, architecture=x86_64)
Dec 06 10:04:09 np0005548788.localdomain bash[288437]: 31023b3d2b24e86b720cbbaa8472e97c6b474f4bfb80ea5ff5aaf6796d2126bb
Dec 06 10:04:09 np0005548788.localdomain systemd[1]: Started Ceph mon.np0005548788 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: pidfile_write: ignore empty --pid-file
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: load: jerasure load: lrc 
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: RocksDB version: 7.9.2
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Git sha 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: DB SUMMARY
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: DB Session ID:  UD3OCE900E9OLE8LKH8M
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: CURRENT file:  CURRENT
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005548788/store.db dir, Total Num: 0, files: 
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005548788/store.db: 000004.log size: 886 ; 
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                         Options.error_if_exists: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                       Options.create_if_missing: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                                     Options.env: 0x55853be119e0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                                Options.info_log: 0x55853c668d20
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                              Options.statistics: (nil)
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                               Options.use_fsync: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                              Options.db_log_dir: 
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                                 Options.wal_dir: 
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 10:04:09 np0005548788.localdomain sudo[288152]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                    Options.write_buffer_manager: 0x55853c679540
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.unordered_write: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                               Options.row_cache: None
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                              Options.wal_filter: None
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.two_write_queues: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.wal_compression: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.atomic_flush: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.max_background_jobs: 2
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.max_background_compactions: -1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.max_subcompactions: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.max_total_wal_size: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                          Options.max_open_files: -1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:       Options.compaction_readahead_size: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Compression algorithms supported:
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         kZSTD supported: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         kXpressCompression supported: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         kBZip2Compression supported: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         kLZ4Compression supported: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         kZlibCompression supported: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         kSnappyCompression supported: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005548788/store.db/MANIFEST-000005
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:           Options.merge_operator: 
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:        Options.compaction_filter: None
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55853c668980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x55853c665350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:        Options.write_buffer_size: 33554432
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:  Options.max_write_buffer_number: 2
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:          Options.compression: NoCompression
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.num_levels: 7
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                           Options.bloom_locality: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                               Options.ttl: 2592000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                       Options.enable_blob_files: false
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                           Options.min_blob_size: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005548788/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f5ec553c-4bc8-419f-b463-0a44a503b01c
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015449922656, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015449924947, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015449, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f5ec553c-4bc8-419f-b463-0a44a503b01c", "db_session_id": "UD3OCE900E9OLE8LKH8M", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015449925069, "job": 1, "event": "recovery_finished"}
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55853c68ce00
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: DB pointer 0x55853c782000
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788 does not exist in monmap, will attempt to join an existing cluster
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.96 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.96 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x55853c665350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.6e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: using public_addr v2:172.18.0.106:0/0 -> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0]
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: starting mon.np0005548788 rank -1 at public addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] at bind addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005548788 fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(???) e0 preinit fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing) e5 sync_obtain_latest_monmap
Dec 06 10:04:09 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing) e5 sync_obtain_latest_monmap obtained monmap e5
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing).mds e16 new map
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        16
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-12-06T08:18:49.925523+0000
                                                           modified        2025-12-06T10:03:02.051468+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        87
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26356}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26356 members: 26356
                                                           [mds.mds.np0005548790.vhcezv{0:26356} state up:active seq 16 addr [v2:172.18.0.108:6808/1621657194,v1:172.18.0.108:6809/1621657194] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005548789.vxwwsq{-1:16884} state up:standby seq 1 addr [v2:172.18.0.107:6808/3033303281,v1:172.18.0.107:6809/3033303281] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005548788.erzujf{-1:16890} state up:standby seq 1 addr [v2:172.18.0.106:6808/309324236,v1:172.18.0.106:6809/309324236] compat {c=[1],r=[1],i=[17ff]}]
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing).osd e87 crush map has features 3314933000854323200, adjusting msgr requires
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing).osd e87 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing).osd e87 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing).osd e87 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3859: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17040 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Deploying daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17061 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548785.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label mon to host np0005548785.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3860: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17067 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548785.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label _admin to host np0005548785.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Deploying daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3861: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17079 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548786.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label mon to host np0005548786.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17085 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548786.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label _admin to host np0005548786.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Standby manager daemon np0005548788.yvwbqq started
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3862: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mgrmap e11: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17097 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548787.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label mon to host np0005548787.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Standby manager daemon np0005548789.mzhmje started
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3863: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17103 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548787.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label _admin to host np0005548787.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mgrmap e12: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548789.mzhmje
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17109 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548788.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label mon to host np0005548788.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Standby manager daemon np0005548790.kvkfyr started
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3864: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mgrmap e13: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17115 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548788.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label _admin to host np0005548788.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3865: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17121 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548789.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label mon to host np0005548789.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17127 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548789.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label _admin to host np0005548789.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3866: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17133 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548790.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label mon to host np0005548790.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3867: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17139 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548790.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Added label _admin to host np0005548790.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17145 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Saving service mon spec with placement label:mon
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3868: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='client.17151 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3869: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Deploying daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3870: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Deploying daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548785"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548785 calling monitor election
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548787 calling monitor election
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548786 calling monitor election
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3871: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548790 calling monitor election
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3872: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548785 is new leader, mons np0005548785,np0005548787,np0005548786,np0005548790 in quorum (ranks 0,1,2,3)
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: monmap epoch 4
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: last_changed 2025-12-06T10:04:02.181213+0000
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: min_mon_release 18 (reef)
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: election_strategy: 1
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548785
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: osdmap e87: 6 total, 6 up, 6 in
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mgrmap e13: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: overall HEALTH_OK
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: pgmap v3873: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: Deploying daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:10 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Dec 06 10:04:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:04:13 np0005548788.localdomain podman[288494]: 2025-12-06 10:04:13.271932413 +0000 UTC m=+0.094463092 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 06 10:04:13 np0005548788.localdomain podman[288494]: 2025-12-06 10:04:13.342949203 +0000 UTC m=+0.165479872 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:04:13 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:04:14 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x557b4a296f20 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Dec 06 10:04:14 np0005548788.localdomain sudo[288519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:04:14 np0005548788.localdomain sudo[288519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:14 np0005548788.localdomain sudo[288519]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:14 np0005548788.localdomain sudo[288537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:14 np0005548788.localdomain sudo[288537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:14 np0005548788.localdomain sudo[288537]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:14 np0005548788.localdomain sudo[288555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:04:14 np0005548788.localdomain sudo[288555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:14 np0005548788.localdomain sshd[288573]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:15 np0005548788.localdomain podman[288645]: 2025-12-06 10:04:15.706392149 +0000 UTC m=+0.097282588 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, RELEASE=main, release=1763362218, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7)
Dec 06 10:04:15 np0005548788.localdomain podman[288645]: 2025-12-06 10:04:15.859801509 +0000 UTC m=+0.250691968 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, GIT_BRANCH=main, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 06 10:04:16 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@-1(probing) e6  my rank is now 5 (was -1)
Dec 06 10:04:16 np0005548788.localdomain ceph-mon[288455]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:04:16 np0005548788.localdomain ceph-mon[288455]: paxos.5).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 06 10:04:16 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:16 np0005548788.localdomain sudo[288555]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:04:17 np0005548788.localdomain podman[288763]: 2025-12-06 10:04:17.276494447 +0000 UTC m=+0.095952587 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:04:17 np0005548788.localdomain podman[288763]: 2025-12-06 10:04:17.28861942 +0000 UTC m=+0.108077570 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 06 10:04:17 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:04:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:04:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:04:19 np0005548788.localdomain ceph-mds[285743]: mds.beacon.mds.np0005548788.erzujf missed beacon ack from the monitors
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: pgmap v3874: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548785"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548785 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548787 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548790 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548786 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: pgmap v3875: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548789 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: pgmap v3876: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548785 is new leader, mons np0005548785,np0005548787,np0005548786,np0005548790,np0005548789 in quorum (ranks 0,1,2,3,4)
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: monmap epoch 5
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: last_changed 2025-12-06T10:04:08.937568+0000
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: min_mon_release 18 (reef)
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: election_strategy: 1
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548785
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: osdmap e87: 6 total, 6 up, 6 in
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mgrmap e13: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: overall HEALTH_OK
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mgrc update_daemon_metadata mon.np0005548788 metadata {addrs=[v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005548788.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005548788.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116604,os=Linux}
Dec 06 10:04:19 np0005548788.localdomain podman[288783]: 2025-12-06 10:04:19.337398584 +0000 UTC m=+0.167655639 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:04:19 np0005548788.localdomain podman[288783]: 2025-12-06 10:04:19.348806914 +0000 UTC m=+0.179063959 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:04:19 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:04:19 np0005548788.localdomain sshd[288573]: Received disconnect from 101.47.142.76 port 52608:11: Bye Bye [preauth]
Dec 06 10:04:19 np0005548788.localdomain sshd[288573]: Disconnected from authenticating user root 101.47.142.76 port 52608 [preauth]
Dec 06 10:04:19 np0005548788.localdomain podman[288784]: 2025-12-06 10:04:19.29917265 +0000 UTC m=+0.126400522 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548785"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548785 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548786 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548790 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548787 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548789 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='client.17165 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: pgmap v3877: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788 calling monitor election
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: pgmap v3878: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: pgmap v3879: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mon.np0005548785 is new leader, mons np0005548785,np0005548787,np0005548786,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3,4,5)
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: monmap epoch 6
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: last_changed 2025-12-06T10:04:14.235362+0000
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: min_mon_release 18 (reef)
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: election_strategy: 1
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548785
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005548788
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: osdmap e87: 6 total, 6 up, 6 in
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: mgrmap e13: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: overall HEALTH_OK
Dec 06 10:04:19 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:19 np0005548788.localdomain podman[288784]: 2025-12-06 10:04:19.42910397 +0000 UTC m=+0.256331862 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Dec 06 10:04:19 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:04:19 np0005548788.localdomain sudo[288826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:19 np0005548788.localdomain sudo[288826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:19 np0005548788.localdomain sudo[288826]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:04:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:04:19 np0005548788.localdomain sudo[288844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:04:19 np0005548788.localdomain sudo[288844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:04:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:04:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:04:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18207 "" "Go-http-client/1.1"
Dec 06 10:04:20 np0005548788.localdomain sudo[288844]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:20 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:20 np0005548788.localdomain sudo[288894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:04:20 np0005548788.localdomain sudo[288894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548788.localdomain sudo[288894]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548788.localdomain sudo[288912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:04:20 np0005548788.localdomain sudo[288912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548788.localdomain sudo[288912]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548788.localdomain sudo[288930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:20 np0005548788.localdomain sudo[288930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548788.localdomain sudo[288930]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548788.localdomain sudo[288948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:20 np0005548788.localdomain sudo[288948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548788.localdomain sudo[288948]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548788.localdomain sudo[288966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:20 np0005548788.localdomain sudo[288966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548788.localdomain sudo[288966]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548788.localdomain sudo[289000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:20 np0005548788.localdomain sudo[289000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548788.localdomain sudo[289000]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548788.localdomain sudo[289018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:20 np0005548788.localdomain sudo[289018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548788.localdomain sudo[289018]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:04:21 np0005548788.localdomain sudo[289036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289036]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:21 np0005548788.localdomain sudo[289054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289054]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:21 np0005548788.localdomain sudo[289072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289072]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:21 np0005548788.localdomain sudo[289090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289090]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:21 np0005548788.localdomain sudo[289108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289108]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:21 np0005548788.localdomain sudo[289126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289126]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:21 np0005548788.localdomain sudo[289160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289160]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:21 np0005548788.localdomain sudo[289178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289178]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548788.localdomain sudo[289196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:21 np0005548788.localdomain sudo[289196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548788.localdomain sudo[289196]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548785.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: pgmap v3880: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='client.34103 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:22 np0005548788.localdomain sudo[289214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:04:22 np0005548788.localdomain sudo[289214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:22 np0005548788.localdomain sudo[289214]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:23 np0005548788.localdomain ceph-mon[288455]: from='client.17178 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548789", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:23 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:23 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:23 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548785 (monmap changed)...
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mon.np0005548785 on np0005548785.localdomain
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: pgmap v3881: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mgr.np0005548785.vhqlsq (monmap changed)...
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548785.vhqlsq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mgr.np0005548785.vhqlsq on np0005548785.localdomain
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.107:0/2303863447' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:24 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='client.17190 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548790", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548785 (monmap changed)...
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548785.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548785 on np0005548785.localdomain
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.103:0/3114499803' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:25 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: pgmap v3882: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.107:0/4040142409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:26 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:27 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:04:27 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:04:27 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.103:0/1185796831' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 06 10:04:27 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:27 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:27 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:27 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:27 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:27.559 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:27.560 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:27.561 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:27.561 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:27.561 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:04:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:28.002 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:04:28 np0005548788.localdomain podman[289232]: 2025-12-06 10:04:28.27237401 +0000 UTC m=+0.096591666 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:04:28 np0005548788.localdomain podman[289232]: 2025-12-06 10:04:28.31599743 +0000 UTC m=+0.140215076 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon).osd e87 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon).osd e87 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon).osd e88 e88: 6 total, 6 up, 6 in
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr handle_mgr_map Activating!
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr handle_mgr_map I am now activating
Dec 06 10:04:28 np0005548788.localdomain sshd[26359]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain sshd[26323]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain sshd[26209]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain sshd[26340]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain sshd[26285]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain sshd[26168]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain sshd[26247]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-26.scope: Consumed 3min 35.345s CPU time.
Dec 06 10:04:28 np0005548788.localdomain sshd[26228]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain sshd[26266]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain sshd[26304]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain sshd[26190]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain sshd[26151]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 26 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-16.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 18 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 24 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 21 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 25 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 16 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 20 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: pgmap v3883: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.108:0/2694903603' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: Activating manager daemon np0005548788.yvwbqq
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.103:0/899954398' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:04:28 np0005548788.localdomain ceph-mon[288455]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 14 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 23 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 19 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 22 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Session 17 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 26.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 18.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 24.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 25.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 16.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 17.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 22.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 23.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 14.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 20.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 21.
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: Removed session 19.
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: balancer
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Starting
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Optimize plan auto_2025-12-06_10:04:28
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Dec 06 10:04:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:28.505 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:28.505 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:04:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:28.506 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: cephadm
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: crash
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: devicehealth
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [devicehealth INFO root] Starting
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: iostat
Dec 06 10:04:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:28.520 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:04:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:28.520 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:28.521 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: nfs
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: orchestrator
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: pg_autoscaler
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: progress
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] _maybe_adjust
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Loading...
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7fde1f34e5e0>, <progress.module.GhostEvent object at 0x7fde1f34e970>, <progress.module.GhostEvent object at 0x7fde1f34e9a0>, <progress.module.GhostEvent object at 0x7fde1f34e9d0>, <progress.module.GhostEvent object at 0x7fde1f34ea00>, <progress.module.GhostEvent object at 0x7fde1f34ea30>, <progress.module.GhostEvent object at 0x7fde1f34ea60>, <progress.module.GhostEvent object at 0x7fde1f34ea90>, <progress.module.GhostEvent object at 0x7fde1f34eac0>, <progress.module.GhostEvent object at 0x7fde1f34eaf0>, <progress.module.GhostEvent object at 0x7fde1f34eb20>, <progress.module.GhostEvent object at 0x7fde1f34eb50>, <progress.module.GhostEvent object at 0x7fde1f34eb80>, <progress.module.GhostEvent object at 0x7fde1f34ebb0>, <progress.module.GhostEvent object at 0x7fde1f34ebe0>, <progress.module.GhostEvent object at 0x7fde1f34ec10>, <progress.module.GhostEvent object at 0x7fde1f34ec40>, <progress.module.GhostEvent object at 0x7fde1f34ec70>, <progress.module.GhostEvent object at 0x7fde1f34eca0>, <progress.module.GhostEvent object at 0x7fde1f34ecd0>, <progress.module.GhostEvent object at 0x7fde1f34ed00>, <progress.module.GhostEvent object at 0x7fde1f34ed30>, <progress.module.GhostEvent object at 0x7fde1f34ed60>, <progress.module.GhostEvent object at 0x7fde1f34ed90>, <progress.module.GhostEvent object at 0x7fde1f34edc0>, <progress.module.GhostEvent object at 0x7fde1f34edf0>, <progress.module.GhostEvent object at 0x7fde1f34ee20>, <progress.module.GhostEvent object at 0x7fde1f34ee50>, <progress.module.GhostEvent object at 0x7fde1f34ee80>, <progress.module.GhostEvent object at 0x7fde1f34eeb0>, <progress.module.GhostEvent object at 0x7fde1f34eee0>, <progress.module.GhostEvent object at 0x7fde1f34ef10>, <progress.module.GhostEvent object at 0x7fde1f34ef40>, <progress.module.GhostEvent object at 0x7fde1f34ef70>, <progress.module.GhostEvent object at 0x7fde1f34efa0>, <progress.module.GhostEvent object at 0x7fde1f34efd0>, <progress.module.GhostEvent object at 0x7fde192f6040>, <progress.module.GhostEvent object at 0x7fde192f6070>, <progress.module.GhostEvent object at 0x7fde192f60a0>, <progress.module.GhostEvent object at 0x7fde192f60d0>, <progress.module.GhostEvent object at 0x7fde192f6100>, <progress.module.GhostEvent object at 0x7fde192f6130>, <progress.module.GhostEvent object at 0x7fde192f6160>, <progress.module.GhostEvent object at 0x7fde192f6190>, <progress.module.GhostEvent object at 0x7fde192f61c0>, <progress.module.GhostEvent object at 0x7fde192f61f0>, <progress.module.GhostEvent object at 0x7fde192f6220>, <progress.module.GhostEvent object at 0x7fde192f6250>, <progress.module.GhostEvent object at 0x7fde192f6280>, <progress.module.GhostEvent object at 0x7fde192f62b0>] historic events
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Loaded OSDMap, ready.
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] recovery thread starting
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] starting setup
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: rbd_support
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: restful
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [restful INFO root] server_addr: :: server_port: 8003
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: status
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: telemetry
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [restful WARNING root] server not running: no certificate configured
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: volumes
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] PerfHandler: starting
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_task_task: vms, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.633+0000 7fde05a1f640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.633+0000 7fde05a1f640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.633+0000 7fde05a1f640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.633+0000 7fde05a1f640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.633+0000 7fde05a1f640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.634+0000 7fde0a228640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.634+0000 7fde0a228640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.634+0000 7fde0a228640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.634+0000 7fde0a228640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:04:28.634+0000 7fde0a228640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_task_task: volumes, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_task_task: images, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_task_task: backups, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] TaskHandler: starting
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 06 10:04:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] setup complete
Dec 06 10:04:28 np0005548788.localdomain sshd[289394]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:28 np0005548788.localdomain sshd[289394]: Accepted publickey for ceph-admin from 192.168.122.106 port 41704 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:04:28 np0005548788.localdomain systemd-logind[765]: New session 64 of user ceph-admin.
Dec 06 10:04:28 np0005548788.localdomain systemd[1]: Started Session 64 of User ceph-admin.
Dec 06 10:04:28 np0005548788.localdomain sshd[289394]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:04:28 np0005548788.localdomain sudo[289398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:29 np0005548788.localdomain sudo[289398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:29.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:29 np0005548788.localdomain sudo[289398]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:29 np0005548788.localdomain sudo[289416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:04:29 np0005548788.localdomain sudo[289416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: mgrmap e14: np0005548788.yvwbqq(active, starting, since 0.0774079s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548785"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548786.mczynb", "id": "np0005548786.mczynb"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: Manager daemon np0005548788.yvwbqq is now available
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/trash_purge_schedule"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/trash_purge_schedule"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.108:0/1147721137' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:29 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:29 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon).osd e88 _set_new_cache_sizes cache_size:1019657343 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:29 np0005548788.localdomain podman[289507]: 2025-12-06 10:04:29.983803887 +0000 UTC m=+0.096939867 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1763362218, version=7, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.033 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.033 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.033 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.033 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.034 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:04:30 np0005548788.localdomain podman[289507]: 2025-12-06 10:04:30.087932394 +0000 UTC m=+0.201068394 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=)
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:04:30] ENGINE Bus STARTING
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:04:30] ENGINE Bus STARTING
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:04:30] ENGINE Serving on http://172.18.0.106:8765
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:04:30] ENGINE Serving on http://172.18.0.106:8765
Dec 06 10:04:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:04:30] ENGINE Serving on https://172.18.0.106:7150
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:04:30] ENGINE Serving on https://172.18.0.106:7150
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:04:30] ENGINE Bus STARTED
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:04:30] ENGINE Bus STARTED
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:04:30] ENGINE Client ('172.18.0.106', 43646) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:04:30] ENGINE Client ('172.18.0.106', 43646) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:04:30 np0005548788.localdomain podman[289619]: 2025-12-06 10:04:30.462541346 +0000 UTC m=+0.069785563 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:04:30 np0005548788.localdomain podman[289619]: 2025-12-06 10:04:30.472034289 +0000 UTC m=+0.079278516 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:04:30 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:04:30 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:04:30 np0005548788.localdomain ceph-mon[288455]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3456902707' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.513 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.748 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.750 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12332MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.751 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.751 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.827 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.828 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:04:30 np0005548788.localdomain sudo[289416]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:30.848 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:04:30 np0005548788.localdomain ceph-mgr[286998]: [devicehealth INFO root] Check health
Dec 06 10:04:31 np0005548788.localdomain ceph-mon[288455]: mgrmap e15: np0005548788.yvwbqq(active, since 1.22065s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:31 np0005548788.localdomain ceph-mon[288455]: [06/Dec/2025:10:04:30] ENGINE Bus STARTING
Dec 06 10:04:31 np0005548788.localdomain ceph-mon[288455]: [06/Dec/2025:10:04:30] ENGINE Serving on http://172.18.0.106:8765
Dec 06 10:04:31 np0005548788.localdomain ceph-mon[288455]: [06/Dec/2025:10:04:30] ENGINE Serving on https://172.18.0.106:7150
Dec 06 10:04:31 np0005548788.localdomain ceph-mon[288455]: [06/Dec/2025:10:04:30] ENGINE Bus STARTED
Dec 06 10:04:31 np0005548788.localdomain ceph-mon[288455]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:31 np0005548788.localdomain ceph-mon[288455]: [06/Dec/2025:10:04:30] ENGINE Client ('172.18.0.106', 43646) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:04:31 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.106:0/3456902707' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:31 np0005548788.localdomain sudo[289725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:31 np0005548788.localdomain sudo[289725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:31 np0005548788.localdomain sudo[289725]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:31 np0005548788.localdomain sudo[289743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:04:31 np0005548788.localdomain sudo[289743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:31.400 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.552s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:04:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:31.410 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:04:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:31.443 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:04:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:31.446 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:04:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:04:31.446 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: mgrmap e16: np0005548788.yvwbqq(active, since 2s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.106:0/497756778' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:32 np0005548788.localdomain sudo[289743]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:32 np0005548788.localdomain sudo[289795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:32 np0005548788.localdomain sudo[289795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:32 np0005548788.localdomain sudo[289795]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:32 np0005548788.localdomain sudo[289813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:04:32 np0005548788.localdomain sudo[289813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:32 np0005548788.localdomain sudo[289813]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548785.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548785.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain sudo[289849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:04:33 np0005548788.localdomain sudo[289849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[289849]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548786", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548786", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548785", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548785", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:33 np0005548788.localdomain sudo[289867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:04:33 np0005548788.localdomain sudo[289867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[289867]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain sudo[289885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:33 np0005548788.localdomain sudo[289885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[289885]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain sudo[289903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:33 np0005548788.localdomain sudo[289903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[289903]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain sudo[289921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:33 np0005548788.localdomain sudo[289921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[289921]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain sudo[289955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:33 np0005548788.localdomain sudo[289955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[289955]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain sudo[289973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:33 np0005548788.localdomain sudo[289973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[289973]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mgr.np0005548785.vhqlsq 172.18.0.103:0/2299561010; not ready for session (expect reconnect)
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain sudo[289991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain sudo[289991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain sudo[289991]: pam_unix(sudo:session): session closed for user root
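The sudo sequence above (mkdir under /tmp/cephadm-&lt;fsid&gt;, touch a `ceph.conf.new` staging file, chown and chmod it, then mv it over /etc/ceph/ceph.conf) is cephadm distributing config files with a write-to-temp-then-rename pattern, so a reader never observes a half-written ceph.conf. A minimal sketch of the same idea (the function name and paths are illustrative, not cephadm's actual implementation; cephadm drives these steps through per-command sudo as logged above):

```python
import os
import tempfile

def atomic_write(path: str, data: bytes, mode: int = 0o644) -> None:
    """Write data to path atomically: stage a sibling temp file,
    set its permissions, then rename it over the target.  rename()
    within one filesystem is atomic, so readers see either the old
    file or the complete new one, never a partial write."""
    dirname = os.path.dirname(path) or "."
    fd, tmp = tempfile.mkstemp(dir=dirname, suffix=".new")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())   # contents reach disk before the rename
        os.chmod(tmp, mode)        # mirrors the /bin/chmod 644 step above
        os.replace(tmp, path)      # mirrors the final /bin/mv step above
    except BaseException:
        os.unlink(tmp)
        raise
```

Note that cephadm stages under /tmp/cephadm-&lt;fsid&gt; rather than next to the target, which is why the log shows a separate mv across directories; the sketch stages in the target directory so the final rename stays on one filesystem.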
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain sudo[290010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:33 np0005548788.localdomain sudo[290010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[290010]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:33 np0005548788.localdomain sudo[290028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:33 np0005548788.localdomain sudo[290028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[290028]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain sudo[290046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:33 np0005548788.localdomain sudo[290046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[290046]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain sudo[290064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:33 np0005548788.localdomain sudo[290064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[290064]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548788.localdomain sudo[290082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:33 np0005548788.localdomain sudo[290082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548788.localdomain sudo[290082]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Updating np0005548785.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: mgrmap e17: np0005548788.yvwbqq(active, since 4s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: Standby manager daemon np0005548785.vhqlsq started
Dec 06 10:04:34 np0005548788.localdomain sudo[290116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:34 np0005548788.localdomain sudo[290116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290116]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain sudo[290134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:34 np0005548788.localdomain sudo[290134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290134]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain sudo[290152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:34 np0005548788.localdomain sudo[290152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290152]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain sudo[290170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:04:34 np0005548788.localdomain sudo[290170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290170]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:34 np0005548788.localdomain systemd[1]: tmp-crun.oltlqs.mount: Deactivated successfully.
Dec 06 10:04:34 np0005548788.localdomain sudo[290194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:04:34 np0005548788.localdomain podman[290188]: 2025-12-06 10:04:34.484022131 +0000 UTC m=+0.076414628 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true)
Dec 06 10:04:34 np0005548788.localdomain sudo[290194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290194]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain podman[290188]: 2025-12-06 10:04:34.515699213 +0000 UTC m=+0.108091720 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:04:34 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:04:34 np0005548788.localdomain sudo[290223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:04:34 np0005548788.localdomain sudo[290223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290223]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain sudo[290241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:34 np0005548788.localdomain sudo[290241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290241]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain sudo[290259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:04:34 np0005548788.localdomain sudo[290259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290259]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain sudo[290293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:04:34 np0005548788.localdomain sudo[290293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290293]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain sudo[290311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:04:34 np0005548788.localdomain sudo[290311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290311]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon).osd e88 _set_new_cache_sizes cache_size:1020045346 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:34 np0005548788.localdomain sudo[290329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548788.localdomain sudo[290329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548788.localdomain sudo[290329]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain sudo[290347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:35 np0005548788.localdomain sudo[290347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548788.localdomain sudo[290347]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain sudo[290365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:35 np0005548788.localdomain sudo[290365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548788.localdomain sudo[290365]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: mgrmap e18: np0005548788.yvwbqq(active, since 5s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:35 np0005548788.localdomain ceph-mon[288455]: Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain sudo[290383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:04:35 np0005548788.localdomain sudo[290383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548788.localdomain sudo[290383]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain sudo[290401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:35 np0005548788.localdomain sudo[290401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548788.localdomain sudo[290401]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain sudo[290419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:04:35 np0005548788.localdomain sudo[290419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548788.localdomain sudo[290419]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain sudo[290453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:04:35 np0005548788.localdomain sudo[290453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548788.localdomain sudo[290453]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain sudo[290471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:04:35 np0005548788.localdomain sudo[290471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548788.localdomain sudo[290471]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain sudo[290489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548788.localdomain sudo[290489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548788.localdomain sudo[290489]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] update: starting ev 4930b5cb-71db-41e6-8b98-60af6588e951 (Updating node-proxy deployment (+6 -> 6))
Dec 06 10:04:35 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] complete: finished ev 4930b5cb-71db-41e6-8b98-60af6588e951 (Updating node-proxy deployment (+6 -> 6))
Dec 06 10:04:35 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Completed event 4930b5cb-71db-41e6-8b98-60af6588e951 (Updating node-proxy deployment (+6 -> 6)) in 0 seconds
Dec 06 10:04:36 np0005548788.localdomain sudo[290508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:04:36 np0005548788.localdomain sudo[290508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:36 np0005548788.localdomain sudo[290508]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:36 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:04:36 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:04:36 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:04:36 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:04:36 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Dec 06 10:04:37 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:04:37 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:04:37 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:04:37 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:37 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:38 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:04:38 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:04:38 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:04:38 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:38 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:38 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Dec 06 10:04:38 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Writing back 50 completed events
Dec 06 10:04:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:04:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:04:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:04:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:04:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:04:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:04:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:04:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:04:39 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:04:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:04:39 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:04:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:04:39 np0005548788.localdomain sudo[290526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:39 np0005548788.localdomain sudo[290526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:39 np0005548788.localdomain sudo[290526]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.32:0/1383583435' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: from='client.? 172.18.0.32:0/1383583435' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:39 np0005548788.localdomain sudo[290544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:39 np0005548788.localdomain sudo[290544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:39 np0005548788.localdomain podman[290578]: 
Dec 06 10:04:39 np0005548788.localdomain podman[290578]: 2025-12-06 10:04:39.910545606 +0000 UTC m=+0.078448410 container create 4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_keller, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7)
Dec 06 10:04:39 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon).osd e88 _set_new_cache_sizes cache_size:1020054504 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:39 np0005548788.localdomain systemd[1]: Started libpod-conmon-4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a.scope.
Dec 06 10:04:39 np0005548788.localdomain podman[290578]: 2025-12-06 10:04:39.878034327 +0000 UTC m=+0.045937131 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:39 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:40 np0005548788.localdomain podman[290578]: 2025-12-06 10:04:40.002131067 +0000 UTC m=+0.170033871 container init 4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_keller, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph)
Dec 06 10:04:40 np0005548788.localdomain podman[290578]: 2025-12-06 10:04:40.014624141 +0000 UTC m=+0.182526945 container start 4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_keller, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, release=1763362218, RELEASE=main, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container)
Dec 06 10:04:40 np0005548788.localdomain podman[290578]: 2025-12-06 10:04:40.01491174 +0000 UTC m=+0.182814554 container attach 4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_keller, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:04:40 np0005548788.localdomain youthful_keller[290593]: 167 167
Dec 06 10:04:40 np0005548788.localdomain systemd[1]: libpod-4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a.scope: Deactivated successfully.
Dec 06 10:04:40 np0005548788.localdomain podman[290578]: 2025-12-06 10:04:40.019581374 +0000 UTC m=+0.187484188 container died 4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_keller, RELEASE=main, ceph=True, io.buildah.version=1.41.4, name=rhceph, release=1763362218, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:40 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.17340 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:40 np0005548788.localdomain podman[290598]: 2025-12-06 10:04:40.134246314 +0000 UTC m=+0.098965310 container remove 4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_keller, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, release=1763362218, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public)
Dec 06 10:04:40 np0005548788.localdomain systemd[1]: libpod-conmon-4ec7e90101afba4b31da78e9b72b2053430a72e58a1f71f89903b5d70e7fd86a.scope: Deactivated successfully.
Dec 06 10:04:40 np0005548788.localdomain sudo[290544]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:40 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 06 10:04:40 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 06 10:04:40 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:04:40 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:04:40 np0005548788.localdomain sudo[290615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:40 np0005548788.localdomain sudo[290615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:40 np0005548788.localdomain sudo[290615]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:40 np0005548788.localdomain sudo[290633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:40 np0005548788.localdomain sudo[290633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:40 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:04:40 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:04:40 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:40 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:40 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:04:40 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:40 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:04:40 np0005548788.localdomain podman[290668]: 
Dec 06 10:04:40 np0005548788.localdomain podman[290668]: 2025-12-06 10:04:40.838118556 +0000 UTC m=+0.081386950 container create bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_chandrasekhar, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, release=1763362218, RELEASE=main, GIT_BRANCH=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 06 10:04:40 np0005548788.localdomain systemd[1]: Started libpod-conmon-bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227.scope.
Dec 06 10:04:40 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:40 np0005548788.localdomain podman[290668]: 2025-12-06 10:04:40.804010059 +0000 UTC m=+0.047278483 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:40 np0005548788.localdomain podman[290668]: 2025-12-06 10:04:40.913328435 +0000 UTC m=+0.156596829 container init bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_chandrasekhar, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, description=Red Hat Ceph Storage 7, release=1763362218, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public)
Dec 06 10:04:40 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b46c0e6268548e28588ac1d08e76e17dbab40c0381d1c71d6502bf1b0d22255e-merged.mount: Deactivated successfully.
Dec 06 10:04:40 np0005548788.localdomain podman[290668]: 2025-12-06 10:04:40.922257449 +0000 UTC m=+0.165525843 container start bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_chandrasekhar, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.41.4, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, ceph=True, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:04:40 np0005548788.localdomain podman[290668]: 2025-12-06 10:04:40.922506467 +0000 UTC m=+0.165774891 container attach bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_chandrasekhar, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.41.4, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, name=rhceph, architecture=x86_64, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:04:40 np0005548788.localdomain stoic_chandrasekhar[290683]: 167 167
Dec 06 10:04:40 np0005548788.localdomain systemd[1]: libpod-bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227.scope: Deactivated successfully.
Dec 06 10:04:40 np0005548788.localdomain podman[290668]: 2025-12-06 10:04:40.925663263 +0000 UTC m=+0.168931707 container died bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_chandrasekhar, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, ceph=True, release=1763362218, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7)
Dec 06 10:04:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-46da831fbf5e1a39bf3a50eb2969407ecdbc8e03a1893255356905ea6131db23-merged.mount: Deactivated successfully.
Dec 06 10:04:41 np0005548788.localdomain podman[290688]: 2025-12-06 10:04:41.037425855 +0000 UTC m=+0.099718053 container remove bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_chandrasekhar, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, build-date=2025-11-26T19:44:28Z, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:41 np0005548788.localdomain systemd[1]: libpod-conmon-bc358f7e1911d51fcaf5ecd0ce4127777e8d0d815625337c06bb3f2bf4d5a227.scope: Deactivated successfully.
Dec 06 10:04:41 np0005548788.localdomain sudo[290633]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 06 10:04:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 06 10:04:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:04:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:04:41 np0005548788.localdomain sudo[290711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:41 np0005548788.localdomain sudo[290711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:41 np0005548788.localdomain sudo[290711]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:41 np0005548788.localdomain sudo[290729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:41 np0005548788.localdomain sudo[290729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:41 np0005548788.localdomain ceph-mon[288455]: from='client.17340 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:41 np0005548788.localdomain ceph-mon[288455]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:04:41 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:04:41 np0005548788.localdomain ceph-mon[288455]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:04:41 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:41 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:41 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:04:41 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:41 np0005548788.localdomain podman[290764]: 
Dec 06 10:04:41 np0005548788.localdomain podman[290764]: 2025-12-06 10:04:41.90622475 +0000 UTC m=+0.091320694 container create 380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_banzai, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:41 np0005548788.localdomain systemd[1]: Started libpod-conmon-380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e.scope.
Dec 06 10:04:41 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:41 np0005548788.localdomain podman[290764]: 2025-12-06 10:04:41.871592757 +0000 UTC m=+0.056688751 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:41 np0005548788.localdomain podman[290764]: 2025-12-06 10:04:41.974169376 +0000 UTC m=+0.159265330 container init 380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_banzai, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:41 np0005548788.localdomain podman[290764]: 2025-12-06 10:04:41.983892875 +0000 UTC m=+0.168988819 container start 380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_banzai, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, distribution-scope=public, release=1763362218, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 06 10:04:41 np0005548788.localdomain podman[290764]: 2025-12-06 10:04:41.984144603 +0000 UTC m=+0.169240547 container attach 380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_banzai, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.tags=rhceph ceph)
Dec 06 10:04:41 np0005548788.localdomain romantic_banzai[290779]: 167 167
Dec 06 10:04:41 np0005548788.localdomain systemd[1]: libpod-380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e.scope: Deactivated successfully.
Dec 06 10:04:41 np0005548788.localdomain podman[290764]: 2025-12-06 10:04:41.987399513 +0000 UTC m=+0.172495497 container died 380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_banzai, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.buildah.version=1.41.4)
Dec 06 10:04:42 np0005548788.localdomain podman[290784]: 2025-12-06 10:04:42.079605074 +0000 UTC m=+0.079871433 container remove 380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_banzai, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:42 np0005548788.localdomain systemd[1]: libpod-conmon-380db9239c1adfeafe94adc7fee77c8949c36c50412fb24f1a994310a7395b6e.scope: Deactivated successfully.
Dec 06 10:04:42 np0005548788.localdomain sudo[290729]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:42 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:04:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:04:42 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:04:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:04:42 np0005548788.localdomain sudo[290807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:42 np0005548788.localdomain sudo[290807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:42 np0005548788.localdomain sudo[290807]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:42 np0005548788.localdomain sudo[290825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:42 np0005548788.localdomain sudo[290825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:42 np0005548788.localdomain ceph-mon[288455]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:04:42 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:04:42 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:42 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:42 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:42 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:42 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.17346 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548785", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ce2fb0d13d76dd323ea38b232948176a7e834044dc75d86b7abb173b314f47d0-merged.mount: Deactivated successfully.
Dec 06 10:04:42 np0005548788.localdomain podman[290860]: 
Dec 06 10:04:42 np0005548788.localdomain podman[290860]: 2025-12-06 10:04:42.942329912 +0000 UTC m=+0.087525778 container create 890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_bhabha, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=1763362218, io.buildah.version=1.41.4, ceph=True, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph)
Dec 06 10:04:42 np0005548788.localdomain systemd[1]: Started libpod-conmon-890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106.scope.
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:43 np0005548788.localdomain podman[290860]: 2025-12-06 10:04:42.908340629 +0000 UTC m=+0.053536495 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:43 np0005548788.localdomain podman[290860]: 2025-12-06 10:04:43.022138013 +0000 UTC m=+0.167333869 container init 890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_bhabha, com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, RELEASE=main, name=rhceph)
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: tmp-crun.kSzgvY.mount: Deactivated successfully.
Dec 06 10:04:43 np0005548788.localdomain podman[290860]: 2025-12-06 10:04:43.036544586 +0000 UTC m=+0.181740442 container start 890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_bhabha, GIT_CLEAN=True, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git)
Dec 06 10:04:43 np0005548788.localdomain podman[290860]: 2025-12-06 10:04:43.036777963 +0000 UTC m=+0.181973819 container attach 890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_bhabha, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:04:43 np0005548788.localdomain recursing_bhabha[290875]: 167 167
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: libpod-890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106.scope: Deactivated successfully.
Dec 06 10:04:43 np0005548788.localdomain podman[290860]: 2025-12-06 10:04:43.03994516 +0000 UTC m=+0.185141036 container died 890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_bhabha, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, version=7, release=1763362218, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:04:43 np0005548788.localdomain podman[290880]: 2025-12-06 10:04:43.13570712 +0000 UTC m=+0.086637201 container remove 890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_bhabha, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, release=1763362218, version=7, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: libpod-conmon-890b2a4ee0c72b22320af6a00214303f44cbbe4b644f165b84dc8266e96b3106.scope: Deactivated successfully.
Dec 06 10:04:43 np0005548788.localdomain sudo[290825]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:43 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:04:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:04:43 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:04:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:04:43 np0005548788.localdomain sudo[290897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:43 np0005548788.localdomain sudo[290897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:43 np0005548788.localdomain sudo[290897]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:43 np0005548788.localdomain sudo[290915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:43 np0005548788.localdomain sudo[290915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:43 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:43 np0005548788.localdomain podman[290933]: 2025-12-06 10:04:43.544538462 +0000 UTC m=+0.100411624 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 06 10:04:43 np0005548788.localdomain podman[290933]: 2025-12-06 10:04:43.615821891 +0000 UTC m=+0.171695043 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:04:43 np0005548788.localdomain podman[290975]: 
Dec 06 10:04:43 np0005548788.localdomain podman[290975]: 2025-12-06 10:04:43.878654501 +0000 UTC m=+0.077583883 container create 8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_carver, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=1763362218, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True)
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: Started libpod-conmon-8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe.scope.
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6205f4524e40e2fc3760aed9d7974315e897dd455ba00f6315f6a4a073956494-merged.mount: Deactivated successfully.
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:43 np0005548788.localdomain podman[290975]: 2025-12-06 10:04:43.847881306 +0000 UTC m=+0.046810718 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:43 np0005548788.localdomain podman[290975]: 2025-12-06 10:04:43.957853213 +0000 UTC m=+0.156782595 container init 8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_carver, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=1763362218, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.41.4)
Dec 06 10:04:43 np0005548788.localdomain podman[290975]: 2025-12-06 10:04:43.972378099 +0000 UTC m=+0.171307481 container start 8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_carver, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, version=7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:04:43 np0005548788.localdomain podman[290975]: 2025-12-06 10:04:43.972931246 +0000 UTC m=+0.171860638 container attach 8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_carver, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, architecture=x86_64, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph)
Dec 06 10:04:43 np0005548788.localdomain optimistic_carver[290990]: 167 167
Dec 06 10:04:43 np0005548788.localdomain systemd[1]: libpod-8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe.scope: Deactivated successfully.
Dec 06 10:04:43 np0005548788.localdomain podman[290975]: 2025-12-06 10:04:43.977362352 +0000 UTC m=+0.176291774 container died 8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_carver, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-11-26T19:44:28Z, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, release=1763362218, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:04:44 np0005548788.localdomain podman[290995]: 2025-12-06 10:04:44.089731882 +0000 UTC m=+0.097715781 container remove 8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_carver, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, vcs-type=git, description=Red Hat Ceph Storage 7)
Dec 06 10:04:44 np0005548788.localdomain systemd[1]: libpod-conmon-8dac79c071c7042e78fd4caa9865096d863d1d4b790eacef4545c0b3f8b825fe.scope: Deactivated successfully.
Dec 06 10:04:44 np0005548788.localdomain sudo[290915]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.34161 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548785"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Remove daemons mon.np0005548785
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005548785
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005548785: new quorum should be ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789', 'np0005548788'] (from ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789', 'np0005548788'])
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005548785: new quorum should be ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789', 'np0005548788'] (from ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789', 'np0005548788'])
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005548785 from monmap...
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing monitor np0005548785 from monmap...
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: client.17286 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 06 10:04:44 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@5(peon) e7  my rank is now 4 (was 5)
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.106:3300/0
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.106:3300/0
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: client.34151 ms_handle_reset on v2:172.18.0.106:3300/0
Dec 06 10:04:44 np0005548788.localdomain ceph-mon[288455]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:04:44 np0005548788.localdomain ceph-mon[288455]: paxos.4).electionLogic(26) init, last seen epoch 26
Dec 06 10:04:44 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:44 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2757c26cbe847ac4a8c478525ea483a6088c82ed5e8b727ba44131e9b30e44ed-merged.mount: Deactivated successfully.
Dec 06 10:04:46 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:04:47.427 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:04:47.428 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:04:47.428 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:04:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:04:48 np0005548788.localdomain systemd[1]: tmp-crun.EUTfQv.mount: Deactivated successfully.
Dec 06 10:04:48 np0005548788.localdomain podman[291011]: 2025-12-06 10:04:48.281116481 +0000 UTC m=+0.105828120 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:04:48 np0005548788.localdomain podman[291011]: 2025-12-06 10:04:48.297656349 +0000 UTC m=+0.122367988 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:04:48 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:04:48 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:49 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:04:49 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:04:49 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(electing) e7 handle_auth_request failed to assign global_id
Dec 06 10:04:49 np0005548788.localdomain sudo[291029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:49 np0005548788.localdomain sudo[291029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:49 np0005548788.localdomain sudo[291029]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:49 np0005548788.localdomain sudo[291047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:49 np0005548788.localdomain sudo[291047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:04:49 np0005548788.localdomain podman[291065]: 2025-12-06 10:04:49.524503318 +0000 UTC m=+0.071969271 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:04:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:04:49 np0005548788.localdomain podman[291065]: 2025-12-06 10:04:49.543613074 +0000 UTC m=+0.091078987 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:04:49 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.34193 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:49 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:04:49 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Removed label mon from host np0005548785.localdomain
Dec 06 10:04:49 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removed label mon from host np0005548785.localdomain
Dec 06 10:04:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:04:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:04:49 np0005548788.localdomain podman[291086]: 2025-12-06 10:04:49.638796656 +0000 UTC m=+0.090384305 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:04:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:04:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:04:49 np0005548788.localdomain podman[291086]: 2025-12-06 10:04:49.710822608 +0000 UTC m=+0.162410247 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal)
Dec 06 10:04:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:04:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18220 "" "Go-http-client/1.1"
Dec 06 10:04:49 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:04:49 np0005548788.localdomain podman[291121]: 
Dec 06 10:04:49 np0005548788.localdomain podman[291121]: 2025-12-06 10:04:49.910863551 +0000 UTC m=+0.078301535 container create 8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_lederberg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.41.4, RELEASE=main, release=1763362218, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public)
Dec 06 10:04:49 np0005548788.localdomain systemd[1]: Started libpod-conmon-8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95.scope.
Dec 06 10:04:49 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:49 np0005548788.localdomain podman[291121]: 2025-12-06 10:04:49.878877528 +0000 UTC m=+0.046315542 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:49 np0005548788.localdomain podman[291121]: 2025-12-06 10:04:49.986574025 +0000 UTC m=+0.154012009 container init 8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_lederberg, release=1763362218, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:04:49 np0005548788.localdomain podman[291121]: 2025-12-06 10:04:49.996543371 +0000 UTC m=+0.163981345 container start 8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_lederberg, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, release=1763362218, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:49 np0005548788.localdomain podman[291121]: 2025-12-06 10:04:49.996808729 +0000 UTC m=+0.164246703 container attach 8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_lederberg, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, version=7)
Dec 06 10:04:49 np0005548788.localdomain interesting_lederberg[291137]: 167 167
Dec 06 10:04:49 np0005548788.localdomain systemd[1]: libpod-8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95.scope: Deactivated successfully.
Dec 06 10:04:50 np0005548788.localdomain podman[291121]: 2025-12-06 10:04:50.000950187 +0000 UTC m=+0.168388221 container died 8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_lederberg, version=7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, CEPH_POINT_RELEASE=)
Dec 06 10:04:50 np0005548788.localdomain podman[291142]: 2025-12-06 10:04:50.093183399 +0000 UTC m=+0.078864424 container remove 8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_lederberg, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 06 10:04:50 np0005548788.localdomain systemd[1]: libpod-conmon-8818a18cddc9f8f6de8cfe02c29d90d7c9d8e8b50e2c6587d645027a21f4eb95.scope: Deactivated successfully.
Dec 06 10:04:50 np0005548788.localdomain sudo[291047]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:50 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:04:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:04:50 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:04:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: paxos.4).electionLogic(27) init, last seen epoch 27, mid-election, bumping
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='client.34161 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548785"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Remove daemons mon.np0005548785
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Safe to remove mon.np0005548785: new quorum should be ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789', 'np0005548788'] (from ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789', 'np0005548788'])
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Removing monitor np0005548785 from monmap...
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon rm", "name": "np0005548785"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Removing daemon mon.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548786 calling monitor election
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548790 calling monitor election
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548787 calling monitor election
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548789 calling monitor election
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548787 is new leader, mons np0005548787,np0005548786,np0005548790,np0005548789 in quorum (ranks 0,1,2,3)
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: monmap epoch 7
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: last_changed 2025-12-06T10:04:44.209099+0000
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: min_mon_release 18 (reef)
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: election_strategy: 1
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005548788
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mgrmap e18: np0005548788.yvwbqq(active, since 20s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Health check failed: 1/5 mons down, quorum np0005548787,np0005548786,np0005548790,np0005548789 (MON_DOWN)
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005548787,np0005548786,np0005548790,np0005548789
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005548787,np0005548786,np0005548790,np0005548789
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]:     mon.np0005548788 (rank 4) addr [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] is down (out of quorum)
Dec 06 10:04:50 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-52ef7c0efc5fb8293c6ef2d5610bccc803cd7a3058d21989659699e1125ddec6-merged.mount: Deactivated successfully.
Dec 06 10:04:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.26575 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:50 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Removed label mgr from host np0005548785.localdomain
Dec 06 10:04:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005548785.localdomain
Dec 06 10:04:51 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Dec 06 10:04:51 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Dec 06 10:04:51 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:04:51 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788 calling monitor election
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: from='client.34193 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: Removed label mon from host np0005548785.localdomain
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: mon.np0005548786 calling monitor election
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: mon.np0005548789 calling monitor election
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: mon.np0005548790 calling monitor election
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: mon.np0005548787 calling monitor election
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: mon.np0005548787 is new leader, mons np0005548787,np0005548786,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3,4)
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: monmap epoch 7
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: last_changed 2025-12-06T10:04:44.209099+0000
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: min_mon_release 18 (reef)
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: election_strategy: 1
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005548788
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: mgrmap e18: np0005548788.yvwbqq(active, since 22s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005548787,np0005548786,np0005548790,np0005548789)
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: Cluster is now healthy
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: overall HEALTH_OK
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:04:51 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:52 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Dec 06 10:04:52 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Dec 06 10:04:52 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:04:52 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:04:52 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.34189 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:52 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Removed label _admin from host np0005548785.localdomain
Dec 06 10:04:52 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005548785.localdomain
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: from='client.26575 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: Removed label mgr from host np0005548785.localdomain
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:52 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:52 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:53 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:04:53 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:04:53 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:04:53 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: from='client.34189 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: Removed label _admin from host np0005548785.localdomain
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:53 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:54 np0005548788.localdomain sshd[291158]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:54 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:04:54 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:04:54 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:04:54 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:54 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:54 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054726 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:55 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:04:55 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:04:55 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:04:55 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:04:55 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:04:55 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:04:55 np0005548788.localdomain ceph-mon[288455]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:55 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:55 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:55 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:55 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:55 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:56 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:04:56 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:04:56 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:04:56 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:04:56 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:04:56 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:04:56 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:56 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:56 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:56 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:56 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:56 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:56 np0005548788.localdomain sshd[291158]: Received disconnect from 45.78.194.186 port 35488:11: Bye Bye [preauth]
Dec 06 10:04:56 np0005548788.localdomain sshd[291158]: Disconnected from authenticating user root 45.78.194.186 port 35488 [preauth]
Dec 06 10:04:57 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Dec 06 10:04:57 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Dec 06 10:04:57 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:04:57 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.497607) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497497691, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11361, "num_deletes": 523, "total_data_size": 16231215, "memory_usage": 16642656, "flush_reason": "Manual Compaction"}
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497576957, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11186563, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11366, "table_properties": {"data_size": 11133459, "index_size": 27526, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24517, "raw_key_size": 257404, "raw_average_key_size": 26, "raw_value_size": 10968900, "raw_average_value_size": 1120, "num_data_blocks": 1034, "num_entries": 9790, "num_filter_entries": 9790, "num_deletions": 522, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015449, "oldest_key_time": 1765015449, "file_creation_time": 1765015497, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f5ec553c-4bc8-419f-b463-0a44a503b01c", "db_session_id": "UD3OCE900E9OLE8LKH8M", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 79427 microseconds, and 23647 cpu microseconds.
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.577034) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11186563 bytes OK
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.577060) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.578926) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.578947) EVENT_LOG_v1 {"time_micros": 1765015497578942, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.578966) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 16155521, prev total WAL file size 16155521, number of live WAL files 2.
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.581467) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(2012B)]
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497581569, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11188575, "oldest_snapshot_seqno": -1}
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9271 keys, 11178644 bytes, temperature: kUnknown
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497661048, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11178644, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11126845, "index_size": 27506, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23237, "raw_key_size": 248828, "raw_average_key_size": 26, "raw_value_size": 10968958, "raw_average_value_size": 1183, "num_data_blocks": 1033, "num_entries": 9271, "num_filter_entries": 9271, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015449, "oldest_key_time": 0, "file_creation_time": 1765015497, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f5ec553c-4bc8-419f-b463-0a44a503b01c", "db_session_id": "UD3OCE900E9OLE8LKH8M", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.661467) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11178644 bytes
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.663245) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 140.4 rd, 140.3 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.7, 0.0 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 9795, records dropped: 524 output_compression: NoCompression
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.663274) EVENT_LOG_v1 {"time_micros": 1765015497663262, "job": 4, "event": "compaction_finished", "compaction_time_micros": 79674, "compaction_time_cpu_micros": 30488, "output_level": 6, "num_output_files": 1, "total_output_size": 11178644, "num_input_records": 9795, "num_output_records": 9271, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497664871, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497664918, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 06 10:04:57 np0005548788.localdomain ceph-mon[288455]: rocksdb: (Original Log Time 2025/12/06-10:04:57.581296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:58 np0005548788.localdomain ceph-mon[288455]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:04:58 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:04:58 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:58 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:58 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:04:58 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:04:58 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:04:59 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:04:59 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:04:59 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:04:59 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:04:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:04:59 np0005548788.localdomain systemd[1]: tmp-crun.gjGjt2.mount: Deactivated successfully.
Dec 06 10:04:59 np0005548788.localdomain podman[291160]: 2025-12-06 10:04:59.270507768 +0000 UTC m=+0.093969027 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:04:59 np0005548788.localdomain podman[291160]: 2025-12-06 10:04:59.309748752 +0000 UTC m=+0.133209961 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:04:59 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:59 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:00 np0005548788.localdomain ceph-mon[288455]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:05:00 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:05:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:05:01 np0005548788.localdomain podman[291179]: 2025-12-06 10:05:01.255603207 +0000 UTC m=+0.081037379 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:05:01 np0005548788.localdomain podman[291179]: 2025-12-06 10:05:01.298618718 +0000 UTC m=+0.124052840 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:05:01 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:01 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:02 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:02 np0005548788.localdomain ceph-mon[288455]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain sudo[291202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Removing np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:03 np0005548788.localdomain sudo[291202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548788.localdomain sudo[291202]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:03 np0005548788.localdomain sudo[291220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:05:03 np0005548788.localdomain sudo[291220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548788.localdomain sudo[291220]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548788.localdomain sudo[291238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:03 np0005548788.localdomain sudo[291238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548788.localdomain sudo[291238]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548785.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Added label _no_schedule to host np0005548785.localdomain
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005548785.localdomain
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548785.localdomain
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548785.localdomain
Dec 06 10:05:03 np0005548788.localdomain sudo[291256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:03 np0005548788.localdomain sudo[291256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548788.localdomain sudo[291256]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548788.localdomain sudo[291274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:03 np0005548788.localdomain sudo[291274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548788.localdomain sudo[291274]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548788.localdomain sudo[291308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:03 np0005548788.localdomain sudo[291308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548788.localdomain sudo[291308]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain sudo[291326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:03 np0005548788.localdomain sudo[291326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548788.localdomain sudo[291326]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:03 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain sudo[291344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain sudo[291344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain sudo[291344]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain sudo[291362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:04 np0005548788.localdomain sudo[291362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain sudo[291362]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain sudo[291380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:04 np0005548788.localdomain sudo[291380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain sudo[291380]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: Removing np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548788.localdomain sudo[291398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:04 np0005548788.localdomain sudo[291398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain sudo[291398]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain sudo[291416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:04 np0005548788.localdomain sudo[291416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain sudo[291416]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain sudo[291434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:04 np0005548788.localdomain sudo[291434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain sudo[291434]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:04 np0005548788.localdomain sudo[291468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:04 np0005548788.localdomain sudo[291468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:05:04 np0005548788.localdomain sudo[291468]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain sudo[291492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:04 np0005548788.localdomain sudo[291492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain sudo[291492]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain podman[291486]: 2025-12-06 10:05:04.691346467 +0000 UTC m=+0.091310484 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 06 10:05:04 np0005548788.localdomain podman[291486]: 2025-12-06 10:05:04.726678372 +0000 UTC m=+0.126642359 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:05:04 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:05:04 np0005548788.localdomain sudo[291521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548788.localdomain sudo[291521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548788.localdomain sudo[291521]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] update: starting ev 28513737-dab7-4f20-a08d-4f799dc06b55 (Updating crash deployment (-1 -> 5))
Dec 06 10:05:04 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Removing daemon crash.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:05:04 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing daemon crash.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:05:04 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:05 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.34199 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548785.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548785.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: Added label _no_schedule to host np0005548785.localdomain
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548785.localdomain
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:06 np0005548788.localdomain ceph-mon[288455]: Removing daemon crash.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:05:06 np0005548788.localdomain ceph-mon[288455]: from='client.34199 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548785.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.26806 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548785.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Removed host np0005548785.localdomain
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removed host np0005548785.localdomain
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.services.cephadmservice] Removing key for client.crash.np0005548785.localdomain
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing key for client.crash.np0005548785.localdomain
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] complete: finished ev 28513737-dab7-4f20-a08d-4f799dc06b55 (Updating crash deployment (-1 -> 5))
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Completed event 28513737-dab7-4f20-a08d-4f799dc06b55 (Updating crash deployment (-1 -> 5)) in 2 seconds
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] update: starting ev b558c9b3-a4b3-4dd3-9a37-bf0ae6bc522b (Updating node-proxy deployment (+5 -> 5))
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] complete: finished ev b558c9b3-a4b3-4dd3-9a37-bf0ae6bc522b (Updating node-proxy deployment (+5 -> 5))
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Completed event b558c9b3-a4b3-4dd3-9a37-bf0ae6bc522b (Updating node-proxy deployment (+5 -> 5)) in 0 seconds
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:06 np0005548788.localdomain sudo[291541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:05:06 np0005548788.localdomain sudo[291541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:06 np0005548788.localdomain sudo[291541]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] update: starting ev de8ca804-6157-48c7-999b-96aca4f28fef (Updating node-proxy deployment (+5 -> 5))
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] complete: finished ev de8ca804-6157-48c7-999b-96aca4f28fef (Updating node-proxy deployment (+5 -> 5))
Dec 06 10:05:06 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Completed event de8ca804-6157-48c7-999b-96aca4f28fef (Updating node-proxy deployment (+5 -> 5)) in 0 seconds
Dec 06 10:05:07 np0005548788.localdomain sudo[291559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:05:07 np0005548788.localdomain sudo[291559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:07 np0005548788.localdomain sudo[291559]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:07 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:07 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:07 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:07 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='client.26806 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548785.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"}]': finished
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: Removed host np0005548785.localdomain
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: Removing key for client.crash.np0005548785.localdomain
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"}]': finished
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:05:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:08 np0005548788.localdomain sshd[291577]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:05:08 np0005548788.localdomain sshd[291577]: Accepted publickey for tripleo-admin from 192.168.122.11 port 57426 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:05:08 np0005548788.localdomain systemd-logind[765]: New session 65 of user tripleo-admin.
Dec 06 10:05:08 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 06 10:05:08 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 06 10:05:08 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 06 10:05:08 np0005548788.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 10:05:08 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:05:08 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:05:08 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:05:08 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Queued start job for default target Main User Target.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Created slice User Application Slice.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Reached target Paths.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Reached target Timers.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Starting D-Bus User Message Bus Socket...
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Starting Create User's Volatile Files and Directories...
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Listening on D-Bus User Message Bus Socket.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Reached target Sockets.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Finished Create User's Volatile Files and Directories.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Reached target Basic System.
Dec 06 10:05:08 np0005548788.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Reached target Main User Target.
Dec 06 10:05:08 np0005548788.localdomain systemd[291581]: Startup finished in 164ms.
Dec 06 10:05:08 np0005548788.localdomain systemd[1]: Started Session 65 of User tripleo-admin.
Dec 06 10:05:08 np0005548788.localdomain sshd[291577]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 10:05:08 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:08 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:08 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:08 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:08 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:08 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:08 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:08 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:08 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Writing back 50 completed events
Dec 06 10:05:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:05:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:05:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:05:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:05:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:05:08 np0005548788.localdomain sudo[291722]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtnvphlxcizsxmxthkdoulevlwanqxdt ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015508.517288-60809-149002370730244/AnsiballZ_lineinfile.py
Dec 06 10:05:08 np0005548788.localdomain sudo[291722]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:05:09 np0005548788.localdomain python3[291724]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.103/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 10:05:09 np0005548788.localdomain sudo[291722]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:09 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:09 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:09 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:09 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:09 np0005548788.localdomain sudo[291868]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tehpptqjdlsqxsogygfhycuvwwknohly ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015509.363559-60825-237874568953804/AnsiballZ_command.py
Dec 06 10:05:09 np0005548788.localdomain sudo[291868]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:05:09 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:10 np0005548788.localdomain python3[291870]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.103/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 10:05:10 np0005548788.localdomain sudo[291868]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:10 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:05:10 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:05:10 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:05:10 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:05:10 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:05:10 np0005548788.localdomain ceph-mon[288455]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:10 np0005548788.localdomain sudo[292013]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxqfzqmlmgcjmaazeqcbeyzgagcjyilb ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015510.1994-60836-257856868430089/AnsiballZ_command.py
Dec 06 10:05:10 np0005548788.localdomain sudo[292013]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:05:10 np0005548788.localdomain python3[292015]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.103 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 10:05:11 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:11 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:11 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:11 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:11 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:11 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:11 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:11 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:12 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:12 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:12 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:12 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:12 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:12 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:12 np0005548788.localdomain sudo[292017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:12 np0005548788.localdomain sudo[292017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:12 np0005548788.localdomain sudo[292017]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:12 np0005548788.localdomain sudo[292013]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:12 np0005548788.localdomain sudo[292035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:12 np0005548788.localdomain sudo[292035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:13 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:13 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:13 np0005548788.localdomain ceph-mon[288455]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:13 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:13 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:13 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:13 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:13 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:13 np0005548788.localdomain podman[292088]: 
Dec 06 10:05:13 np0005548788.localdomain podman[292088]: 2025-12-06 10:05:13.402081469 +0000 UTC m=+0.086004772 container create cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_bell, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.41.4, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z)
Dec 06 10:05:13 np0005548788.localdomain systemd[1]: Started libpod-conmon-cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc.scope.
Dec 06 10:05:13 np0005548788.localdomain podman[292088]: 2025-12-06 10:05:13.368155147 +0000 UTC m=+0.052078540 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:13 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:13 np0005548788.localdomain podman[292088]: 2025-12-06 10:05:13.508004101 +0000 UTC m=+0.191927404 container init cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_bell, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2025-11-26T19:44:28Z, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64)
Dec 06 10:05:13 np0005548788.localdomain podman[292088]: 2025-12-06 10:05:13.525635002 +0000 UTC m=+0.209558335 container start cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_bell, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vendor=Red Hat, Inc., release=1763362218)
Dec 06 10:05:13 np0005548788.localdomain podman[292088]: 2025-12-06 10:05:13.526072836 +0000 UTC m=+0.209996159 container attach cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_bell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.41.4, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:05:13 np0005548788.localdomain vigorous_bell[292103]: 167 167
Dec 06 10:05:13 np0005548788.localdomain systemd[1]: libpod-cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc.scope: Deactivated successfully.
Dec 06 10:05:13 np0005548788.localdomain podman[292088]: 2025-12-06 10:05:13.531640987 +0000 UTC m=+0.215564290 container died cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_bell, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main)
Dec 06 10:05:13 np0005548788.localdomain podman[292108]: 2025-12-06 10:05:13.640727166 +0000 UTC m=+0.095707499 container remove cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_bell, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-type=git, build-date=2025-11-26T19:44:28Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, distribution-scope=public, ceph=True, RELEASE=main, com.redhat.component=rhceph-container, release=1763362218, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 10:05:13 np0005548788.localdomain systemd[1]: libpod-conmon-cf13f8fa7943b08458cd43029901e63a671291cee7a51e3a1b3a0b391cdc9bdc.scope: Deactivated successfully.
Dec 06 10:05:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:05:13 np0005548788.localdomain sudo[292035]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:13 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:13 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:13 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:13 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:13 np0005548788.localdomain podman[292124]: 2025-12-06 10:05:13.787040539 +0000 UTC m=+0.105044837 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:05:13 np0005548788.localdomain podman[292124]: 2025-12-06 10:05:13.830371489 +0000 UTC m=+0.148375807 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:05:13 np0005548788.localdomain sudo[292138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:13 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:05:13 np0005548788.localdomain sudo[292138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:13 np0005548788.localdomain sudo[292138]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:13 np0005548788.localdomain sudo[292167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:13 np0005548788.localdomain sudo[292167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:14 np0005548788.localdomain ceph-mon[288455]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:14 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:14 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:14 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:14 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:05:14 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:14 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.26610 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:14 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Saving service mon spec with placement label:mon
Dec 06 10:05:14 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Dec 06 10:05:14 np0005548788.localdomain podman[292203]: 
Dec 06 10:05:14 np0005548788.localdomain podman[292203]: 2025-12-06 10:05:14.3948505 +0000 UTC m=+0.080392108 container create 49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_heyrovsky, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1763362218, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:05:14 np0005548788.localdomain systemd[1]: tmp-crun.z2HTx5.mount: Deactivated successfully.
Dec 06 10:05:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-3b1affe011bec132ecf3f5abdee5a8afa19e0da643373712ccecde042f3a2e19-merged.mount: Deactivated successfully.
Dec 06 10:05:14 np0005548788.localdomain systemd[1]: Started libpod-conmon-49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4.scope.
Dec 06 10:05:14 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:14 np0005548788.localdomain podman[292203]: 2025-12-06 10:05:14.363119946 +0000 UTC m=+0.048661564 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:14 np0005548788.localdomain podman[292203]: 2025-12-06 10:05:14.46452759 +0000 UTC m=+0.150069198 container init 49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_heyrovsky, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=)
Dec 06 10:05:14 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:14 np0005548788.localdomain podman[292203]: 2025-12-06 10:05:14.476598151 +0000 UTC m=+0.162139769 container start 49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_heyrovsky, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7)
Dec 06 10:05:14 np0005548788.localdomain podman[292203]: 2025-12-06 10:05:14.47689401 +0000 UTC m=+0.162435698 container attach 49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_heyrovsky, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., version=7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True)
Dec 06 10:05:14 np0005548788.localdomain inspiring_heyrovsky[292218]: 167 167
Dec 06 10:05:14 np0005548788.localdomain systemd[1]: libpod-49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4.scope: Deactivated successfully.
Dec 06 10:05:14 np0005548788.localdomain podman[292203]: 2025-12-06 10:05:14.482546314 +0000 UTC m=+0.168087952 container died 49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_heyrovsky, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64)
Dec 06 10:05:14 np0005548788.localdomain podman[292223]: 2025-12-06 10:05:14.582483211 +0000 UTC m=+0.090629363 container remove 49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_heyrovsky, ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:05:14 np0005548788.localdomain systemd[1]: libpod-conmon-49063f74f36ed9ee56ec1f3dc46a92af27e2168908b680b900d97e3d2f8beff4.scope: Deactivated successfully.
Dec 06 10:05:14 np0005548788.localdomain sudo[292167]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:14 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:14 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:14 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:14 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:14 np0005548788.localdomain sudo[292247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:14 np0005548788.localdomain sudo[292247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:14 np0005548788.localdomain sudo[292247]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:14 np0005548788.localdomain sudo[292265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:14 np0005548788.localdomain sudo[292265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:14 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: from='client.26610 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: Saving service mon spec with placement label:mon
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:05:15 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:15 np0005548788.localdomain podman[292299]: 
Dec 06 10:05:15 np0005548788.localdomain podman[292299]: 2025-12-06 10:05:15.343005103 +0000 UTC m=+0.062307064 container create a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_hopper, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vendor=Red Hat, Inc., ceph=True, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Dec 06 10:05:15 np0005548788.localdomain systemd[1]: Started libpod-conmon-a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340.scope.
Dec 06 10:05:15 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:15 np0005548788.localdomain systemd[1]: tmp-crun.BCgP2W.mount: Deactivated successfully.
Dec 06 10:05:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2bbe958fa5293bfa81c211fcb188b5c952bd47fff81b448cf97db8652f9236cd-merged.mount: Deactivated successfully.
Dec 06 10:05:15 np0005548788.localdomain podman[292299]: 2025-12-06 10:05:15.31363726 +0000 UTC m=+0.032939251 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:15 np0005548788.localdomain podman[292299]: 2025-12-06 10:05:15.415367905 +0000 UTC m=+0.134669866 container init a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_hopper, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, release=1763362218, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.buildah.version=1.41.4, name=rhceph)
Dec 06 10:05:15 np0005548788.localdomain podman[292299]: 2025-12-06 10:05:15.425793414 +0000 UTC m=+0.145095365 container start a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_hopper, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main)
Dec 06 10:05:15 np0005548788.localdomain distracted_hopper[292314]: 167 167
Dec 06 10:05:15 np0005548788.localdomain podman[292299]: 2025-12-06 10:05:15.426269329 +0000 UTC m=+0.145571280 container attach a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_hopper, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4)
Dec 06 10:05:15 np0005548788.localdomain systemd[1]: libpod-a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340.scope: Deactivated successfully.
Dec 06 10:05:15 np0005548788.localdomain podman[292299]: 2025-12-06 10:05:15.429508109 +0000 UTC m=+0.148810110 container died a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_hopper, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=)
Dec 06 10:05:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-622624b93873299a74ce7b3faee76cf776404d09bfcfff8fac94cfe773f7ec95-merged.mount: Deactivated successfully.
Dec 06 10:05:15 np0005548788.localdomain podman[292319]: 2025-12-06 10:05:15.527779056 +0000 UTC m=+0.088150198 container remove a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_hopper, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 06 10:05:15 np0005548788.localdomain systemd[1]: libpod-conmon-a6105ee9195d56e44753b2917ae018bc136060b516d1485917bccc3bee612340.scope: Deactivated successfully.
Dec 06 10:05:15 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.34209 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:05:15 np0005548788.localdomain sudo[292265]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:15 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:05:15 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:05:15 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:05:15 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:05:15 np0005548788.localdomain sudo[292342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:15 np0005548788.localdomain sudo[292342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:15 np0005548788.localdomain sudo[292342]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:15 np0005548788.localdomain sudo[292360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:15 np0005548788.localdomain sudo[292360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:16 np0005548788.localdomain ceph-mon[288455]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:16 np0005548788.localdomain ceph-mon[288455]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:16 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:16 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:16 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:16 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:16 np0005548788.localdomain ceph-mon[288455]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:16 np0005548788.localdomain podman[292396]: 
Dec 06 10:05:16 np0005548788.localdomain podman[292396]: 2025-12-06 10:05:16.344952386 +0000 UTC m=+0.076333325 container create d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_cori, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhceph ceph, release=1763362218, io.openshift.expose-services=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, description=Red Hat Ceph Storage 7, version=7)
Dec 06 10:05:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e.scope.
Dec 06 10:05:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:16 np0005548788.localdomain podman[292396]: 2025-12-06 10:05:16.404796453 +0000 UTC m=+0.136177392 container init d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_cori, name=rhceph, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, release=1763362218, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:05:16 np0005548788.localdomain podman[292396]: 2025-12-06 10:05:16.314912164 +0000 UTC m=+0.046293123 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:16 np0005548788.localdomain podman[292396]: 2025-12-06 10:05:16.414578083 +0000 UTC m=+0.145959022 container start d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_cori, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1763362218, ceph=True)
Dec 06 10:05:16 np0005548788.localdomain podman[292396]: 2025-12-06 10:05:16.415049609 +0000 UTC m=+0.146430548 container attach d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_cori, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.buildah.version=1.41.4, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Dec 06 10:05:16 np0005548788.localdomain dreamy_cori[292412]: 167 167
Dec 06 10:05:16 np0005548788.localdomain systemd[1]: libpod-d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e.scope: Deactivated successfully.
Dec 06 10:05:16 np0005548788.localdomain podman[292396]: 2025-12-06 10:05:16.418514815 +0000 UTC m=+0.149895814 container died d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_cori, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container)
Dec 06 10:05:16 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-edb384b2e9e7accfd4ab5fddbcd5e03f17a1ae739f64a0448da0c74224e0196d-merged.mount: Deactivated successfully.
Dec 06 10:05:16 np0005548788.localdomain podman[292417]: 2025-12-06 10:05:16.523711934 +0000 UTC m=+0.087851388 container remove d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_cori, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True)
Dec 06 10:05:16 np0005548788.localdomain systemd[1]: libpod-conmon-d11a7a7a54405cd93da88df9a96fc5680a6aa0ad674bd26e03c5983c4579866e.scope: Deactivated successfully.
Dec 06 10:05:16 np0005548788.localdomain sudo[292360]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:16 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:05:16 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:05:16 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:05:16 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:05:16 np0005548788.localdomain sudo[292435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:16 np0005548788.localdomain sudo[292435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:16 np0005548788.localdomain sudo[292435]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:16 np0005548788.localdomain sudo[292453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:16 np0005548788.localdomain sudo[292453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.26625 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548788"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Remove daemons mon.np0005548788
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005548788
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005548788: new quorum should be ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789'] (from ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789'])
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005548788: new quorum should be ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789'] (from ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789'])
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005548788 from monmap...
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing monitor np0005548788 from monmap...
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005548788 from np0005548788.localdomain -- ports []
Dec 06 10:05:17 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005548788 from np0005548788.localdomain -- ports []
Dec 06 10:05:17 np0005548788.localdomain ceph-mon[288455]: mon.np0005548788@4(peon) e8  removed from monmap, suicide.
Dec 06 10:05:17 np0005548788.localdomain sudo[292471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:17 np0005548788.localdomain sudo[292471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:17 np0005548788.localdomain sudo[292471]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:17 np0005548788.localdomain podman[292494]: 2025-12-06 10:05:17.199342309 +0000 UTC m=+0.056876068 container died 31023b3d2b24e86b720cbbaa8472e97c6b474f4bfb80ea5ff5aaf6796d2126bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548788, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=1763362218, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, vcs-type=git, architecture=x86_64, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:05:17 np0005548788.localdomain podman[292494]: 2025-12-06 10:05:17.237877122 +0000 UTC m=+0.095410881 container remove 31023b3d2b24e86b720cbbaa8472e97c6b474f4bfb80ea5ff5aaf6796d2126bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548788, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, release=1763362218, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_BRANCH=main, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:05:17 np0005548788.localdomain sudo[292511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 --name mon.np0005548788 --force
Dec 06 10:05:17 np0005548788.localdomain sudo[292511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:17 np0005548788.localdomain podman[292537]: 
Dec 06 10:05:17 np0005548788.localdomain podman[292537]: 2025-12-06 10:05:17.313755461 +0000 UTC m=+0.079265144 container create d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_pasteur, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-type=git)
Dec 06 10:05:17 np0005548788.localdomain systemd[1]: Started libpod-conmon-d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3.scope.
Dec 06 10:05:17 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:17 np0005548788.localdomain podman[292537]: 2025-12-06 10:05:17.369105391 +0000 UTC m=+0.134615074 container init d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_pasteur, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, version=7, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph)
Dec 06 10:05:17 np0005548788.localdomain podman[292537]: 2025-12-06 10:05:17.37755438 +0000 UTC m=+0.143064053 container start d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_pasteur, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True)
Dec 06 10:05:17 np0005548788.localdomain podman[292537]: 2025-12-06 10:05:17.377824489 +0000 UTC m=+0.143334172 container attach d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_pasteur, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7)
Dec 06 10:05:17 np0005548788.localdomain thirsty_pasteur[292562]: 167 167
Dec 06 10:05:17 np0005548788.localdomain podman[292537]: 2025-12-06 10:05:17.379443769 +0000 UTC m=+0.144953522 container died d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_pasteur, architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.41.4, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph)
Dec 06 10:05:17 np0005548788.localdomain systemd[1]: libpod-d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3.scope: Deactivated successfully.
Dec 06 10:05:17 np0005548788.localdomain podman[292537]: 2025-12-06 10:05:17.281575704 +0000 UTC m=+0.047085417 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-dd8ca3e55372ce01e34985d9deb70439c47e87b0479d8be96656e82b6d0781b0-merged.mount: Deactivated successfully.
Dec 06 10:05:17 np0005548788.localdomain systemd[1]: tmp-crun.jRKnlh.mount: Deactivated successfully.
Dec 06 10:05:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6b19b5bbcd00cd966724707a0b74929eeba52d51e4f5c2719b1bf5baca862156-merged.mount: Deactivated successfully.
Dec 06 10:05:17 np0005548788.localdomain podman[292572]: 2025-12-06 10:05:17.487497476 +0000 UTC m=+0.101228259 container remove d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_pasteur, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, release=1763362218, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:05:17 np0005548788.localdomain systemd[1]: libpod-conmon-d7251464326daa177ce94b06d82a4570696ffcd760af09b43c79364f1de994b3.scope: Deactivated successfully.
Dec 06 10:05:17 np0005548788.localdomain sudo[292453]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:17 np0005548788.localdomain sshd[292629]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8@mon.np0005548788.service: Deactivated successfully.
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: Stopped Ceph mon.np0005548788 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8@mon.np0005548788.service: Consumed 4.147s CPU time.
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:05:18 np0005548788.localdomain systemd-sysv-generator[292701]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:05:18 np0005548788.localdomain systemd-rc-local-generator[292697]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:05:18 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:05:18 np0005548788.localdomain sudo[292511]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:18 np0005548788.localdomain podman[292713]: 2025-12-06 10:05:18.783695614 +0000 UTC m=+0.103534129 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:05:18 np0005548788.localdomain podman[292713]: 2025-12-06 10:05:18.798664314 +0000 UTC m=+0.118502429 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Dec 06 10:05:18 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:05:18 np0005548788.localdomain sshd[292629]: Connection closed by authenticating user root 45.10.175.77 port 43812 [preauth]
Dec 06 10:05:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:05:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:05:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:05:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150831 "" "Go-http-client/1.1"
Dec 06 10:05:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:05:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17732 "" "Go-http-client/1.1"
Dec 06 10:05:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:05:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:05:20 np0005548788.localdomain systemd[1]: tmp-crun.9ERfeq.mount: Deactivated successfully.
Dec 06 10:05:20 np0005548788.localdomain podman[292731]: 2025-12-06 10:05:20.265022407 +0000 UTC m=+0.090068516 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:05:20 np0005548788.localdomain podman[292731]: 2025-12-06 10:05:20.301630251 +0000 UTC m=+0.126676380 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:05:20 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:05:20 np0005548788.localdomain podman[292732]: 2025-12-06 10:05:20.328800436 +0000 UTC m=+0.146187520 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Dec 06 10:05:20 np0005548788.localdomain podman[292732]: 2025-12-06 10:05:20.340777903 +0000 UTC m=+0.158164957 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Dec 06 10:05:20 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:05:20 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:21 np0005548788.localdomain systemd[1]: tmp-crun.FGKrkF.mount: Deactivated successfully.
Dec 06 10:05:22 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:05:22 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:05:22 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:05:22 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:05:22 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:23 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Dec 06 10:05:23 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Dec 06 10:05:23 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:05:23 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:05:24 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:24 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Dec 06 10:05:24 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Dec 06 10:05:24 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:05:24 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:05:25 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:05:25 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:05:25 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:05:25 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:05:26 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:26 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:05:26 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:05:26 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:05:26 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:05:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:27.447 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:27.447 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:27.448 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:27.448 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:05:27 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:05:27 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:05:27 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:05:27 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:05:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:28.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:28.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Optimize plan auto_2025-12-06_10:05:28
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] do_upmap
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] pools ['backups', 'images', 'manila_data', '.mgr', 'manila_metadata', 'vms', 'volumes']
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] prepared 0/10 changes
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] _maybe_adjust
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0033250017448352874 of space, bias 1.0, pg target 0.6650003489670575 quantized to 32 (current 32)
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.1810441094360693e-06 of space, bias 4.0, pg target 0.001741927228736274 quantized to 16 (current 16)
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:05:28 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:05:29 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Dec 06 10:05:29 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Dec 06 10:05:29 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:05:29 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:05:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:30.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:30.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:05:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:30.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:05:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:30.019 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:05:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:30.020 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:30.021 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:05:30 np0005548788.localdomain systemd[1]: tmp-crun.X4M67h.mount: Deactivated successfully.
Dec 06 10:05:30 np0005548788.localdomain podman[292772]: 2025-12-06 10:05:30.265948503 +0000 UTC m=+0.091760719 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:05:30 np0005548788.localdomain podman[292772]: 2025-12-06 10:05:30.283710008 +0000 UTC m=+0.109522204 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd)
Dec 06 10:05:30 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:05:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:30 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:05:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:05:30 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:05:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.024 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.024 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.025 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.025 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.026 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.504 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:05:31 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:31 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:31 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:31 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.728 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.730 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12389MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.730 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.731 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.808 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.809 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:05:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:31.825 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:05:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:05:32 np0005548788.localdomain podman[292833]: 2025-12-06 10:05:32.251437915 +0000 UTC m=+0.079869254 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:05:32 np0005548788.localdomain podman[292833]: 2025-12-06 10:05:32.260537484 +0000 UTC m=+0.088968833 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:05:32 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:05:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:32.302 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:05:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:32.310 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:05:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:32.328 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:05:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:32.331 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:05:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:05:32.332 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:32 np0005548788.localdomain sudo[292859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:32 np0005548788.localdomain sudo[292859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:32 np0005548788.localdomain sudo[292859]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:32 np0005548788.localdomain sudo[292877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:05:32 np0005548788.localdomain sudo[292877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:33 np0005548788.localdomain podman[292967]: 2025-12-06 10:05:33.588672962 +0000 UTC m=+0.092875142 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True)
Dec 06 10:05:33 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.26676 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005548788.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:33 np0005548788.localdomain podman[292967]: 2025-12-06 10:05:33.720750477 +0000 UTC m=+0.224952717 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.tags=rhceph ceph, release=1763362218, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=)
Dec 06 10:05:33 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:05:33 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:05:33 np0005548788.localdomain sudo[292998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:33 np0005548788.localdomain sudo[292998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:33 np0005548788.localdomain sudo[292998]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:33 np0005548788.localdomain sudo[293033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:33 np0005548788.localdomain sudo[293033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:34 np0005548788.localdomain sudo[292877]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:34 np0005548788.localdomain sudo[293120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:34 np0005548788.localdomain sudo[293120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:34 np0005548788.localdomain sudo[293120]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:34 np0005548788.localdomain sudo[293155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:05:34 np0005548788.localdomain sudo[293155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:34 np0005548788.localdomain podman[293177]: 
Dec 06 10:05:34 np0005548788.localdomain podman[293177]: 2025-12-06 10:05:34.558381136 +0000 UTC m=+0.088357454 container create 4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_cohen, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7)
Dec 06 10:05:34 np0005548788.localdomain systemd[1]: Started libpod-conmon-4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4.scope.
Dec 06 10:05:34 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:34 np0005548788.localdomain podman[293177]: 2025-12-06 10:05:34.52430364 +0000 UTC m=+0.054279978 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:34 np0005548788.localdomain podman[293177]: 2025-12-06 10:05:34.637929299 +0000 UTC m=+0.167905617 container init 4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_cohen, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:05:34 np0005548788.localdomain systemd[1]: tmp-crun.ejW6GK.mount: Deactivated successfully.
Dec 06 10:05:34 np0005548788.localdomain podman[293177]: 2025-12-06 10:05:34.658608173 +0000 UTC m=+0.188584481 container start 4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_cohen, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, RELEASE=main)
Dec 06 10:05:34 np0005548788.localdomain podman[293177]: 2025-12-06 10:05:34.658881072 +0000 UTC m=+0.188857380 container attach 4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_cohen, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:05:34 np0005548788.localdomain strange_cohen[293193]: 167 167
Dec 06 10:05:34 np0005548788.localdomain systemd[1]: libpod-4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4.scope: Deactivated successfully.
Dec 06 10:05:34 np0005548788.localdomain podman[293177]: 2025-12-06 10:05:34.671082146 +0000 UTC m=+0.201058524 container died 4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_cohen, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, io.buildah.version=1.41.4, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:05:34 np0005548788.localdomain podman[293198]: 2025-12-06 10:05:34.763859655 +0000 UTC m=+0.084089964 container remove 4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_cohen, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=1763362218, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, architecture=x86_64, version=7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 10:05:34 np0005548788.localdomain systemd[1]: libpod-conmon-4cf3b0809cecbd06a3849424fba311407432c127aa3204ec24664fb93392c0e4.scope: Deactivated successfully.
Dec 06 10:05:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:05:34 np0005548788.localdomain podman[293224]: 
Dec 06 10:05:34 np0005548788.localdomain podman[293222]: 2025-12-06 10:05:34.931394338 +0000 UTC m=+0.132284262 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:05:34 np0005548788.localdomain podman[293224]: 2025-12-06 10:05:34.850131483 +0000 UTC m=+0.043816925 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:34 np0005548788.localdomain podman[293222]: 2025-12-06 10:05:34.963518285 +0000 UTC m=+0.164408279 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:05:34 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:05:34 np0005548788.localdomain podman[293224]: 2025-12-06 10:05:34.990486883 +0000 UTC m=+0.184172295 container create 9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_mirzakhani, architecture=x86_64, GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, RELEASE=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: Started libpod-conmon-9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e.scope.
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01a3344e11ff73e3591e2f7c8b081d370ac5e7cf904626c9aa077cf7558067c4/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec 06 10:05:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01a3344e11ff73e3591e2f7c8b081d370ac5e7cf904626c9aa077cf7558067c4/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:05:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01a3344e11ff73e3591e2f7c8b081d370ac5e7cf904626c9aa077cf7558067c4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:05:35 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01a3344e11ff73e3591e2f7c8b081d370ac5e7cf904626c9aa077cf7558067c4/merged/var/lib/ceph/mon/ceph-np0005548788 supports timestamps until 2038 (0x7fffffff)
Dec 06 10:05:35 np0005548788.localdomain podman[293224]: 2025-12-06 10:05:35.061335999 +0000 UTC m=+0.255021481 container init 9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_mirzakhani, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:05:35 np0005548788.localdomain podman[293224]: 2025-12-06 10:05:35.070483679 +0000 UTC m=+0.264169101 container start 9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_mirzakhani, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, architecture=x86_64, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main)
Dec 06 10:05:35 np0005548788.localdomain podman[293224]: 2025-12-06 10:05:35.071083127 +0000 UTC m=+0.264768549 container attach 9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_mirzakhani, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:05:35 np0005548788.localdomain sudo[293155]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: libpod-9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e.scope: Deactivated successfully.
Dec 06 10:05:35 np0005548788.localdomain podman[293224]: 2025-12-06 10:05:35.178901908 +0000 UTC m=+0.372587310 container died 9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_mirzakhani, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.buildah.version=1.41.4, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True)
Dec 06 10:05:35 np0005548788.localdomain sudo[293303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:05:35 np0005548788.localdomain sudo[293303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548788.localdomain sudo[293303]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548788.localdomain podman[293319]: 2025-12-06 10:05:35.268472259 +0000 UTC m=+0.076341546 container remove 9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_mirzakhani, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-11-26T19:44:28Z, RELEASE=main, release=1763362218, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: libpod-conmon-9b2e8ad73aefac371c22e4f3aae46fa0e9796d13c3559c21c03065209720587e.scope: Deactivated successfully.
Dec 06 10:05:35 np0005548788.localdomain sudo[293335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:05:35 np0005548788.localdomain sudo[293335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548788.localdomain sudo[293335]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:05:35 np0005548788.localdomain sudo[293355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:35 np0005548788.localdomain systemd-sysv-generator[293399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:05:35 np0005548788.localdomain systemd-rc-local-generator[293394]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-398aaa5aa1eccbc09c5de3eaa7e4d7b966ab0225f9449cf74728e42cf8f88d83-merged.mount: Deactivated successfully.
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain sudo[293355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548788.localdomain sudo[293355]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: Reloading.
Dec 06 10:05:35 np0005548788.localdomain sudo[293411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain systemd-rc-local-generator[293445]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:05:35 np0005548788.localdomain systemd-sysv-generator[293453]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:35 np0005548788.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:05:36 np0005548788.localdomain sudo[293411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293411]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain systemd[1]: Starting Ceph mon.np0005548788 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 10:05:36 np0005548788.localdomain sudo[293468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:36 np0005548788.localdomain sudo[293468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293468]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain sudo[293524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:36 np0005548788.localdomain sudo[293524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293524]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain sudo[293559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:36 np0005548788.localdomain sudo[293559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293559]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain sudo[293589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain sudo[293589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293589]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain podman[293605]: 
Dec 06 10:05:36 np0005548788.localdomain podman[293605]: 2025-12-06 10:05:36.462374266 +0000 UTC m=+0.069720212 container create 314d98beac8c4593b161da2602f8412f10c90afaa83b6435702a9d4d03120a2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548788, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.component=rhceph-container, version=7, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, architecture=x86_64)
Dec 06 10:05:36 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain sudo[293619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:36 np0005548788.localdomain sudo[293619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293619]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6851abc6444d34520f9220c9b021fd039ba78a6db03518525adcbaf89010f4ac/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:05:36 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6851abc6444d34520f9220c9b021fd039ba78a6db03518525adcbaf89010f4ac/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:05:36 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6851abc6444d34520f9220c9b021fd039ba78a6db03518525adcbaf89010f4ac/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:05:36 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6851abc6444d34520f9220c9b021fd039ba78a6db03518525adcbaf89010f4ac/merged/var/lib/ceph/mon/ceph-np0005548788 supports timestamps until 2038 (0x7fffffff)
Dec 06 10:05:36 np0005548788.localdomain podman[293605]: 2025-12-06 10:05:36.52080284 +0000 UTC m=+0.128148766 container init 314d98beac8c4593b161da2602f8412f10c90afaa83b6435702a9d4d03120a2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548788, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:05:36 np0005548788.localdomain podman[293605]: 2025-12-06 10:05:36.530337682 +0000 UTC m=+0.137683598 container start 314d98beac8c4593b161da2602f8412f10c90afaa83b6435702a9d4d03120a2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548788, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, version=7, release=1763362218, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:05:36 np0005548788.localdomain bash[293605]: 314d98beac8c4593b161da2602f8412f10c90afaa83b6435702a9d4d03120a2b
Dec 06 10:05:36 np0005548788.localdomain podman[293605]: 2025-12-06 10:05:36.430460135 +0000 UTC m=+0.037806091 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:36 np0005548788.localdomain systemd[1]: Started Ceph mon.np0005548788 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pidfile_write: ignore empty --pid-file
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: load: jerasure load: lrc 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: RocksDB version: 7.9.2
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Git sha 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: DB SUMMARY
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: DB Session ID:  IM8HDYJUB6SO5D2H95ML
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: CURRENT file:  CURRENT
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005548788/store.db dir, Total Num: 0, files: 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005548788/store.db: 000004.log size: 886 ; 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                         Options.error_if_exists: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                       Options.create_if_missing: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                                     Options.env: 0x564f181e59e0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                                Options.info_log: 0x564f1a09ed20
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                              Options.statistics: (nil)
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                               Options.use_fsync: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                              Options.db_log_dir: 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                                 Options.wal_dir: 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                    Options.write_buffer_manager: 0x564f1a0af540
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.unordered_write: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                               Options.row_cache: None
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                              Options.wal_filter: None
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.two_write_queues: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.wal_compression: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.atomic_flush: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.max_background_jobs: 2
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.max_background_compactions: -1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.max_subcompactions: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.max_total_wal_size: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                          Options.max_open_files: -1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:       Options.compaction_readahead_size: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Compression algorithms supported:
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         kZSTD supported: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         kXpressCompression supported: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         kBZip2Compression supported: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         kLZ4Compression supported: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         kZlibCompression supported: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         kSnappyCompression supported: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005548788/store.db/MANIFEST-000005
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:           Options.merge_operator: 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:        Options.compaction_filter: None
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564f1a09e980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x564f1a09b350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:        Options.write_buffer_size: 33554432
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:  Options.max_write_buffer_number: 2
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:          Options.compression: NoCompression
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.num_levels: 7
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 10:05:36 np0005548788.localdomain sudo[293033]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                           Options.bloom_locality: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                               Options.ttl: 2592000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                       Options.enable_blob_files: false
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                           Options.min_blob_size: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005548788/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: e6488537-0b5c-4b08-a28c-5ae83bf6aba9
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015536577776, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015536580227, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015536580331, "job": 1, "event": "recovery_finished"}
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x564f1a0c2e00
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: DB pointer 0x564f1a1b8000
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 does not exist in monmap, will attempt to join an existing cluster
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.96 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.96 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x564f1a09b350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.3e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: using public_addr v2:172.18.0.103:0/0 -> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: starting mon.np0005548788 rank -1 at public addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] at bind addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005548788 fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(???) e0 preinit fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:36 np0005548788.localdomain sudo[293642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:36 np0005548788.localdomain sudo[293642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293642]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing) e8 sync_obtain_latest_monmap
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing) e8 sync_obtain_latest_monmap obtained monmap e8
Dec 06 10:05:36 np0005548788.localdomain sudo[293699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:36 np0005548788.localdomain sudo[293699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293699]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain sudo[293717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:36 np0005548788.localdomain sudo[293717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293717]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain sudo[293735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:36 np0005548788.localdomain sudo[293735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548788.localdomain sudo[293735]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing).mds e16 new map
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        16
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-12-06T08:18:49.925523+0000
                                                           modified        2025-12-06T10:03:02.051468+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        87
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26356}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26356 members: 26356
                                                           [mds.mds.np0005548790.vhcezv{0:26356} state up:active seq 16 addr [v2:172.18.0.108:6808/1621657194,v1:172.18.0.108:6809/1621657194] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005548789.vxwwsq{-1:16884} state up:standby seq 1 addr [v2:172.18.0.107:6808/3033303281,v1:172.18.0.107:6809/3033303281] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005548788.erzujf{-1:16890} state up:standby seq 1 addr [v2:172.18.0.106:6808/309324236,v1:172.18.0.106:6809/309324236] compat {c=[1],r=[1],i=[17ff]}]
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing).osd e88 crush map has features 3314933000854323200, adjusting msgr requires
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing).osd e88 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing).osd e88 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing).osd e88 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Removing np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548785.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Added label _no_schedule to host np0005548785.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548785.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Removing daemon crash.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='client.34199 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548785.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='client.26806 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548785.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"}]': finished
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Removed host np0005548785.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Removing key for client.crash.np0005548785.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"}]': finished
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.26610 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Saving service mon spec with placement label:mon
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.26625 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548788"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Remove daemons mon.np0005548788
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Safe to remove mon.np0005548788: new quorum should be ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789'] (from ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789'])
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Removing monitor np0005548788 from monmap...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Removing daemon mon.np0005548788 from np0005548788.localdomain -- ports []
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548786 calling monitor election
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548789 calling monitor election
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 calling monitor election
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548789 in quorum (ranks 0,2,3)
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: overall HEALTH_OK
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 calling monitor election
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 is new leader, mons np0005548787,np0005548786,np0005548790,np0005548789 in quorum (ranks 0,1,2,3)
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: monmap epoch 8
Dec 06 10:05:37 np0005548788.localdomain sudo[293769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:05:17.086581+0000
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mgrmap e18: np0005548788.yvwbqq(active, since 53s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: overall HEALTH_OK
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain sudo[293769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3699539753' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2165954404' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2317264595' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2669262318' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/556966387' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2556899461' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='client.26676 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005548788.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Deploying daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(synchronizing).paxosservice(auth 1..36) refresh upgraded, format 0 -> 3
Dec 06 10:05:37 np0005548788.localdomain sudo[293769]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:37 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mon.np0005548788 172.18.0.106:0/2135287473; not ready for session (expect reconnect)
Dec 06 10:05:37 np0005548788.localdomain ceph-mgr[286998]: mgr finish mon failed to return metadata for mon.np0005548788: (2) No such file or directory
Dec 06 10:05:37 np0005548788.localdomain ceph-mgr[286998]: mgr finish mon failed to return metadata for mon.np0005548788: (22) Invalid argument
Dec 06 10:05:37 np0005548788.localdomain sudo[293787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:37 np0005548788.localdomain sudo[293787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:37 np0005548788.localdomain sudo[293787]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:37 np0005548788.localdomain sudo[293805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:37 np0005548788.localdomain sudo[293805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:37 np0005548788.localdomain sudo[293805]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:38 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mon.np0005548788 172.18.0.106:0/2135287473; not ready for session (expect reconnect)
Dec 06 10:05:38 np0005548788.localdomain ceph-mgr[286998]: mgr finish mon failed to return metadata for mon.np0005548788: (22) Invalid argument
Dec 06 10:05:38 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:05:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:05:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:05:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:05:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:05:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:05:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:05:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:05:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(probing) e8 handle_auth_request failed to assign global_id
Dec 06 10:05:39 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mon.np0005548788 172.18.0.106:0/2135287473; not ready for session (expect reconnect)
Dec 06 10:05:39 np0005548788.localdomain ceph-mgr[286998]: mgr finish mon failed to return metadata for mon.np0005548788: (22) Invalid argument
Dec 06 10:05:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@-1(probing) e9  my rank is now 4 (was -1)
Dec 06 10:05:39 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:05:39 np0005548788.localdomain ceph-mon[293643]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 06 10:05:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(electing) e9 handle_auth_request failed to assign global_id
Dec 06 10:05:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(electing) e9 handle_auth_request failed to assign global_id
Dec 06 10:05:40 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mon.np0005548788 172.18.0.106:0/2135287473; not ready for session (expect reconnect)
Dec 06 10:05:40 np0005548788.localdomain ceph-mgr[286998]: mgr finish mon failed to return metadata for mon.np0005548788: (22) Invalid argument
Dec 06 10:05:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(electing) e9 handle_auth_request failed to assign global_id
Dec 06 10:05:40 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:41 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mon.np0005548788 172.18.0.106:0/2135287473; not ready for session (expect reconnect)
Dec 06 10:05:41 np0005548788.localdomain ceph-mgr[286998]: mgr finish mon failed to return metadata for mon.np0005548788: (22) Invalid argument
Dec 06 10:05:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(electing) e9 handle_auth_request failed to assign global_id
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mon.np0005548788 172.18.0.106:0/2135287473; not ready for session (expect reconnect)
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: mgr finish mon failed to return metadata for mon.np0005548788: (22) Invalid argument
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mgrc update_daemon_metadata mon.np0005548788 metadata {addrs=[v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005548788.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005548788.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116604,os=Linux}
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 handle_auth_request failed to assign global_id
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 handle_auth_request failed to assign global_id
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 handle_auth_request failed to assign global_id
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 handle_auth_request failed to assign global_id
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548786 calling monitor election
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548789 calling monitor election
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 calling monitor election
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 calling monitor election
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548789 in quorum (ranks 0,2,3)
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: overall HEALTH_OK
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 calling monitor election
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 is new leader, mons np0005548787,np0005548786,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3,4)
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: monmap epoch 9
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:05:37.030029+0000
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: mgrmap e18: np0005548788.yvwbqq(active, since 73s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: overall HEALTH_OK
Dec 06 10:05:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] update: starting ev 539df3e4-9f49-4310-9096-be43cd6a1bea (Updating node-proxy deployment (+5 -> 5))
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] complete: finished ev 539df3e4-9f49-4310-9096-be43cd6a1bea (Updating node-proxy deployment (+5 -> 5))
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Completed event 539df3e4-9f49-4310-9096-be43cd6a1bea (Updating node-proxy deployment (+5 -> 5)) in 0 seconds
Dec 06 10:05:42 np0005548788.localdomain sudo[293823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:05:42 np0005548788.localdomain sudo[293823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:42 np0005548788.localdomain sudo[293823]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:43 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mon.np0005548788 172.18.0.106:0/2135287473; not ready for session (expect reconnect)
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1008829953' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1008829953' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:43 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Writing back 50 completed events
Dec 06 10:05:43 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:43 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:44 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_report got status from non-daemon mon.np0005548788
Dec 06 10:05:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:44.028+0000 7fde36c01640 -1 mgr.server handle_report got status from non-daemon mon.np0005548788
Dec 06 10:05:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:05:44 np0005548788.localdomain podman[293841]: 2025-12-06 10:05:44.268730589 +0000 UTC m=+0.090188670 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:05:44 np0005548788.localdomain podman[293841]: 2025-12-06 10:05:44.316782975 +0000 UTC m=+0.138241136 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:05:44 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:05:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:45 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:45 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:45 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:45 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:46 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:46 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:46 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:46 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:46 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:46 np0005548788.localdomain sudo[293866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:46 np0005548788.localdomain sudo[293866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:46 np0005548788.localdomain sudo[293866]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:46 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:46 np0005548788.localdomain sudo[293884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:46 np0005548788.localdomain sudo[293884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:47 np0005548788.localdomain podman[293919]: 
Dec 06 10:05:47 np0005548788.localdomain podman[293919]: 2025-12-06 10:05:47.110710249 +0000 UTC m=+0.083518146 container create 0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_allen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, ceph=True, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1763362218, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True)
Dec 06 10:05:47 np0005548788.localdomain systemd[1]: Started libpod-conmon-0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221.scope.
Dec 06 10:05:47 np0005548788.localdomain podman[293919]: 2025-12-06 10:05:47.071584128 +0000 UTC m=+0.044392095 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:47 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:47 np0005548788.localdomain podman[293919]: 2025-12-06 10:05:47.191858301 +0000 UTC m=+0.164666198 container init 0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_allen, io.buildah.version=1.41.4, name=rhceph, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, version=7)
Dec 06 10:05:47 np0005548788.localdomain podman[293919]: 2025-12-06 10:05:47.202881989 +0000 UTC m=+0.175689886 container start 0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_allen, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z)
Dec 06 10:05:47 np0005548788.localdomain podman[293919]: 2025-12-06 10:05:47.20322802 +0000 UTC m=+0.176035967 container attach 0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_allen, version=7, com.redhat.component=rhceph-container, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_CLEAN=True)
Dec 06 10:05:47 np0005548788.localdomain determined_allen[293934]: 167 167
Dec 06 10:05:47 np0005548788.localdomain systemd[1]: libpod-0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221.scope: Deactivated successfully.
Dec 06 10:05:47 np0005548788.localdomain podman[293919]: 2025-12-06 10:05:47.207087698 +0000 UTC m=+0.179895615 container died 0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_allen, release=1763362218, ceph=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-26T19:44:28Z)
Dec 06 10:05:47 np0005548788.localdomain podman[293939]: 2025-12-06 10:05:47.305081087 +0000 UTC m=+0.090148149 container remove 0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_allen, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, vcs-type=git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.41.4)
Dec 06 10:05:47 np0005548788.localdomain systemd[1]: libpod-conmon-0d0938e4aa360751feede7d6297109560bd8f00731eea106c8f9f7d93f227221.scope: Deactivated successfully.
Dec 06 10:05:47 np0005548788.localdomain sudo[293884]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:47 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:47 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:05:47.427 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:05:47.429 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:05:47.429 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:47 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:47 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:47 np0005548788.localdomain sudo[293954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:47 np0005548788.localdomain sudo[293954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:47 np0005548788.localdomain sudo[293954]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:47 np0005548788.localdomain sudo[293972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:47 np0005548788.localdomain sudo[293972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:48 np0005548788.localdomain podman[294006]: 
Dec 06 10:05:48 np0005548788.localdomain podman[294006]: 2025-12-06 10:05:48.025564038 +0000 UTC m=+0.078218122 container create 3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_edison, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_CLEAN=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True)
Dec 06 10:05:48 np0005548788.localdomain systemd[1]: Started libpod-conmon-3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda.scope.
Dec 06 10:05:48 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:48 np0005548788.localdomain podman[294006]: 2025-12-06 10:05:48.087793479 +0000 UTC m=+0.140447563 container init 3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_edison, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1763362218, com.redhat.component=rhceph-container, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 06 10:05:48 np0005548788.localdomain podman[294006]: 2025-12-06 10:05:47.99370662 +0000 UTC m=+0.046360784 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:48 np0005548788.localdomain podman[294006]: 2025-12-06 10:05:48.097064034 +0000 UTC m=+0.149718128 container start 3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_edison, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:05:48 np0005548788.localdomain podman[294006]: 2025-12-06 10:05:48.097296961 +0000 UTC m=+0.149951045 container attach 3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_edison, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, architecture=x86_64, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:05:48 np0005548788.localdomain busy_edison[294021]: 167 167
Dec 06 10:05:48 np0005548788.localdomain systemd[1]: libpod-3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda.scope: Deactivated successfully.
Dec 06 10:05:48 np0005548788.localdomain podman[294006]: 2025-12-06 10:05:48.100049055 +0000 UTC m=+0.152703149 container died 3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_edison, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4)
Dec 06 10:05:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6ad86428410a7c5487d63dce00848e2e4ed35c4aca55b02e2fdbd1734c41ca0a-merged.mount: Deactivated successfully.
Dec 06 10:05:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-516038c253a7af080d6d5b33f583c916eb7aa987bcd68db0aac8f6d07c6a57d6-merged.mount: Deactivated successfully.
Dec 06 10:05:48 np0005548788.localdomain podman[294026]: 2025-12-06 10:05:48.212469587 +0000 UTC m=+0.098510746 container remove 3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_edison, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:05:48 np0005548788.localdomain systemd[1]: libpod-conmon-3f291cad5749df6ca6f2cd366f45efffde4581328acdeace5e66a7d119c0afda.scope: Deactivated successfully.
Dec 06 10:05:48 np0005548788.localdomain sudo[293972]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:05:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:48 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:48 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/3950593935' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:05:48 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:48 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:48 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:48 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:48 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:48 np0005548788.localdomain sudo[294049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:48 np0005548788.localdomain sudo[294049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:48 np0005548788.localdomain sudo[294049]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:48 np0005548788.localdomain sudo[294067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:48 np0005548788.localdomain sudo[294067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:05:49 np0005548788.localdomain podman[294101]: 2025-12-06 10:05:49.097294525 +0000 UTC m=+0.116311172 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 06 10:05:49 np0005548788.localdomain podman[294101]: 2025-12-06 10:05:49.111558602 +0000 UTC m=+0.130575199 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 10:05:49 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:05:49 np0005548788.localdomain podman[294109]: 
Dec 06 10:05:49 np0005548788.localdomain podman[294109]: 2025-12-06 10:05:49.142895994 +0000 UTC m=+0.139031439 container create 6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lichterman, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Dec 06 10:05:49 np0005548788.localdomain systemd[1]: Started libpod-conmon-6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c.scope.
Dec 06 10:05:49 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:49 np0005548788.localdomain podman[294109]: 2025-12-06 10:05:49.104936719 +0000 UTC m=+0.101072224 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:49 np0005548788.localdomain podman[294109]: 2025-12-06 10:05:49.216737532 +0000 UTC m=+0.212872977 container init 6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lichterman, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, CEPH_POINT_RELEASE=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_CLEAN=True, RELEASE=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, version=7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main)
Dec 06 10:05:49 np0005548788.localdomain podman[294109]: 2025-12-06 10:05:49.225678257 +0000 UTC m=+0.221813722 container start 6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lichterman, GIT_BRANCH=main, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:05:49 np0005548788.localdomain podman[294109]: 2025-12-06 10:05:49.225956595 +0000 UTC m=+0.222092070 container attach 6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lichterman, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, name=rhceph, ceph=True, release=1763362218, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:05:49 np0005548788.localdomain objective_lichterman[294137]: 167 167
Dec 06 10:05:49 np0005548788.localdomain systemd[1]: libpod-6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c.scope: Deactivated successfully.
Dec 06 10:05:49 np0005548788.localdomain podman[294109]: 2025-12-06 10:05:49.230560647 +0000 UTC m=+0.226696092 container died 6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lichterman, distribution-scope=public, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main)
Dec 06 10:05:49 np0005548788.localdomain podman[294142]: 2025-12-06 10:05:49.32676687 +0000 UTC m=+0.083401362 container remove 6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lichterman, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=1763362218, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.buildah.version=1.41.4)
Dec 06 10:05:49 np0005548788.localdomain systemd[1]: libpod-conmon-6e128696ce176beef23bdbdd8e3217222295f9662da672f31b1f744be81d1c8c.scope: Deactivated successfully.
Dec 06 10:05:49 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:49 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Reconfig service osd.default_drive_group
Dec 06 10:05:49 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Dec 06 10:05:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:05:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:49 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:49 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:49 np0005548788.localdomain ceph-mon[293643]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:49 np0005548788.localdomain sudo[294067]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:05:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:05:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:05:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:05:49 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:05:49 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:05:49 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:05:49 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:05:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:05:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18214 "" "Go-http-client/1.1"
Dec 06 10:05:49 np0005548788.localdomain sudo[294166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:49 np0005548788.localdomain sudo[294166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:49 np0005548788.localdomain sudo[294166]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:49 np0005548788.localdomain sudo[294184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:49 np0005548788.localdomain sudo[294184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-20b6e4c27c534efab1cbf35c3446e27506abc61747371254cc87c1f5d083babb-merged.mount: Deactivated successfully.
Dec 06 10:05:50 np0005548788.localdomain podman[294219]: 
Dec 06 10:05:50 np0005548788.localdomain podman[294219]: 2025-12-06 10:05:50.254760973 +0000 UTC m=+0.074214779 container create bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_knuth, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1763362218, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z)
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: Started libpod-conmon-bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1.scope.
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:50 np0005548788.localdomain podman[294219]: 2025-12-06 10:05:50.316776817 +0000 UTC m=+0.136230633 container init bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_knuth, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph)
Dec 06 10:05:50 np0005548788.localdomain podman[294219]: 2025-12-06 10:05:50.325772904 +0000 UTC m=+0.145226720 container start bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_knuth, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container)
Dec 06 10:05:50 np0005548788.localdomain podman[294219]: 2025-12-06 10:05:50.22534164 +0000 UTC m=+0.044795486 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:50 np0005548788.localdomain fervent_knuth[294233]: 167 167
Dec 06 10:05:50 np0005548788.localdomain podman[294219]: 2025-12-06 10:05:50.326851067 +0000 UTC m=+0.146304883 container attach bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_knuth, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218)
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: libpod-bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1.scope: Deactivated successfully.
Dec 06 10:05:50 np0005548788.localdomain podman[294219]: 2025-12-06 10:05:50.33248768 +0000 UTC m=+0.151941526 container died bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_knuth, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z)
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:05:50 np0005548788.localdomain podman[294238]: 2025-12-06 10:05:50.451246436 +0000 UTC m=+0.112235587 container remove bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_knuth, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhceph, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: libpod-conmon-bd69a0aae28c1af06b5289bf0b2d6dc7573328368ec5ba31c1b0cd12d656d3d1.scope: Deactivated successfully.
Dec 06 10:05:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: Reconfig service osd.default_drive_group
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548788.localdomain podman[294244]: 2025-12-06 10:05:50.509985819 +0000 UTC m=+0.158287180 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:05:50 np0005548788.localdomain sudo[294184]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:50 np0005548788.localdomain podman[294244]: 2025-12-06 10:05:50.52463711 +0000 UTC m=+0.172938441 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:05:50 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:05:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:05:50 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:05:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:05:50 np0005548788.localdomain podman[294263]: 2025-12-06 10:05:50.606317198 +0000 UTC m=+0.152451963 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Dec 06 10:05:50 np0005548788.localdomain podman[294263]: 2025-12-06 10:05:50.624421223 +0000 UTC m=+0.170556028 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:05:50 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:05:50 np0005548788.localdomain sudo[294289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:50 np0005548788.localdomain sudo[294289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:50 np0005548788.localdomain sudo[294289]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:50 np0005548788.localdomain sudo[294316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:50 np0005548788.localdomain sudo[294316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-13ef41e82584d7e1bbb79a583669b485f29c0de320bf7eadf161e249d3167c27-merged.mount: Deactivated successfully.
Dec 06 10:05:51 np0005548788.localdomain podman[294352]: 
Dec 06 10:05:51 np0005548788.localdomain podman[294352]: 2025-12-06 10:05:51.23137012 +0000 UTC m=+0.081218104 container create 0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_germain, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, ceph=True)
Dec 06 10:05:51 np0005548788.localdomain systemd[1]: Started libpod-conmon-0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de.scope.
Dec 06 10:05:51 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3205170338' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain podman[294352]: 2025-12-06 10:05:51.199719959 +0000 UTC m=+0.049567943 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:51 np0005548788.localdomain podman[294352]: 2025-12-06 10:05:51.302421441 +0000 UTC m=+0.152269425 container init 0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_germain, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc.)
Dec 06 10:05:51 np0005548788.localdomain podman[294352]: 2025-12-06 10:05:51.312105669 +0000 UTC m=+0.161953653 container start 0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_germain, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, RELEASE=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4)
Dec 06 10:05:51 np0005548788.localdomain podman[294352]: 2025-12-06 10:05:51.313405199 +0000 UTC m=+0.163253253 container attach 0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_germain, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218)
Dec 06 10:05:51 np0005548788.localdomain naughty_germain[294367]: 167 167
Dec 06 10:05:51 np0005548788.localdomain systemd[1]: libpod-0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de.scope: Deactivated successfully.
Dec 06 10:05:51 np0005548788.localdomain podman[294352]: 2025-12-06 10:05:51.317395992 +0000 UTC m=+0.167243996 container died 0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_germain, RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, vcs-type=git, release=1763362218, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon).osd e88 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon).osd e88 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon).osd e89 e89: 6 total, 6 up, 6 in
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: mgr handle_mgr_map I was active but no longer am
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 06 10:05:51 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:51.378+0000 7fde93153640 -1 mgr handle_mgr_map I was active but no longer am
Dec 06 10:05:51 np0005548788.localdomain podman[294372]: 2025-12-06 10:05:51.423491018 +0000 UTC m=+0.096424561 container remove 0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_germain, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, build-date=2025-11-26T19:44:28Z, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=1763362218, name=rhceph, ceph=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:05:51 np0005548788.localdomain systemd[1]: libpod-conmon-0c9877c948b6f1874c7004f9d3aeca9b8f44a1a9aa5e60d8cf91fbf68806b4de.scope: Deactivated successfully.
Dec 06 10:05:51 np0005548788.localdomain sshd[289394]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:05:51 np0005548788.localdomain systemd-logind[765]: Session 64 logged out. Waiting for processes to exit.
Dec 06 10:05:51 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: ignoring --setuser ceph since I am not root
Dec 06 10:05:51 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: ignoring --setgroup ceph since I am not root
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: pidfile_write: ignore empty --pid-file
Dec 06 10:05:51 np0005548788.localdomain sudo[294316]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:51 np0005548788.localdomain systemd[1]: session-64.scope: Deactivated successfully.
Dec 06 10:05:51 np0005548788.localdomain systemd[1]: session-64.scope: Consumed 26.744s CPU time.
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'alerts'
Dec 06 10:05:51 np0005548788.localdomain systemd-logind[765]: Removed session 64.
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/3205170338' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: Activating manager daemon np0005548787.umwsra
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: mgrmap e19: np0005548787.umwsra(active, starting, since 0.0519269s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548786.mczynb", "id": "np0005548786.mczynb"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: Manager daemon np0005548787.umwsra is now available
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain.devices.0"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain.devices.0"}]': finished
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain.devices.0"} : dispatch
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain.devices.0"}]': finished
Dec 06 10:05:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon).osd e89 _set_new_cache_sizes cache_size:1019500406 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'balancer'
Dec 06 10:05:51 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:51.611+0000 7f68adc50140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:05:51 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'cephadm'
Dec 06 10:05:51 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:51.679+0000 7f68adc50140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:05:51 np0005548788.localdomain sshd[294412]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:05:51 np0005548788.localdomain sshd[294412]: Accepted publickey for ceph-admin from 192.168.122.105 port 59996 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:05:51 np0005548788.localdomain systemd-logind[765]: New session 67 of user ceph-admin.
Dec 06 10:05:51 np0005548788.localdomain systemd[1]: Started Session 67 of User ceph-admin.
Dec 06 10:05:51 np0005548788.localdomain sshd[294412]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:05:51 np0005548788.localdomain sudo[294416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:51 np0005548788.localdomain sudo[294416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:51 np0005548788.localdomain sudo[294416]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:52 np0005548788.localdomain sudo[294434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:05:52 np0005548788.localdomain sudo[294434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ef9a77f97555d9b9f96ba603f56a651f034f2a99dbf66b58807a971f8ef65e22-merged.mount: Deactivated successfully.
Dec 06 10:05:52 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'crash'
Dec 06 10:05:52 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:05:52 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'dashboard'
Dec 06 10:05:52 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:52.336+0000 7f68adc50140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:05:52 np0005548788.localdomain ceph-mon[293643]: removing stray HostCache host record np0005548785.localdomain.devices.0
Dec 06 10:05:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548787.umwsra/mirror_snapshot_schedule"} : dispatch
Dec 06 10:05:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548787.umwsra/trash_purge_schedule"} : dispatch
Dec 06 10:05:52 np0005548788.localdomain ceph-mon[293643]: mgrmap e20: np0005548787.umwsra(active, since 1.0797s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:52 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'devicehealth'
Dec 06 10:05:52 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:05:52 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 10:05:52 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:52.902+0000 7f68adc50140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:05:52 np0005548788.localdomain systemd[1]: tmp-crun.z0jCAP.mount: Deactivated successfully.
Dec 06 10:05:52 np0005548788.localdomain podman[294528]: 2025-12-06 10:05:52.943280102 +0000 UTC m=+0.108551544 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., release=1763362218, com.redhat.component=rhceph-container, ceph=True, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph)
Dec 06 10:05:53 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 10:05:53 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 10:05:53 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]:   from numpy import show_config as show_numpy_config
Dec 06 10:05:53 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:53.046+0000 7f68adc50140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'influx'
Dec 06 10:05:53 np0005548788.localdomain podman[294528]: 2025-12-06 10:05:53.069385554 +0000 UTC m=+0.234656976 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, CEPH_POINT_RELEASE=, vcs-type=git, distribution-scope=public)
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'insights'
Dec 06 10:05:53 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:53.106+0000 7f68adc50140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'iostat'
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'k8sevents'
Dec 06 10:05:53 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:53.218+0000 7f68adc50140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:05:53 np0005548788.localdomain ceph-mon[293643]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:53 np0005548788.localdomain ceph-mon[293643]: mgrmap e21: np0005548787.umwsra(active, since 2s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'localpool'
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 10:05:53 np0005548788.localdomain sudo[294434]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'mirroring'
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'nfs'
Dec 06 10:05:53 np0005548788.localdomain sudo[294642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:53 np0005548788.localdomain sudo[294642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:53 np0005548788.localdomain sudo[294642]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:53 np0005548788.localdomain sudo[294660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:05:53 np0005548788.localdomain sudo[294660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:05:53 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'orchestrator'
Dec 06 10:05:53 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:53.953+0000 7f68adc50140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 10:05:54 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:54.095+0000 7f68adc50140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'osd_support'
Dec 06 10:05:54 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:54.158+0000 7f68adc50140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:54.212+0000 7f68adc50140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'progress'
Dec 06 10:05:54 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:54.276+0000 7f68adc50140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'prometheus'
Dec 06 10:05:54 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:54.335+0000 7f68adc50140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain sudo[294660]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:05:52] ENGINE Bus STARTING
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:05:52] ENGINE Serving on https://172.18.0.105:7150
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:05:52] ENGINE Client ('172.18.0.105', 55368) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:05:53] ENGINE Serving on http://172.18.0.105:8765
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:05:53] ENGINE Bus STARTED
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rbd_support'
Dec 06 10:05:54 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:54.628+0000 7f68adc50140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain sudo[294711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:54 np0005548788.localdomain sudo[294711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:54 np0005548788.localdomain sudo[294711]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'restful'
Dec 06 10:05:54 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:54.708+0000 7f68adc50140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:05:54 np0005548788.localdomain sudo[294729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:05:54 np0005548788.localdomain sudo[294729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:54 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rgw'
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rook'
Dec 06 10:05:55 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:55.027+0000 7f68adc50140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain sudo[294729]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548788.localdomain sudo[294767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:05:55 np0005548788.localdomain sudo[294767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548788.localdomain sudo[294767]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548788.localdomain sudo[294785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:05:55 np0005548788.localdomain sudo[294785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548788.localdomain sudo[294785]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:55.452+0000 7f68adc50140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'selftest'
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'snap_schedule'
Dec 06 10:05:55 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:55.513+0000 7f68adc50140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain sudo[294803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:55 np0005548788.localdomain sudo[294803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548788.localdomain sudo[294803]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'stats'
Dec 06 10:05:55 np0005548788.localdomain sudo[294821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:55 np0005548788.localdomain sudo[294821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548788.localdomain sudo[294821]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'status'
Dec 06 10:05:55 np0005548788.localdomain sudo[294839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:55 np0005548788.localdomain sudo[294839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548788.localdomain sudo[294839]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'telegraf'
Dec 06 10:05:55 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:55.699+0000 7f68adc50140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'telemetry'
Dec 06 10:05:55 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:55.756+0000 7f68adc50140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain sudo[294873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:55 np0005548788.localdomain sudo[294873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548788.localdomain sudo[294873]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548788.localdomain sudo[294891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:55 np0005548788.localdomain sudo[294891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548788.localdomain sudo[294891]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 10:05:55 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:55.885+0000 7f68adc50140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:05:55 np0005548788.localdomain sudo[294909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:05:55 np0005548788.localdomain sudo[294909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548788.localdomain sudo[294909]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain sudo[294927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:56 np0005548788.localdomain sudo[294927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[294927]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:05:56 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'volumes'
Dec 06 10:05:56 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:56.029+0000 7f68adc50140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:05:56 np0005548788.localdomain sudo[294945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:56 np0005548788.localdomain sudo[294945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[294945]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd/host:np0005548786", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: mgrmap e22: np0005548787.umwsra(active, since 4s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:56 np0005548788.localdomain sudo[294963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:56 np0005548788.localdomain sudo[294963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[294963]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:05:56 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'zabbix'
Dec 06 10:05:56 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:56.212+0000 7f68adc50140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:05:56 np0005548788.localdomain sudo[294981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:56 np0005548788.localdomain sudo[294981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[294981]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:05:56 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:05:56.269+0000 7f68adc50140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:05:56 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x5599fb291600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Dec 06 10:05:56 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.105:6800/2780596136
Dec 06 10:05:56 np0005548788.localdomain sudo[294999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:56 np0005548788.localdomain sudo[294999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[294999]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain sudo[295033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:56 np0005548788.localdomain sudo[295033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[295033]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain sudo[295051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:56 np0005548788.localdomain sudo[295051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[295051]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon).osd e89 _set_new_cache_sizes cache_size:1020040244 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:56 np0005548788.localdomain sudo[295069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:56 np0005548788.localdomain sudo[295069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[295069]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain sudo[295087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:05:56 np0005548788.localdomain sudo[295087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[295087]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain sudo[295105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:05:56 np0005548788.localdomain sudo[295105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[295105]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain sudo[295123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:05:56 np0005548788.localdomain sudo[295123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[295123]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain sudo[295141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:56 np0005548788.localdomain sudo[295141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[295141]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548788.localdomain sudo[295159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:05:56 np0005548788.localdomain sudo[295159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548788.localdomain sudo[295159]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548788.localdomain sudo[295193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295193]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548788.localdomain sudo[295211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295211]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548788.localdomain ceph-mon[293643]: Standby manager daemon np0005548788.yvwbqq started
Dec 06 10:05:57 np0005548788.localdomain sudo[295229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:57 np0005548788.localdomain sudo[295229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295229]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:57 np0005548788.localdomain sudo[295247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295247]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:57 np0005548788.localdomain sudo[295265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295265]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548788.localdomain sudo[295283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295283]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:57 np0005548788.localdomain sudo[295301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295301]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548788.localdomain sudo[295319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295319]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548788.localdomain sudo[295353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295353]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548788.localdomain sudo[295371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295371]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548788.localdomain sudo[295389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:57 np0005548788.localdomain sudo[295389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548788.localdomain sudo[295389]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: mgrmap e23: np0005548787.umwsra(active, since 5s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548788.localdomain sudo[295407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:05:58 np0005548788.localdomain sudo[295407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:58 np0005548788.localdomain sudo[295407]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:58 np0005548788.localdomain sudo[295425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:58 np0005548788.localdomain sudo[295425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:58 np0005548788.localdomain sudo[295425]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:58 np0005548788.localdomain sudo[295443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:58 np0005548788.localdomain sudo[295443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:59 np0005548788.localdomain podman[295478]: 
Dec 06 10:05:59 np0005548788.localdomain podman[295478]: 2025-12-06 10:05:59.186671057 +0000 UTC m=+0.074579701 container create 8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rosalind, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:59 np0005548788.localdomain systemd[1]: Started libpod-conmon-8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907.scope.
Dec 06 10:05:59 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:59 np0005548788.localdomain podman[295478]: 2025-12-06 10:05:59.156371317 +0000 UTC m=+0.044279961 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:59 np0005548788.localdomain podman[295478]: 2025-12-06 10:05:59.262232097 +0000 UTC m=+0.150140741 container init 8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rosalind, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, version=7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph)
Dec 06 10:05:59 np0005548788.localdomain podman[295478]: 2025-12-06 10:05:59.273766572 +0000 UTC m=+0.161675216 container start 8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rosalind, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:05:59 np0005548788.localdomain podman[295478]: 2025-12-06 10:05:59.273966338 +0000 UTC m=+0.161874992 container attach 8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rosalind, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, release=1763362218, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=)
Dec 06 10:05:59 np0005548788.localdomain fervent_rosalind[295493]: 167 167
Dec 06 10:05:59 np0005548788.localdomain systemd[1]: libpod-8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907.scope: Deactivated successfully.
Dec 06 10:05:59 np0005548788.localdomain podman[295478]: 2025-12-06 10:05:59.276775594 +0000 UTC m=+0.164684238 container died 8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rosalind, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True)
Dec 06 10:05:59 np0005548788.localdomain podman[295498]: 2025-12-06 10:05:59.371337217 +0000 UTC m=+0.084840366 container remove 8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rosalind, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.41.4, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git)
Dec 06 10:05:59 np0005548788.localdomain systemd[1]: libpod-conmon-8ef41936e54d3a6df030f2abe81573fe5133818683da1d2abf2e93564e3c2907.scope: Deactivated successfully.
Dec 06 10:05:59 np0005548788.localdomain sudo[295443]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.587575) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559588008, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11219, "num_deletes": 257, "total_data_size": 19110250, "memory_usage": 19897984, "flush_reason": "Manual Compaction"}
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559667831, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 15400836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11224, "table_properties": {"data_size": 15342034, "index_size": 31944, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25797, "raw_key_size": 272609, "raw_average_key_size": 26, "raw_value_size": 15166007, "raw_average_value_size": 1470, "num_data_blocks": 1224, "num_entries": 10310, "num_filter_entries": 10310, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 1765015536, "file_creation_time": 1765015559, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 80319 microseconds, and 31958 cpu microseconds.
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.667903) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 15400836 bytes OK
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.667937) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.669840) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.669886) EVENT_LOG_v1 {"time_micros": 1765015559669879, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.669908) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 19034340, prev total WAL file size 19082829, number of live WAL files 2.
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.673048) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(14MB) 8(2012B)]
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559673140, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 15402848, "oldest_snapshot_seqno": -1}
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10059 keys, 15397442 bytes, temperature: kUnknown
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559770749, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 15397442, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15339346, "index_size": 31883, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25157, "raw_key_size": 267766, "raw_average_key_size": 26, "raw_value_size": 15166662, "raw_average_value_size": 1507, "num_data_blocks": 1223, "num_entries": 10059, "num_filter_entries": 10059, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015559, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.771048) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 15397442 bytes
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.772745) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.6 rd, 157.6 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(14.7, 0.0 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10315, records dropped: 256 output_compression: NoCompression
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.772773) EVENT_LOG_v1 {"time_micros": 1765015559772761, "job": 4, "event": "compaction_finished", "compaction_time_micros": 97717, "compaction_time_cpu_micros": 37341, "output_level": 6, "num_output_files": 1, "total_output_size": 15397442, "num_input_records": 10315, "num_output_records": 10059, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:05:59.672952) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559776624, "job": 0, "event": "table_file_deletion", "file_number": 14}
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:59 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559776713, "job": 0, "event": "table_file_deletion", "file_number": 8}
Dec 06 10:05:59 np0005548788.localdomain sudo[295523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:59 np0005548788.localdomain sudo[295523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:59 np0005548788.localdomain sudo[295523]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:59 np0005548788.localdomain sudo[295541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:59 np0005548788.localdomain sudo[295541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4508a5afa918ec9dee22a30c3694550aab3f45cf4bfa008c063c96e028c92f25-merged.mount: Deactivated successfully.
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:00 np0005548788.localdomain podman[295575]: 
Dec 06 10:06:00 np0005548788.localdomain podman[295575]: 2025-12-06 10:06:00.363619494 +0000 UTC m=+0.075988194 container create 3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_mendeleev, ceph=True, vcs-type=git, release=1763362218, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.openshift.expose-services=, version=7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:06:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:06:00 np0005548788.localdomain systemd[1]: Started libpod-conmon-3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af.scope.
Dec 06 10:06:00 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:00 np0005548788.localdomain podman[295575]: 2025-12-06 10:06:00.333909051 +0000 UTC m=+0.046277791 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:00 np0005548788.localdomain podman[295575]: 2025-12-06 10:06:00.437062549 +0000 UTC m=+0.149431229 container init 3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_mendeleev, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-type=git, RELEASE=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True)
Dec 06 10:06:00 np0005548788.localdomain podman[295575]: 2025-12-06 10:06:00.448678595 +0000 UTC m=+0.161047275 container start 3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_mendeleev, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, description=Red Hat Ceph Storage 7, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.41.4, architecture=x86_64, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=)
Dec 06 10:06:00 np0005548788.localdomain podman[295575]: 2025-12-06 10:06:00.448950143 +0000 UTC m=+0.161318873 container attach 3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_mendeleev, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-26T19:44:28Z)
Dec 06 10:06:00 np0005548788.localdomain priceless_mendeleev[295591]: 167 167
Dec 06 10:06:00 np0005548788.localdomain podman[295575]: 2025-12-06 10:06:00.454110402 +0000 UTC m=+0.166479082 container died 3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_mendeleev, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, release=1763362218, RELEASE=main, GIT_CLEAN=True)
Dec 06 10:06:00 np0005548788.localdomain systemd[1]: libpod-3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af.scope: Deactivated successfully.
Dec 06 10:06:00 np0005548788.localdomain podman[295590]: 2025-12-06 10:06:00.544811507 +0000 UTC m=+0.143321722 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 06 10:06:00 np0005548788.localdomain podman[295590]: 2025-12-06 10:06:00.58659422 +0000 UTC m=+0.185104405 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:06:00 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:06:00 np0005548788.localdomain podman[295606]: 2025-12-06 10:06:00.648739348 +0000 UTC m=+0.185404414 container remove 3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_mendeleev, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.41.4, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:00 np0005548788.localdomain systemd[1]: libpod-conmon-3d3893d394e1c916e4f5fa34f7e035ac1e2239ff4ecb6caf45fc72916d57e9af.scope: Deactivated successfully.
Dec 06 10:06:00 np0005548788.localdomain sudo[295541]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d24d66052b3f4e49dfcec6f5b63e6c9c56dd67713547a1421dfe277bc8b2aa0f-merged.mount: Deactivated successfully.
Dec 06 10:06:01 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:06:01 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:06:01 np0005548788.localdomain ceph-mon[293643]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:06:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon).osd e89 _set_new_cache_sizes cache_size:1020054482 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:06:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:06:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:06:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:06:03 np0005548788.localdomain systemd[1]: tmp-crun.WJnYsK.mount: Deactivated successfully.
Dec 06 10:06:03 np0005548788.localdomain podman[295631]: 2025-12-06 10:06:03.261354805 +0000 UTC m=+0.091916073 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:06:03 np0005548788.localdomain podman[295631]: 2025-12-06 10:06:03.275724266 +0000 UTC m=+0.106285494 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:06:03 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:06:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='client.44134 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:06:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:06:05 np0005548788.localdomain systemd[1]: tmp-crun.9eLhPE.mount: Deactivated successfully.
Dec 06 10:06:05 np0005548788.localdomain podman[295654]: 2025-12-06 10:06:05.249688084 +0000 UTC m=+0.080392500 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:06:05 np0005548788.localdomain podman[295654]: 2025-12-06 10:06:05.285603757 +0000 UTC m=+0.116308193 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:06:05 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: Saving service mon spec with placement label:mon
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon).osd e89 _set_new_cache_sizes cache_size:1020054727 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:06:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: from='client.44149 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e9 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 06 10:06:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2211595861' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:06:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:06:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:06:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:06:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:06:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:06:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:06:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/2211595861' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:10 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:06:10 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:06:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:11 np0005548788.localdomain sudo[295672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:11 np0005548788.localdomain sudo[295672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:11 np0005548788.localdomain sudo[295672]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:12 np0005548788.localdomain sshd[291597]: Received disconnect from 192.168.122.11 port 57426:11: disconnected by user
Dec 06 10:06:12 np0005548788.localdomain sshd[291597]: Disconnected from user tripleo-admin 192.168.122.11 port 57426
Dec 06 10:06:12 np0005548788.localdomain sshd[291577]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 06 10:06:12 np0005548788.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Dec 06 10:06:12 np0005548788.localdomain systemd[1]: session-65.scope: Consumed 1.792s CPU time.
Dec 06 10:06:12 np0005548788.localdomain systemd-logind[765]: Session 65 logged out. Waiting for processes to exit.
Dec 06 10:06:12 np0005548788.localdomain systemd-logind[765]: Removed session 65.
Dec 06 10:06:13 np0005548788.localdomain sudo[295690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:13 np0005548788.localdomain sudo[295690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:13 np0005548788.localdomain sudo[295690]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:13 np0005548788.localdomain sudo[295708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:13 np0005548788.localdomain sudo[295708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/2329222999' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 06 10:06:13 np0005548788.localdomain podman[295744]: 
Dec 06 10:06:13 np0005548788.localdomain podman[295744]: 2025-12-06 10:06:13.802626961 +0000 UTC m=+0.088957422 container create 8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_keldysh, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.tags=rhceph ceph, name=rhceph, release=1763362218, vcs-type=git, distribution-scope=public, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 06 10:06:13 np0005548788.localdomain systemd[1]: Started libpod-conmon-8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a.scope.
Dec 06 10:06:13 np0005548788.localdomain podman[295744]: 2025-12-06 10:06:13.762132658 +0000 UTC m=+0.048463169 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:13 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:13 np0005548788.localdomain podman[295744]: 2025-12-06 10:06:13.889413716 +0000 UTC m=+0.175744177 container init 8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_keldysh, RELEASE=main, release=1763362218, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public)
Dec 06 10:06:13 np0005548788.localdomain systemd[1]: tmp-crun.vRKcCd.mount: Deactivated successfully.
Dec 06 10:06:13 np0005548788.localdomain podman[295744]: 2025-12-06 10:06:13.90517848 +0000 UTC m=+0.191508951 container start 8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_keldysh, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True, name=rhceph, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:06:13 np0005548788.localdomain podman[295744]: 2025-12-06 10:06:13.90548067 +0000 UTC m=+0.191811131 container attach 8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_keldysh, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph)
Dec 06 10:06:13 np0005548788.localdomain relaxed_keldysh[295758]: 167 167
Dec 06 10:06:13 np0005548788.localdomain systemd[1]: libpod-8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a.scope: Deactivated successfully.
Dec 06 10:06:13 np0005548788.localdomain podman[295744]: 2025-12-06 10:06:13.911721351 +0000 UTC m=+0.198051872 container died 8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_keldysh, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, RELEASE=main, release=1763362218, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:06:14 np0005548788.localdomain podman[295763]: 2025-12-06 10:06:14.010126052 +0000 UTC m=+0.089225921 container remove 8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_keldysh, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, GIT_BRANCH=main)
Dec 06 10:06:14 np0005548788.localdomain systemd[1]: libpod-conmon-8601ad6ea8021d7308da81c95d2169aff9755620fa690f97c8eadc6b2b15171a.scope: Deactivated successfully.
Dec 06 10:06:14 np0005548788.localdomain sudo[295708]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:14 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:06:14 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:06:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:06:14 np0005548788.localdomain podman[295779]: 2025-12-06 10:06:14.746916424 +0000 UTC m=+0.076526020 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:06:14 np0005548788.localdomain podman[295779]: 2025-12-06 10:06:14.787564623 +0000 UTC m=+0.117174169 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:06:14 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:06:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-38cf8c8f77ba3434cc6b75b54bb8762940b251c2ad7b1045a6d49bcdcc5100b6-merged.mount: Deactivated successfully.
Dec 06 10:06:15 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:06:15 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:06:15 np0005548788.localdomain ceph-mon[293643]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:16 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x5599fb291600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Dec 06 10:06:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@4(peon) e10  my rank is now 3 (was 4)
Dec 06 10:06:16 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:06:16 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:06:16 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x559a0377c000 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Dec 06 10:06:16 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:06:16 np0005548788.localdomain ceph-mon[293643]: paxos.3).electionLogic(44) init, last seen epoch 44
Dec 06 10:06:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: from='client.26906 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548786"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: Remove daemons mon.np0005548786
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: Safe to remove mon.np0005548786: new quorum should be ['np0005548787', 'np0005548790', 'np0005548789', 'np0005548788'] (from ['np0005548787', 'np0005548790', 'np0005548789', 'np0005548788'])
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: Removing monitor np0005548786 from monmap...
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: Removing daemon mon.np0005548786 from np0005548786.localdomain -- ports []
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548789 calling monitor election
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 calling monitor election
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 calling monitor election
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3)
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: monmap epoch 10
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:06:16.211793+0000
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: mgrmap e23: np0005548787.umwsra(active, since 26s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:06:18 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:06:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:06:19 np0005548788.localdomain systemd[1]: tmp-crun.seDDWi.mount: Deactivated successfully.
Dec 06 10:06:19 np0005548788.localdomain podman[295803]: 2025-12-06 10:06:19.263363066 +0000 UTC m=+0.092884292 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:06:19 np0005548788.localdomain sudo[295815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:06:19 np0005548788.localdomain sudo[295815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548788.localdomain sudo[295815]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548788.localdomain podman[295803]: 2025-12-06 10:06:19.301093565 +0000 UTC m=+0.130614811 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:06:19 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:06:19 np0005548788.localdomain sudo[295840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:06:19 np0005548788.localdomain sudo[295840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548788.localdomain sudo[295840]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548788.localdomain ceph-mon[293643]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:19 np0005548788.localdomain sudo[295858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:19 np0005548788.localdomain sudo[295858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548788.localdomain sudo[295858]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548788.localdomain sudo[295876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:19 np0005548788.localdomain sudo[295876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548788.localdomain sudo[295876]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:06:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:06:19 np0005548788.localdomain sudo[295894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:19 np0005548788.localdomain sudo[295894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548788.localdomain sudo[295894]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:06:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:06:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:06:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18216 "" "Go-http-client/1.1"
Dec 06 10:06:19 np0005548788.localdomain sudo[295928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:19 np0005548788.localdomain sudo[295928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548788.localdomain sudo[295928]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548788.localdomain sudo[295946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:19 np0005548788.localdomain sudo[295946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548788.localdomain sudo[295946]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548788.localdomain sudo[295964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:06:19 np0005548788.localdomain sudo[295964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548788.localdomain sudo[295964]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain sudo[295982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:06:20 np0005548788.localdomain sudo[295982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548788.localdomain sudo[295982]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain sudo[296000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:06:20 np0005548788.localdomain sudo[296000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548788.localdomain sudo[296000]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain sudo[296018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:20 np0005548788.localdomain sudo[296018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548788.localdomain sudo[296018]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain sudo[296036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:20 np0005548788.localdomain sudo[296036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548788.localdomain sudo[296036]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: Removed label mon from host np0005548786.localdomain
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:20 np0005548788.localdomain sudo[296054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:20 np0005548788.localdomain sudo[296054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548788.localdomain sudo[296054]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain sudo[296088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:20 np0005548788.localdomain sudo[296088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548788.localdomain sudo[296088]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain sudo[296106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:20 np0005548788.localdomain sudo[296106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:06:20 np0005548788.localdomain sudo[296106]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain sudo[296125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:20 np0005548788.localdomain sudo[296125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:06:20 np0005548788.localdomain sudo[296125]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548788.localdomain podman[296124]: 2025-12-06 10:06:20.731976009 +0000 UTC m=+0.098251819 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:06:20 np0005548788.localdomain podman[296124]: 2025-12-06 10:06:20.742347877 +0000 UTC m=+0.108623637 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:06:20 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:06:20 np0005548788.localdomain podman[296160]: 2025-12-06 10:06:20.82354734 +0000 UTC m=+0.084375422 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, config_id=edpm, name=ubi9-minimal, vcs-type=git, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Dec 06 10:06:20 np0005548788.localdomain podman[296160]: 2025-12-06 10:06:20.841529973 +0000 UTC m=+0.102358085 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7)
Dec 06 10:06:20 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.041740) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581041777, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1243, "num_deletes": 257, "total_data_size": 2169704, "memory_usage": 2212488, "flush_reason": "Manual Compaction"}
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581054662, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1158451, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11229, "largest_seqno": 12467, "table_properties": {"data_size": 1152861, "index_size": 2869, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13700, "raw_average_key_size": 20, "raw_value_size": 1140793, "raw_average_value_size": 1728, "num_data_blocks": 120, "num_entries": 660, "num_filter_entries": 660, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015559, "oldest_key_time": 1765015559, "file_creation_time": 1765015581, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 12968 microseconds, and 4704 cpu microseconds.
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.054708) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1158451 bytes OK
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.054729) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.056823) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.056873) EVENT_LOG_v1 {"time_micros": 1765015581056866, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.056892) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 2163230, prev total WAL file size 2163230, number of live WAL files 2.
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.057539) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303330' seq:72057594037927935, type:22 .. '6B760031323837' seq:0, type:0; will stop at (end)
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1131KB)], [15(14MB)]
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581057571, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 16555893, "oldest_snapshot_seqno": -1}
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10183 keys, 15590733 bytes, temperature: kUnknown
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581137675, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 15590733, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15531974, "index_size": 32226, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25477, "raw_key_size": 272539, "raw_average_key_size": 26, "raw_value_size": 15357103, "raw_average_value_size": 1508, "num_data_blocks": 1220, "num_entries": 10183, "num_filter_entries": 10183, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015581, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.138272) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 15590733 bytes
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.140287) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 206.2 rd, 194.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 14.7 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(27.7) write-amplify(13.5) OK, records in: 10719, records dropped: 536 output_compression: NoCompression
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.140326) EVENT_LOG_v1 {"time_micros": 1765015581140309, "job": 6, "event": "compaction_finished", "compaction_time_micros": 80302, "compaction_time_cpu_micros": 41369, "output_level": 6, "num_output_files": 1, "total_output_size": 15590733, "num_input_records": 10719, "num_output_records": 10183, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581140703, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581143471, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.057495) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.143553) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.143563) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.143568) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.143572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:21.143576) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='client.34289 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: Removed label mgr from host np0005548786.localdomain
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:22 np0005548788.localdomain ceph-mon[293643]: Removing daemon mgr.np0005548786.mczynb from np0005548786.localdomain -- ports [8765]
Dec 06 10:06:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Activating special unit Exit the Session...
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Stopped target Main User Target.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Stopped target Basic System.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Stopped target Paths.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Stopped target Sockets.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Stopped target Timers.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Closed D-Bus User Message Bus Socket.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Stopped Create User's Volatile Files and Directories.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Removed slice User Application Slice.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Reached target Shutdown.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Finished Exit the Session.
Dec 06 10:06:22 np0005548788.localdomain systemd[291581]: Reached target Exit the Session.
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 06 10:06:22 np0005548788.localdomain systemd[1]: user-1003.slice: Consumed 2.438s CPU time.
Dec 06 10:06:23 np0005548788.localdomain sudo[296187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:23 np0005548788.localdomain sudo[296187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:23 np0005548788.localdomain sudo[296187]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:23 np0005548788.localdomain ceph-mon[293643]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:23 np0005548788.localdomain ceph-mon[293643]: from='client.44183 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:23 np0005548788.localdomain ceph-mon[293643]: Removed label _admin from host np0005548786.localdomain
Dec 06 10:06:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth rm", "entity": "mgr.np0005548786.mczynb"} : dispatch
Dec 06 10:06:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005548786.mczynb"}]': finished
Dec 06 10:06:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:24 np0005548788.localdomain ceph-mon[293643]: Removing key for mgr.np0005548786.mczynb
Dec 06 10:06:25 np0005548788.localdomain sudo[296205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:25 np0005548788.localdomain sudo[296205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:25 np0005548788.localdomain sudo[296205]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: Removing np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: Removing np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: Removing np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/544071182' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1605363813' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:28.332 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:28.349 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:28.349 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:28.349 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:28.349 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:06:28 np0005548788.localdomain ceph-mon[293643]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:28 np0005548788.localdomain sudo[296223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:28 np0005548788.localdomain sudo[296223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:28 np0005548788.localdomain sudo[296223]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:29.017 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:29 np0005548788.localdomain sudo[296241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:29 np0005548788.localdomain sudo[296241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:29 np0005548788.localdomain podman[296277]: 
Dec 06 10:06:29 np0005548788.localdomain podman[296277]: 2025-12-06 10:06:29.526404073 +0000 UTC m=+0.077454782 container create 16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_rhodes, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, ceph=True)
Dec 06 10:06:29 np0005548788.localdomain systemd[1]: Started libpod-conmon-16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113.scope.
Dec 06 10:06:29 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:29 np0005548788.localdomain podman[296277]: 2025-12-06 10:06:29.495272766 +0000 UTC m=+0.046323516 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:29 np0005548788.localdomain podman[296277]: 2025-12-06 10:06:29.606165104 +0000 UTC m=+0.157215813 container init 16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_rhodes, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:29 np0005548788.localdomain podman[296277]: 2025-12-06 10:06:29.615471481 +0000 UTC m=+0.166522170 container start 16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_rhodes, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, release=1763362218, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, version=7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public)
Dec 06 10:06:29 np0005548788.localdomain podman[296277]: 2025-12-06 10:06:29.615673037 +0000 UTC m=+0.166723836 container attach 16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_rhodes, name=rhceph, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z)
Dec 06 10:06:29 np0005548788.localdomain hardcore_rhodes[296292]: 167 167
Dec 06 10:06:29 np0005548788.localdomain systemd[1]: libpod-16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113.scope: Deactivated successfully.
Dec 06 10:06:29 np0005548788.localdomain podman[296277]: 2025-12-06 10:06:29.620446013 +0000 UTC m=+0.171496702 container died 16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_rhodes, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, RELEASE=main, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, release=1763362218, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph)
Dec 06 10:06:29 np0005548788.localdomain podman[296297]: 2025-12-06 10:06:29.717650201 +0000 UTC m=+0.085145428 container remove 16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_rhodes, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container)
Dec 06 10:06:29 np0005548788.localdomain systemd[1]: libpod-conmon-16c0756a8ef056f553ec92fa9f2dc77d2b253bedaf813c9f37cbf8dfa70af113.scope: Deactivated successfully.
Dec 06 10:06:29 np0005548788.localdomain sudo[296241]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/308530152' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:29 np0005548788.localdomain sudo[296313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:29 np0005548788.localdomain sudo[296313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:29 np0005548788.localdomain sudo[296313]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/308530152' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:29 np0005548788.localdomain sudo[296331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:29 np0005548788.localdomain sudo[296331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:30.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:30.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:06:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:30.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:06:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:30.022 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:06:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:30.023 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:30.023 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:30 np0005548788.localdomain podman[296367]: 2025-12-06 10:06:30.424573527 +0000 UTC m=+0.075194441 container create 704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_burnell, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1763362218, ceph=True)
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: Started libpod-conmon-704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0.scope.
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:30 np0005548788.localdomain podman[296367]: 2025-12-06 10:06:30.491894916 +0000 UTC m=+0.142515830 container init 704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_burnell, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4)
Dec 06 10:06:30 np0005548788.localdomain podman[296367]: 2025-12-06 10:06:30.394090241 +0000 UTC m=+0.044711185 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:30 np0005548788.localdomain podman[296367]: 2025-12-06 10:06:30.502017258 +0000 UTC m=+0.152638172 container start 704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_burnell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1763362218, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, build-date=2025-11-26T19:44:28Z)
Dec 06 10:06:30 np0005548788.localdomain podman[296367]: 2025-12-06 10:06:30.502264065 +0000 UTC m=+0.152885019 container attach 704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_burnell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-type=git, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, architecture=x86_64, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:06:30 np0005548788.localdomain objective_burnell[296382]: 167 167
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: libpod-704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0.scope: Deactivated successfully.
Dec 06 10:06:30 np0005548788.localdomain podman[296367]: 2025-12-06 10:06:30.506339711 +0000 UTC m=+0.156960685 container died 704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_burnell, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=1763362218, version=7, RELEASE=main, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-94fb4ee1e738789674ee68fdd55dfcd25ba02ef156b38e581d3a8003e06ef23b-merged.mount: Deactivated successfully.
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: tmp-crun.FrVvd0.mount: Deactivated successfully.
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6ea1ebeaf532e269f1c0c6daf09d79fb4ebcacbdde32a577e1d7aa365f42835a-merged.mount: Deactivated successfully.
Dec 06 10:06:30 np0005548788.localdomain podman[296387]: 2025-12-06 10:06:30.628579897 +0000 UTC m=+0.108800755 container remove 704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_burnell, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: libpod-conmon-704bce2d2d5cc76a0caaf8e4886df549f32b341dc59a3da377fdc2def0047de0.scope: Deactivated successfully.
Dec 06 10:06:30 np0005548788.localdomain podman[296401]: 2025-12-06 10:06:30.744520161 +0000 UTC m=+0.100436908 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:06:30 np0005548788.localdomain podman[296401]: 2025-12-06 10:06:30.787826752 +0000 UTC m=+0.143743509 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:06:30 np0005548788.localdomain sudo[296331]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:30 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:06:30 np0005548788.localdomain sudo[296426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:30 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:06:30 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:06:30 np0005548788.localdomain ceph-mon[293643]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:30 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3072124168' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:30 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:30 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:30 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:06:30 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:30 np0005548788.localdomain sudo[296426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:30 np0005548788.localdomain sudo[296426]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:30 np0005548788.localdomain sudo[296444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:30 np0005548788.localdomain sudo[296444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.025 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.026 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.027 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.028 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:06:31 np0005548788.localdomain podman[296498]: 2025-12-06 10:06:31.443511664 +0000 UTC m=+0.082997072 container create 9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_cori, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, release=1763362218, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4)
Dec 06 10:06:31 np0005548788.localdomain systemd[1]: Started libpod-conmon-9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510.scope.
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.483 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:06:31 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:31 np0005548788.localdomain podman[296498]: 2025-12-06 10:06:31.503363324 +0000 UTC m=+0.142848732 container init 9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_cori, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, vendor=Red Hat, Inc., architecture=x86_64, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.41.4, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:06:31 np0005548788.localdomain podman[296498]: 2025-12-06 10:06:31.512941548 +0000 UTC m=+0.152426956 container start 9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_cori, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph)
Dec 06 10:06:31 np0005548788.localdomain podman[296498]: 2025-12-06 10:06:31.413049577 +0000 UTC m=+0.052535015 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:31 np0005548788.localdomain podman[296498]: 2025-12-06 10:06:31.513225367 +0000 UTC m=+0.152710785 container attach 9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_cori, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main)
Dec 06 10:06:31 np0005548788.localdomain systemd[1]: libpod-9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510.scope: Deactivated successfully.
Dec 06 10:06:31 np0005548788.localdomain optimistic_cori[296515]: 167 167
Dec 06 10:06:31 np0005548788.localdomain podman[296498]: 2025-12-06 10:06:31.515635351 +0000 UTC m=+0.155120779 container died 9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_cori, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph)
Dec 06 10:06:31 np0005548788.localdomain systemd[1]: tmp-crun.T8BNpb.mount: Deactivated successfully.
Dec 06 10:06:31 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bc5bcc26eb72ab7600521906dea4faf99c0e7da3670769d3cabba514a2be86f0-merged.mount: Deactivated successfully.
Dec 06 10:06:31 np0005548788.localdomain podman[296520]: 2025-12-06 10:06:31.60541224 +0000 UTC m=+0.077388760 container remove 9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_cori, vcs-type=git, com.redhat.component=rhceph-container, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Dec 06 10:06:31 np0005548788.localdomain systemd[1]: libpod-conmon-9b95f791c0abc2d50a648176fc69411e25fdda9b9232d324bd13362716a2c510.scope: Deactivated successfully.
Dec 06 10:06:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.694 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.696 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12361MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.697 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.697 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:31 np0005548788.localdomain sudo[296444]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.751 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.752 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:06:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:31.770 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:06:31 np0005548788.localdomain sudo[296545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:31 np0005548788.localdomain sudo[296545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:31 np0005548788.localdomain sudo[296545]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:31 np0005548788.localdomain sudo[296563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:31 np0005548788.localdomain sudo[296563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/406415283' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:06:32 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3079514338' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:32.225 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:06:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:32.234 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:06:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:32.252 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:06:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:32.254 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:06:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:06:32.255 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:32 np0005548788.localdomain podman[296618]: 
Dec 06 10:06:32 np0005548788.localdomain podman[296618]: 2025-12-06 10:06:32.381977217 +0000 UTC m=+0.071603911 container create fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_curran, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc.)
Dec 06 10:06:32 np0005548788.localdomain systemd[1]: Started libpod-conmon-fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b.scope.
Dec 06 10:06:32 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:32 np0005548788.localdomain podman[296618]: 2025-12-06 10:06:32.448736299 +0000 UTC m=+0.138362993 container init fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_curran, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1763362218, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Dec 06 10:06:32 np0005548788.localdomain podman[296618]: 2025-12-06 10:06:32.352709368 +0000 UTC m=+0.042336102 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:32 np0005548788.localdomain podman[296618]: 2025-12-06 10:06:32.457962242 +0000 UTC m=+0.147588936 container start fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_curran, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Dec 06 10:06:32 np0005548788.localdomain podman[296618]: 2025-12-06 10:06:32.45822401 +0000 UTC m=+0.147850714 container attach fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_curran, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph)
Dec 06 10:06:32 np0005548788.localdomain reverent_curran[296632]: 167 167
Dec 06 10:06:32 np0005548788.localdomain systemd[1]: libpod-fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b.scope: Deactivated successfully.
Dec 06 10:06:32 np0005548788.localdomain podman[296618]: 2025-12-06 10:06:32.463016028 +0000 UTC m=+0.152642732 container died fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_curran, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, io.buildah.version=1.41.4, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:06:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9ba521933d698a77782c934f3050f88337020bed467b0e50e41c0d4e3bb49832-merged.mount: Deactivated successfully.
Dec 06 10:06:33 np0005548788.localdomain podman[296637]: 2025-12-06 10:06:33.071369355 +0000 UTC m=+0.594510463 container remove fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_curran, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=1763362218, build-date=2025-11-26T19:44:28Z, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:06:33 np0005548788.localdomain systemd[1]: libpod-conmon-fdb4ae68373a9c17164e98f7ec7e2e465c64285414b2dc8315adf55eb477843b.scope: Deactivated successfully.
Dec 06 10:06:33 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:06:33 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:06:33 np0005548788.localdomain ceph-mon[293643]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:33 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3079514338' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:33 np0005548788.localdomain sudo[296563]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:33 np0005548788.localdomain sudo[296654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:33 np0005548788.localdomain sudo[296654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:33 np0005548788.localdomain sudo[296654]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:33 np0005548788.localdomain sudo[296672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:33 np0005548788.localdomain sudo[296672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:06:33 np0005548788.localdomain podman[296690]: 2025-12-06 10:06:33.413981595 +0000 UTC m=+0.080466954 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:06:33 np0005548788.localdomain podman[296690]: 2025-12-06 10:06:33.427678786 +0000 UTC m=+0.094164145 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:06:33 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:06:33 np0005548788.localdomain podman[296729]: 
Dec 06 10:06:33 np0005548788.localdomain podman[296729]: 2025-12-06 10:06:33.770728169 +0000 UTC m=+0.074900983 container create a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ramanujan, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, ceph=True, version=7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:33 np0005548788.localdomain systemd[1]: Started libpod-conmon-a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23.scope.
Dec 06 10:06:33 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:33 np0005548788.localdomain podman[296729]: 2025-12-06 10:06:33.837544423 +0000 UTC m=+0.141717227 container init a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ramanujan, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, distribution-scope=public, name=rhceph, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container)
Dec 06 10:06:33 np0005548788.localdomain podman[296729]: 2025-12-06 10:06:33.739331605 +0000 UTC m=+0.043504459 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:33 np0005548788.localdomain podman[296729]: 2025-12-06 10:06:33.847170059 +0000 UTC m=+0.151342863 container start a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ramanujan, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, architecture=x86_64, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:33 np0005548788.localdomain podman[296729]: 2025-12-06 10:06:33.847466848 +0000 UTC m=+0.151639682 container attach a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ramanujan, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, release=1763362218, version=7, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:33 np0005548788.localdomain admiring_ramanujan[296744]: 167 167
Dec 06 10:06:33 np0005548788.localdomain systemd[1]: libpod-a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23.scope: Deactivated successfully.
Dec 06 10:06:33 np0005548788.localdomain podman[296729]: 2025-12-06 10:06:33.853517763 +0000 UTC m=+0.157690597 container died a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ramanujan, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_BRANCH=main, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public)
Dec 06 10:06:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-90f4f71df27135595df00f346b189c638abfd8486a119198d25faeeae2227542-merged.mount: Deactivated successfully.
Dec 06 10:06:33 np0005548788.localdomain podman[296749]: 2025-12-06 10:06:33.945532341 +0000 UTC m=+0.083818708 container remove a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ramanujan, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:06:33 np0005548788.localdomain systemd[1]: libpod-conmon-a85f99b3aa975a77fc1cb643a54dc79349255badc39d66569dd7341cf4406e23.scope: Deactivated successfully.
Dec 06 10:06:34 np0005548788.localdomain sudo[296672]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:34 np0005548788.localdomain sudo[296766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:34 np0005548788.localdomain sudo[296766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:34 np0005548788.localdomain sudo[296766]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:34 np0005548788.localdomain sudo[296784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:34 np0005548788.localdomain sudo[296784]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:34 np0005548788.localdomain podman[296818]: 
Dec 06 10:06:34 np0005548788.localdomain podman[296818]: 2025-12-06 10:06:34.695041008 +0000 UTC m=+0.075077339 container create f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lewin, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:06:34 np0005548788.localdomain systemd[1]: Started libpod-conmon-f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b.scope.
Dec 06 10:06:34 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:34 np0005548788.localdomain podman[296818]: 2025-12-06 10:06:34.760147868 +0000 UTC m=+0.140184209 container init f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lewin, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, version=7, RELEASE=main)
Dec 06 10:06:34 np0005548788.localdomain podman[296818]: 2025-12-06 10:06:34.66455526 +0000 UTC m=+0.044591621 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:34 np0005548788.localdomain podman[296818]: 2025-12-06 10:06:34.769030081 +0000 UTC m=+0.149066412 container start f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lewin, ceph=True, distribution-scope=public, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218)
Dec 06 10:06:34 np0005548788.localdomain podman[296818]: 2025-12-06 10:06:34.769435884 +0000 UTC m=+0.149472255 container attach f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lewin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, version=7, io.buildah.version=1.41.4, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1763362218)
Dec 06 10:06:34 np0005548788.localdomain modest_lewin[296833]: 167 167
Dec 06 10:06:34 np0005548788.localdomain systemd[1]: libpod-f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b.scope: Deactivated successfully.
Dec 06 10:06:34 np0005548788.localdomain podman[296818]: 2025-12-06 10:06:34.772440516 +0000 UTC m=+0.152476917 container died f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lewin, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, architecture=x86_64, name=rhceph, io.buildah.version=1.41.4, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.)
Dec 06 10:06:34 np0005548788.localdomain systemd[1]: tmp-crun.p1V8Td.mount: Deactivated successfully.
Dec 06 10:06:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-45906e4786f4e655337ac028cf995d580a77fe4674ac305bc7b11d31073d2c95-merged.mount: Deactivated successfully.
Dec 06 10:06:34 np0005548788.localdomain podman[296838]: 2025-12-06 10:06:34.872731848 +0000 UTC m=+0.089272645 container remove f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lewin, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:06:34 np0005548788.localdomain systemd[1]: libpod-conmon-f5d36ccccabe665975615e790003788e3cd37e87fb40eaa341624210741fe34b.scope: Deactivated successfully.
Dec 06 10:06:34 np0005548788.localdomain sudo[296784]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: from='client.34370 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548786.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: Added label _no_schedule to host np0005548786.localdomain
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548786.localdomain
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:06:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:06:36 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:06:36 np0005548788.localdomain ceph-mon[293643]: from='client.44219 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548786.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:06:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:06:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:36 np0005548788.localdomain systemd[1]: tmp-crun.u5IIfu.mount: Deactivated successfully.
Dec 06 10:06:36 np0005548788.localdomain podman[296854]: 2025-12-06 10:06:36.246993156 +0000 UTC m=+0.079072651 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:06:36 np0005548788.localdomain podman[296854]: 2025-12-06 10:06:36.256693453 +0000 UTC m=+0.088772938 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:06:36 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:06:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain"} : dispatch
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain"}]': finished
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:06:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: from='client.26868 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548786.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: Removed host np0005548786.localdomain
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:06:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:06:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:06:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:06:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:06:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:06:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:06:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon) e10 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1005735917' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon) e10 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:06:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1005735917' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1005735917' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:06:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1005735917' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:06:40 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:06:40 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:06:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.048554) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601048637, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1049, "num_deletes": 258, "total_data_size": 1528838, "memory_usage": 1555000, "flush_reason": "Manual Compaction"}
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601058604, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 889561, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12472, "largest_seqno": 13516, "table_properties": {"data_size": 884640, "index_size": 2328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12862, "raw_average_key_size": 21, "raw_value_size": 873892, "raw_average_value_size": 1442, "num_data_blocks": 100, "num_entries": 606, "num_filter_entries": 606, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015581, "oldest_key_time": 1765015581, "file_creation_time": 1765015601, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 10101 microseconds, and 3314 cpu microseconds.
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.058665) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 889561 bytes OK
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.058688) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.061113) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.061133) EVENT_LOG_v1 {"time_micros": 1765015601061127, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.061156) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1523229, prev total WAL file size 1523553, number of live WAL files 2.
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.061901) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353135' seq:72057594037927935, type:22 .. '6C6F676D0033373638' seq:0, type:0; will stop at (end)
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(868KB)], [18(14MB)]
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601061959, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 16480294, "oldest_snapshot_seqno": -1}
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10245 keys, 16342382 bytes, temperature: kUnknown
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601151436, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 16342382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16281797, "index_size": 33860, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25669, "raw_key_size": 275387, "raw_average_key_size": 26, "raw_value_size": 16104514, "raw_average_value_size": 1571, "num_data_blocks": 1290, "num_entries": 10245, "num_filter_entries": 10245, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015601, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.151786) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 16342382 bytes
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.153416) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 184.0 rd, 182.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 14.9 +0.0 blob) out(15.6 +0.0 blob), read-write-amplify(36.9) write-amplify(18.4) OK, records in: 10789, records dropped: 544 output_compression: NoCompression
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.153450) EVENT_LOG_v1 {"time_micros": 1765015601153434, "job": 8, "event": "compaction_finished", "compaction_time_micros": 89575, "compaction_time_cpu_micros": 43825, "output_level": 6, "num_output_files": 1, "total_output_size": 16342382, "num_input_records": 10789, "num_output_records": 10245, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601153791, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601156398, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.061774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.156594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.156607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.156610) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.156613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:06:41.156617) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:42 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:06:42 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:06:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:06:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:06:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:06:43 np0005548788.localdomain ceph-mon[293643]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:06:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:44 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:06:44 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:06:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:06:45 np0005548788.localdomain podman[296872]: 2025-12-06 10:06:45.254897526 +0000 UTC m=+0.081550157 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:45 np0005548788.localdomain podman[296872]: 2025-12-06 10:06:45.324060912 +0000 UTC m=+0.150713603 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:06:45 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:06:46 np0005548788.localdomain sudo[296898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:46 np0005548788.localdomain sudo[296898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:46 np0005548788.localdomain sudo[296898]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: from='client.44243 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: Saving service mon spec with placement label:mon
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:47 np0005548788.localdomain ceph-mon[293643]: from='client.44251 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548789", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:06:47 np0005548788.localdomain ceph-mon[293643]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:06:47.428 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:06:47.429 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:06:47.429 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:47 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x5599fb291600 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Dec 06 10:06:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@3(peon) e11  my rank is now 2 (was 3)
Dec 06 10:06:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:06:47 np0005548788.localdomain ceph-mon[293643]: paxos.2).electionLogic(46) init, last seen epoch 46
Dec 06 10:06:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:06:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:06:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:06:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:06:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:06:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18218 "" "Go-http-client/1.1"
Dec 06 10:06:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:06:50 np0005548788.localdomain podman[296916]: 2025-12-06 10:06:50.28000121 +0000 UTC m=+0.103218823 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 06 10:06:50 np0005548788.localdomain podman[296916]: 2025-12-06 10:06:50.293587788 +0000 UTC m=+0.116805361 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:06:50 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:06:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:06:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:06:51 np0005548788.localdomain podman[296935]: 2025-12-06 10:06:51.262433205 +0000 UTC m=+0.083269571 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, architecture=x86_64)
Dec 06 10:06:51 np0005548788.localdomain podman[296935]: 2025-12-06 10:06:51.274742993 +0000 UTC m=+0.095579309 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Dec 06 10:06:51 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:06:51 np0005548788.localdomain podman[296934]: 2025-12-06 10:06:51.3631457 +0000 UTC m=+0.185461531 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:06:51 np0005548788.localdomain podman[296934]: 2025-12-06 10:06:51.37486313 +0000 UTC m=+0.197178961 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:06:51 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:06:52 np0005548788.localdomain ceph-mon[293643]: paxos.2).electionLogic(47) init, last seen epoch 47, mid-election, bumping
Dec 06 10:06:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:52 np0005548788.localdomain sudo[296978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:06:52 np0005548788.localdomain sudo[296978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:52 np0005548788.localdomain sudo[296978]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:52 np0005548788.localdomain sudo[296996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:06:52 np0005548788.localdomain sudo[296996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:52 np0005548788.localdomain sudo[296996]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:52 np0005548788.localdomain sudo[297014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:52 np0005548788.localdomain sudo[297014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:52 np0005548788.localdomain sudo[297014]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:52 np0005548788.localdomain sudo[297032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:52 np0005548788.localdomain sudo[297032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:52 np0005548788.localdomain sudo[297032]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:53 np0005548788.localdomain sudo[297050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297050]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:53 np0005548788.localdomain sudo[297084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297084]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:53 np0005548788.localdomain sudo[297102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297102]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain sudo[297120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297120]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:06:53 np0005548788.localdomain sudo[297138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297138]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:06:53 np0005548788.localdomain sudo[297156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297156]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:53 np0005548788.localdomain sudo[297174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297174]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:53 np0005548788.localdomain sudo[297192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297192]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 calling monitor election
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: monmap epoch 11
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:06:47.518948+0000
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: mgrmap e23: np0005548787.umwsra(active, since 61s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1/3 mons down, quorum np0005548787,np0005548790 (MON_DOWN)
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 calling monitor election
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548788 in quorum (ranks 0,1,2)
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: monmap epoch 11
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:06:47.518948+0000
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: mgrmap e23: np0005548787.umwsra(active, since 61s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005548787,np0005548790)
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain sudo[297210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:53 np0005548788.localdomain sudo[297210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297210]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:53 np0005548788.localdomain sudo[297244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297244]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:53 np0005548788.localdomain sudo[297262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297262]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548788.localdomain sudo[297280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:53 np0005548788.localdomain sudo[297280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548788.localdomain sudo[297280]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:54 np0005548788.localdomain sudo[297298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:54 np0005548788.localdomain sudo[297298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:54 np0005548788.localdomain sudo[297298]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:54 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:06:56 np0005548788.localdomain sudo[297316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:56 np0005548788.localdomain sudo[297316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:56 np0005548788.localdomain sudo[297316]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:56 np0005548788.localdomain sudo[297334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:56 np0005548788.localdomain sudo[297334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:56 np0005548788.localdomain podman[297368]: 
Dec 06 10:06:56 np0005548788.localdomain podman[297368]: 2025-12-06 10:06:56.919109909 +0000 UTC m=+0.082398493 container create 9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_noyce, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, build-date=2025-11-26T19:44:28Z, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:06:56 np0005548788.localdomain systemd[1]: Started libpod-conmon-9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1.scope.
Dec 06 10:06:56 np0005548788.localdomain podman[297368]: 2025-12-06 10:06:56.883579566 +0000 UTC m=+0.046868180 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:56 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:57 np0005548788.localdomain podman[297368]: 2025-12-06 10:06:56.999932603 +0000 UTC m=+0.163221197 container init 9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_noyce, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vcs-type=git, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True)
Dec 06 10:06:57 np0005548788.localdomain podman[297368]: 2025-12-06 10:06:57.016713989 +0000 UTC m=+0.180002573 container start 9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_noyce, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:06:57 np0005548788.localdomain podman[297368]: 2025-12-06 10:06:57.017105061 +0000 UTC m=+0.180393645 container attach 9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_noyce, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, release=1763362218, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4)
Dec 06 10:06:57 np0005548788.localdomain clever_noyce[297383]: 167 167
Dec 06 10:06:57 np0005548788.localdomain systemd[1]: libpod-9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1.scope: Deactivated successfully.
Dec 06 10:06:57 np0005548788.localdomain podman[297368]: 2025-12-06 10:06:57.024445187 +0000 UTC m=+0.187733821 container died 9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_noyce, GIT_BRANCH=main, name=rhceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, distribution-scope=public, version=7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 06 10:06:57 np0005548788.localdomain podman[297388]: 2025-12-06 10:06:57.134578441 +0000 UTC m=+0.096787806 container remove 9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_noyce, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:57 np0005548788.localdomain systemd[1]: libpod-conmon-9cc8d317b11da3813bc72a27a3fb6361141c73bdbdd90b3e1b264802f837d7b1.scope: Deactivated successfully.
Dec 06 10:06:57 np0005548788.localdomain sudo[297334]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:57 np0005548788.localdomain sudo[297405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:57 np0005548788.localdomain sudo[297405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:57 np0005548788.localdomain sudo[297405]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:57 np0005548788.localdomain sudo[297423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:57 np0005548788.localdomain sudo[297423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:57 np0005548788.localdomain ceph-mon[293643]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:57 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:06:57 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:06:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:06:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:57 np0005548788.localdomain podman[297458]: 
Dec 06 10:06:57 np0005548788.localdomain podman[297458]: 2025-12-06 10:06:57.841933 +0000 UTC m=+0.075947945 container create 8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_mclaren, com.redhat.component=rhceph-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, version=7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:57 np0005548788.localdomain systemd[1]: Started libpod-conmon-8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d.scope.
Dec 06 10:06:57 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:57 np0005548788.localdomain podman[297458]: 2025-12-06 10:06:57.908093314 +0000 UTC m=+0.142108249 container init 8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_mclaren, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:06:57 np0005548788.localdomain podman[297458]: 2025-12-06 10:06:57.811500506 +0000 UTC m=+0.045515471 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:57 np0005548788.localdomain podman[297458]: 2025-12-06 10:06:57.918525335 +0000 UTC m=+0.152540270 container start 8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_mclaren, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public)
Dec 06 10:06:57 np0005548788.localdomain podman[297458]: 2025-12-06 10:06:57.918778273 +0000 UTC m=+0.152793218 container attach 8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_mclaren, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, distribution-scope=public, release=1763362218, ceph=True)
Dec 06 10:06:57 np0005548788.localdomain gallant_mclaren[297473]: 167 167
Dec 06 10:06:57 np0005548788.localdomain systemd[1]: libpod-8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d.scope: Deactivated successfully.
Dec 06 10:06:57 np0005548788.localdomain podman[297458]: 2025-12-06 10:06:57.92195965 +0000 UTC m=+0.155974595 container died 8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_mclaren, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, version=7, com.redhat.component=rhceph-container, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=)
Dec 06 10:06:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-04faf119aa2ad7910ead5766389bd7c67eb61f0a53f58bfa95c10dc563eef49b-merged.mount: Deactivated successfully.
Dec 06 10:06:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-97ab9bc162e674abeed11da6704d53827fbc6f3032fd153d7df3573ff0408ecd-merged.mount: Deactivated successfully.
Dec 06 10:06:58 np0005548788.localdomain podman[297478]: 2025-12-06 10:06:58.026480072 +0000 UTC m=+0.090447800 container remove 8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_mclaren, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, release=1763362218, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:06:58 np0005548788.localdomain systemd[1]: libpod-conmon-8557c7b554b097a1c91ff6e3676c22ad08f571dc6f3b4457c4af11de40c7de5d.scope: Deactivated successfully.
Dec 06 10:06:58 np0005548788.localdomain sudo[297423]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:58 np0005548788.localdomain sudo[297501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:58 np0005548788.localdomain sudo[297501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:58 np0005548788.localdomain sudo[297501]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:58 np0005548788.localdomain sudo[297519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:58 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:06:58 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:06:58 np0005548788.localdomain sudo[297519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:06:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:58 np0005548788.localdomain podman[297553]: 
Dec 06 10:06:58 np0005548788.localdomain podman[297553]: 2025-12-06 10:06:58.879132389 +0000 UTC m=+0.077307288 container create 707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_meitner, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:06:58 np0005548788.localdomain systemd[1]: Started libpod-conmon-707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5.scope.
Dec 06 10:06:58 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:58 np0005548788.localdomain podman[297553]: 2025-12-06 10:06:58.94266105 +0000 UTC m=+0.140835949 container init 707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_meitner, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, distribution-scope=public, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:06:58 np0005548788.localdomain podman[297553]: 2025-12-06 10:06:58.848575589 +0000 UTC m=+0.046750548 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:58 np0005548788.localdomain systemd[1]: tmp-crun.Z0ATsl.mount: Deactivated successfully.
Dec 06 10:06:58 np0005548788.localdomain podman[297553]: 2025-12-06 10:06:58.956120734 +0000 UTC m=+0.154295633 container start 707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_meitner, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, build-date=2025-11-26T19:44:28Z, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:06:58 np0005548788.localdomain podman[297553]: 2025-12-06 10:06:58.956492546 +0000 UTC m=+0.154667485 container attach 707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_meitner, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=)
Dec 06 10:06:58 np0005548788.localdomain infallible_meitner[297568]: 167 167
Dec 06 10:06:58 np0005548788.localdomain systemd[1]: libpod-707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5.scope: Deactivated successfully.
Dec 06 10:06:58 np0005548788.localdomain podman[297553]: 2025-12-06 10:06:58.959659513 +0000 UTC m=+0.157834442 container died 707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_meitner, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, version=7)
Dec 06 10:06:59 np0005548788.localdomain podman[297573]: 2025-12-06 10:06:59.051935399 +0000 UTC m=+0.084075715 container remove 707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_meitner, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.buildah.version=1.41.4, version=7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, release=1763362218)
Dec 06 10:06:59 np0005548788.localdomain systemd[1]: libpod-conmon-707c67eff5360883f6f33480987409e67cc4f79ebc388fb0be87d1cf4a463cb5.scope: Deactivated successfully.
Dec 06 10:06:59 np0005548788.localdomain sudo[297519]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:59 np0005548788.localdomain sudo[297597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:59 np0005548788.localdomain sudo[297597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:59 np0005548788.localdomain sudo[297597]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:59 np0005548788.localdomain sudo[297615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:59 np0005548788.localdomain sudo[297615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:59 np0005548788.localdomain ceph-mon[293643]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:59 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:06:59 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:06:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:59 np0005548788.localdomain podman[297649]: 
Dec 06 10:06:59 np0005548788.localdomain podman[297649]: 2025-12-06 10:06:59.886288632 +0000 UTC m=+0.080670260 container create 09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_kepler, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, name=rhceph, ceph=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public)
Dec 06 10:06:59 np0005548788.localdomain systemd[1]: Started libpod-conmon-09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9.scope.
Dec 06 10:06:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cdad481f36ebeeaebd803a9066e8171f4160ee4cd8eec78a2b16b42797a82c5e-merged.mount: Deactivated successfully.
Dec 06 10:06:59 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:59 np0005548788.localdomain podman[297649]: 2025-12-06 10:06:59.949094363 +0000 UTC m=+0.143475981 container init 09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_kepler, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container)
Dec 06 10:06:59 np0005548788.localdomain podman[297649]: 2025-12-06 10:06:59.853877407 +0000 UTC m=+0.048259065 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:59 np0005548788.localdomain podman[297649]: 2025-12-06 10:06:59.95843999 +0000 UTC m=+0.152821608 container start 09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_kepler, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, RELEASE=main, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z)
Dec 06 10:06:59 np0005548788.localdomain podman[297649]: 2025-12-06 10:06:59.958690898 +0000 UTC m=+0.153072556 container attach 09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_kepler, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, release=1763362218, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64)
Dec 06 10:06:59 np0005548788.localdomain epic_kepler[297665]: 167 167
Dec 06 10:06:59 np0005548788.localdomain systemd[1]: libpod-09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9.scope: Deactivated successfully.
Dec 06 10:06:59 np0005548788.localdomain podman[297649]: 2025-12-06 10:06:59.961415642 +0000 UTC m=+0.155797290 container died 09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_kepler, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1763362218, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:07:00 np0005548788.localdomain podman[297670]: 2025-12-06 10:07:00.058120074 +0000 UTC m=+0.083985802 container remove 09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_kepler, architecture=x86_64, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218)
Dec 06 10:07:00 np0005548788.localdomain systemd[1]: libpod-conmon-09bd735ef9fb2cce34a9d939edba4363a6239ddbe9792192a4bf5ad34f8b62c9.scope: Deactivated successfully.
Dec 06 10:07:00 np0005548788.localdomain sudo[297615]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:00 np0005548788.localdomain sudo[297686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:00 np0005548788.localdomain sudo[297686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:00 np0005548788.localdomain sudo[297686]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:00 np0005548788.localdomain sudo[297704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:00 np0005548788.localdomain sudo[297704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:07:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:07:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:00 np0005548788.localdomain podman[297738]: 
Dec 06 10:07:00 np0005548788.localdomain podman[297738]: 2025-12-06 10:07:00.804239935 +0000 UTC m=+0.076077409 container create cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_moore, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:07:00 np0005548788.localdomain systemd[1]: Started libpod-conmon-cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc.scope.
Dec 06 10:07:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:07:00 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:00 np0005548788.localdomain podman[297738]: 2025-12-06 10:07:00.867506439 +0000 UTC m=+0.139343903 container init cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_moore, name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1763362218, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, version=7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Dec 06 10:07:00 np0005548788.localdomain podman[297738]: 2025-12-06 10:07:00.774410598 +0000 UTC m=+0.046248102 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:00 np0005548788.localdomain podman[297738]: 2025-12-06 10:07:00.876269079 +0000 UTC m=+0.148106543 container start cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_moore, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., release=1763362218, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:07:00 np0005548788.localdomain podman[297738]: 2025-12-06 10:07:00.876520557 +0000 UTC m=+0.148358041 container attach cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_moore, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.openshift.tags=rhceph ceph)
Dec 06 10:07:00 np0005548788.localdomain blissful_moore[297753]: 167 167
Dec 06 10:07:00 np0005548788.localdomain systemd[1]: libpod-cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc.scope: Deactivated successfully.
Dec 06 10:07:00 np0005548788.localdomain podman[297738]: 2025-12-06 10:07:00.879078756 +0000 UTC m=+0.150916260 container died cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_moore, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.41.4, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.tags=rhceph ceph, version=7)
Dec 06 10:07:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-276b1783a4e2bbc3be1d5efeadf2e51769933cb5801e8058e3ba71c1d265d87d-merged.mount: Deactivated successfully.
Dec 06 10:07:00 np0005548788.localdomain podman[297755]: 2025-12-06 10:07:00.954294637 +0000 UTC m=+0.096856438 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:07:00 np0005548788.localdomain podman[297755]: 2025-12-06 10:07:00.99473727 +0000 UTC m=+0.137299051 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:07:01 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:07:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f964854274fdb71ef39143b048a6990c7c67e937b0261e3cd58bacfa9bc15bc7-merged.mount: Deactivated successfully.
Dec 06 10:07:01 np0005548788.localdomain podman[297767]: 2025-12-06 10:07:01.026156206 +0000 UTC m=+0.133643179 container remove cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_moore, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:07:01 np0005548788.localdomain systemd[1]: libpod-conmon-cfbf8f5776fa4821d01f4097899309f4fb53c12a888e7cee04838fccff4741cc.scope: Deactivated successfully.
Dec 06 10:07:01 np0005548788.localdomain sudo[297704]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: from='client.44258 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005548789.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:02 np0005548788.localdomain ceph-mon[293643]: Deploying daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:07:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:07:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:07:02 np0005548788.localdomain ceph-mon[293643]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.161361) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623161442, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1018, "num_deletes": 251, "total_data_size": 1530224, "memory_usage": 1553328, "flush_reason": "Manual Compaction"}
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623170221, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 881085, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13521, "largest_seqno": 14534, "table_properties": {"data_size": 876229, "index_size": 2263, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 13172, "raw_average_key_size": 22, "raw_value_size": 865662, "raw_average_value_size": 1467, "num_data_blocks": 94, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015601, "oldest_key_time": 1765015601, "file_creation_time": 1765015623, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 8900 microseconds, and 3596 cpu microseconds.
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.170268) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 881085 bytes OK
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.170290) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.172318) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.172341) EVENT_LOG_v1 {"time_micros": 1765015623172335, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.172363) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1524743, prev total WAL file size 1539611, number of live WAL files 2.
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.175248) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(860KB)], [21(15MB)]
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623175297, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 17223467, "oldest_snapshot_seqno": -1}
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10303 keys, 13423424 bytes, temperature: kUnknown
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623274420, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13423424, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13364204, "index_size": 32367, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25797, "raw_key_size": 277835, "raw_average_key_size": 26, "raw_value_size": 13187599, "raw_average_value_size": 1279, "num_data_blocks": 1224, "num_entries": 10303, "num_filter_entries": 10303, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015623, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.274689) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13423424 bytes
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.276429) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.6 rd, 135.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 15.6 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(34.8) write-amplify(15.2) OK, records in: 10835, records dropped: 532 output_compression: NoCompression
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.276498) EVENT_LOG_v1 {"time_micros": 1765015623276485, "job": 10, "event": "compaction_finished", "compaction_time_micros": 99202, "compaction_time_cpu_micros": 40904, "output_level": 6, "num_output_files": 1, "total_output_size": 13423424, "num_input_records": 10835, "num_output_records": 10303, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.174501) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.276642) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.276648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.276651) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.276654) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:07:03.276657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623550089, "job": 0, "event": "table_file_deletion", "file_number": 23}
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623554813, "job": 0, "event": "table_file_deletion", "file_number": 21}
Dec 06 10:07:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:07:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:07:04 np0005548788.localdomain podman[297793]: 2025-12-06 10:07:04.256037513 +0000 UTC m=+0.082048953 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:07:04 np0005548788.localdomain podman[297793]: 2025-12-06 10:07:04.268654631 +0000 UTC m=+0.094666081 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:07:04 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:07:07 np0005548788.localdomain podman[297816]: 2025-12-06 10:07:07.261846634 +0000 UTC m=+0.085075205 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible)
Dec 06 10:07:07 np0005548788.localdomain podman[297816]: 2025-12-06 10:07:07.270704296 +0000 UTC m=+0.093932827 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:07:07 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.495 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:07:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:08 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:07:08 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:07:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:07:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:07:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:07:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:07:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:07:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:07:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:07:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:10 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:07:10 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:07:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:11 np0005548788.localdomain sudo[297834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:11 np0005548788.localdomain sudo[297834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:11 np0005548788.localdomain sudo[297834]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:12 np0005548788.localdomain sudo[297852]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:07:12 np0005548788.localdomain sudo[297852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:12 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:07:12 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:07:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:12 np0005548788.localdomain sudo[297852]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:13 np0005548788.localdomain ceph-mon[293643]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:14 np0005548788.localdomain sudo[297903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:07:14 np0005548788.localdomain sudo[297903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:14 np0005548788.localdomain sudo[297903]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 06 10:07:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/325418580' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:07:15 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/325418580' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:07:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:07:16 np0005548788.localdomain podman[297921]: 2025-12-06 10:07:16.262289427 +0000 UTC m=+0.089614076 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 10:07:16 np0005548788.localdomain sudo[297932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:07:16 np0005548788.localdomain sudo[297932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:16 np0005548788.localdomain sudo[297932]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:16 np0005548788.localdomain podman[297921]: 2025-12-06 10:07:16.333704242 +0000 UTC m=+0.161028890 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:07:16 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:07:16 np0005548788.localdomain sudo[297964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:16 np0005548788.localdomain sudo[297964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:16 np0005548788.localdomain sudo[297964]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:16 np0005548788.localdomain sudo[297982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:16 np0005548788.localdomain sudo[297982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='client.44270 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: Reconfig service osd.default_drive_group
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:07:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:17 np0005548788.localdomain podman[298016]: 
Dec 06 10:07:17 np0005548788.localdomain podman[298016]: 2025-12-06 10:07:17.045254301 +0000 UTC m=+0.077151112 container create 2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_wing, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, version=7, name=rhceph, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:07:17 np0005548788.localdomain systemd[1]: Started libpod-conmon-2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e.scope.
Dec 06 10:07:17 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:17 np0005548788.localdomain podman[298016]: 2025-12-06 10:07:17.012089622 +0000 UTC m=+0.043986403 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:17 np0005548788.localdomain podman[298016]: 2025-12-06 10:07:17.116073338 +0000 UTC m=+0.147970089 container init 2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_wing, version=7, io.openshift.expose-services=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, RELEASE=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:07:17 np0005548788.localdomain podman[298016]: 2025-12-06 10:07:17.125603801 +0000 UTC m=+0.157500552 container start 2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_wing, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:07:17 np0005548788.localdomain podman[298016]: 2025-12-06 10:07:17.125986282 +0000 UTC m=+0.157883073 container attach 2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_wing, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, architecture=x86_64, ceph=True, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:07:17 np0005548788.localdomain tender_wing[298032]: 167 167
Dec 06 10:07:17 np0005548788.localdomain systemd[1]: libpod-2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e.scope: Deactivated successfully.
Dec 06 10:07:17 np0005548788.localdomain podman[298016]: 2025-12-06 10:07:17.13045208 +0000 UTC m=+0.162348831 container died 2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_wing, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, architecture=x86_64, release=1763362218, RELEASE=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Dec 06 10:07:17 np0005548788.localdomain podman[298037]: 2025-12-06 10:07:17.226049068 +0000 UTC m=+0.082662142 container remove 2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_wing, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_CLEAN=True, release=1763362218, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:07:17 np0005548788.localdomain systemd[1]: libpod-conmon-2f0b3a4c392b7e2f9e9261ad7ac459976593d412bf051c8c22439376fdd7ba1e.scope: Deactivated successfully.
Dec 06 10:07:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-91b35404396e65995bd7e08ef87483d7623362861bf4e813369d026d008ad40f-merged.mount: Deactivated successfully.
Dec 06 10:07:17 np0005548788.localdomain sudo[297982]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:17 np0005548788.localdomain sudo[298060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:17 np0005548788.localdomain sudo[298060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:17 np0005548788.localdomain sudo[298060]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:17 np0005548788.localdomain sudo[298078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:17 np0005548788.localdomain sudo[298078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/2080000025' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:07:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 e90: 6 total, 6 up, 6 in
Dec 06 10:07:17 np0005548788.localdomain sshd[294412]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:07:17 np0005548788.localdomain systemd-logind[765]: Session 67 logged out. Waiting for processes to exit.
Dec 06 10:07:18 np0005548788.localdomain podman[298112]: 
Dec 06 10:07:18 np0005548788.localdomain podman[298112]: 2025-12-06 10:07:18.11630175 +0000 UTC m=+0.077242016 container create 829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_almeida, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, name=rhceph, release=1763362218, build-date=2025-11-26T19:44:28Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:07:18 np0005548788.localdomain systemd[1]: Started libpod-conmon-829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c.scope.
Dec 06 10:07:18 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:18 np0005548788.localdomain podman[298112]: 2025-12-06 10:07:18.084680637 +0000 UTC m=+0.045620933 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:18 np0005548788.localdomain podman[298112]: 2025-12-06 10:07:18.19312467 +0000 UTC m=+0.154064966 container init 829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_almeida, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, ceph=True, release=1763362218, distribution-scope=public, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, version=7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:07:18 np0005548788.localdomain podman[298112]: 2025-12-06 10:07:18.202833629 +0000 UTC m=+0.163773915 container start 829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_almeida, CEPH_POINT_RELEASE=, release=1763362218, GIT_BRANCH=main, RELEASE=main, version=7, vcs-type=git, build-date=2025-11-26T19:44:28Z, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:07:18 np0005548788.localdomain podman[298112]: 2025-12-06 10:07:18.203625953 +0000 UTC m=+0.164566249 container attach 829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_almeida, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-11-26T19:44:28Z, ceph=True, io.buildah.version=1.41.4, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218)
Dec 06 10:07:18 np0005548788.localdomain trusting_almeida[298127]: 167 167
Dec 06 10:07:18 np0005548788.localdomain systemd[1]: libpod-829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c.scope: Deactivated successfully.
Dec 06 10:07:18 np0005548788.localdomain podman[298112]: 2025-12-06 10:07:18.205865142 +0000 UTC m=+0.166805458 container died 829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_almeida, architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container)
Dec 06 10:07:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5545303f6e68883b9b3a9b8e137faf1509dedd5d1e66675c978a3cc08f69a45a-merged.mount: Deactivated successfully.
Dec 06 10:07:18 np0005548788.localdomain podman[298132]: 2025-12-06 10:07:18.304421371 +0000 UTC m=+0.086362965 container remove 829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_almeida, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1763362218, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph)
Dec 06 10:07:18 np0005548788.localdomain systemd[1]: libpod-conmon-829b7e88cfa596d30887e577c93c5df4cebd158530561fc606536afbd9f12f4c.scope: Deactivated successfully.
Dec 06 10:07:18 np0005548788.localdomain sudo[298078]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:18 np0005548788.localdomain systemd[1]: session-67.scope: Deactivated successfully.
Dec 06 10:07:18 np0005548788.localdomain systemd[1]: session-67.scope: Consumed 22.636s CPU time.
Dec 06 10:07:18 np0005548788.localdomain systemd-logind[765]: Removed session 67.
Dec 06 10:07:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/2080000025' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:07:18 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:07:18 np0005548788.localdomain ceph-mon[293643]: Activating manager daemon np0005548786.mczynb
Dec 06 10:07:18 np0005548788.localdomain ceph-mon[293643]: osdmap e90: 6 total, 6 up, 6 in
Dec 06 10:07:18 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:07:18 np0005548788.localdomain ceph-mon[293643]: mgrmap e24: np0005548786.mczynb(active, starting, since 0.0565768s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:07:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:07:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:07:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:07:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:07:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:07:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18223 "" "Go-http-client/1.1"
Dec 06 10:07:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:07:21 np0005548788.localdomain podman[298155]: 2025-12-06 10:07:21.260320609 +0000 UTC m=+0.085895932 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 06 10:07:21 np0005548788.localdomain podman[298155]: 2025-12-06 10:07:21.271648627 +0000 UTC m=+0.097223980 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 06 10:07:21 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:07:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:07:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:07:22 np0005548788.localdomain podman[298175]: 2025-12-06 10:07:22.249505131 +0000 UTC m=+0.078826054 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:07:22 np0005548788.localdomain podman[298175]: 2025-12-06 10:07:22.264679948 +0000 UTC m=+0.094000901 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:07:22 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:07:22 np0005548788.localdomain podman[298176]: 2025-12-06 10:07:22.356829129 +0000 UTC m=+0.183856871 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Dec 06 10:07:22 np0005548788.localdomain podman[298176]: 2025-12-06 10:07:22.393257879 +0000 UTC m=+0.220285601 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Dec 06 10:07:22 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:07:23 np0005548788.localdomain ceph-mon[293643]: Standby manager daemon np0005548787.umwsra started
Dec 06 10:07:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:24 np0005548788.localdomain ceph-mon[293643]: mgrmap e25: np0005548786.mczynb(active, starting, since 5s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:07:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:07:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3985914868' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:25 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3985914868' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:07:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1488514553' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:27 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1488514553' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: Stopping User Manager for UID 1002...
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Activating special unit Exit the Session...
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Removed slice User Background Tasks Slice.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Stopped target Main User Target.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Stopped target Basic System.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Stopped target Paths.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Stopped target Sockets.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Stopped target Timers.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Closed D-Bus User Message Bus Socket.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Stopped Create User's Volatile Files and Directories.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Removed slice User Application Slice.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Reached target Shutdown.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Finished Exit the Session.
Dec 06 10:07:28 np0005548788.localdomain systemd[26155]: Reached target Exit the Session.
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: user@1002.service: Deactivated successfully.
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: Stopped User Manager for UID 1002.
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: user@1002.service: Consumed 13.907s CPU time, read 0B from disk, written 7.0K to disk.
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1002...
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: run-user-1002.mount: Deactivated successfully.
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: user-runtime-dir@1002.service: Deactivated successfully.
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1002.
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: Removed slice User Slice of UID 1002.
Dec 06 10:07:28 np0005548788.localdomain systemd[1]: user-1002.slice: Consumed 4min 42.441s CPU time.
Dec 06 10:07:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:29.255 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:29.256 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:29.256 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:07:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:30.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:30.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:30.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:31.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:31.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:07:31 np0005548788.localdomain podman[298219]: 2025-12-06 10:07:31.25141217 +0000 UTC m=+0.081786865 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Dec 06 10:07:31 np0005548788.localdomain podman[298219]: 2025-12-06 10:07:31.267649319 +0000 UTC m=+0.098024014 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:07:31 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:07:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:32.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:32.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:07:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:32.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:07:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:32.032 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:07:32 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3750572853' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.033 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.034 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.034 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.034 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.035 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:07:33 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1577274021' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:07:33 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2638595726' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.486 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.703 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.704 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12404MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.705 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.706 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.779 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.779 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:07:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:33.798 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:07:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:34 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2638595726' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:07:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3442107100' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:34.189 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:07:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:34.196 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:07:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:34.214 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:07:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:34.218 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:07:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:07:34.218 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.513s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3442107100' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:07:35 np0005548788.localdomain podman[298282]: 2025-12-06 10:07:35.258562457 +0000 UTC m=+0.084621572 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:07:35 np0005548788.localdomain podman[298282]: 2025-12-06 10:07:35.267269704 +0000 UTC m=+0.093328809 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:07:35 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:07:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:07:38 np0005548788.localdomain podman[298304]: 2025-12-06 10:07:38.254856825 +0000 UTC m=+0.082216327 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:07:38 np0005548788.localdomain podman[298304]: 2025-12-06 10:07:38.290583103 +0000 UTC m=+0.117942555 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 06 10:07:38 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:07:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:07:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:07:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:07:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:07:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:07:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:07:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3222977501' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:07:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3222977501' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:07:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 06 10:07:43 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x5599fb291600 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Dec 06 10:07:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:07:43 np0005548788.localdomain ceph-mon[293643]: paxos.2).electionLogic(52) init, last seen epoch 52
Dec 06 10:07:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:07:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:07:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:07:47 np0005548788.localdomain podman[298322]: 2025-12-06 10:07:47.249953643 +0000 UTC m=+0.076899214 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:07:47 np0005548788.localdomain podman[298322]: 2025-12-06 10:07:47.33118348 +0000 UTC m=+0.158129061 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 06 10:07:47 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:07:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:07:47.429 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:07:47.430 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:07:47.430 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 calling monitor election
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 calling monitor election
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548789 calling monitor election
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548788,np0005548789 in quorum (ranks 0,1,2,3)
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: monmap epoch 12
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:07:43.610976+0000
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: 3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: osdmap e90: 6 total, 6 up, 6 in
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: mgrmap e25: np0005548786.mczynb(active, starting, since 30s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:48 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:07:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:07:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:07:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:07:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:07:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:07:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18221 "" "Go-http-client/1.1"
Dec 06 10:07:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:07:52 np0005548788.localdomain podman[298347]: 2025-12-06 10:07:52.268086072 +0000 UTC m=+0.092363819 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:07:52 np0005548788.localdomain podman[298347]: 2025-12-06 10:07:52.306767892 +0000 UTC m=+0.131045639 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:07:52 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:07:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:07:52 np0005548788.localdomain podman[298366]: 2025-12-06 10:07:52.428555454 +0000 UTC m=+0.079536375 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:07:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:07:52 np0005548788.localdomain podman[298366]: 2025-12-06 10:07:52.444570167 +0000 UTC m=+0.095551158 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:07:52 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:07:52 np0005548788.localdomain podman[298391]: 2025-12-06 10:07:52.518784737 +0000 UTC m=+0.064356899 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:07:52 np0005548788.localdomain podman[298391]: 2025-12-06 10:07:52.559301343 +0000 UTC m=+0.104873525 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:07:52 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:07:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:08:02 np0005548788.localdomain podman[298411]: 2025-12-06 10:08:02.253927351 +0000 UTC m=+0.082096764 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:08:02 np0005548788.localdomain podman[298411]: 2025-12-06 10:08:02.270693937 +0000 UTC m=+0.098863340 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:08:02 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:08:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:08:06 np0005548788.localdomain podman[298430]: 2025-12-06 10:08:06.250275517 +0000 UTC m=+0.078279597 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:08:06 np0005548788.localdomain podman[298430]: 2025-12-06 10:08:06.288275784 +0000 UTC m=+0.116279814 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:08:06 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:08:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e12 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 06 10:08:06 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1889957737' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e91 e91: 6 total, 6 up, 6 in
Dec 06 10:08:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:06 np0005548788.localdomain sshd[298453]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:06 np0005548788.localdomain sshd[298453]: Accepted publickey for ceph-admin from 192.168.122.108 port 50882 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:08:06 np0005548788.localdomain systemd[1]: Created slice User Slice of UID 1002.
Dec 06 10:08:06 np0005548788.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Dec 06 10:08:06 np0005548788.localdomain systemd-logind[765]: New session 68 of user ceph-admin.
Dec 06 10:08:06 np0005548788.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Dec 06 10:08:06 np0005548788.localdomain systemd[1]: Starting User Manager for UID 1002...
Dec 06 10:08:06 np0005548788.localdomain systemd[298457]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Queued start job for default target Main User Target.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Created slice User Application Slice.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Reached target Paths.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Reached target Timers.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Starting D-Bus User Message Bus Socket...
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Starting Create User's Volatile Files and Directories...
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Finished Create User's Volatile Files and Directories.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Listening on D-Bus User Message Bus Socket.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Reached target Sockets.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Reached target Basic System.
Dec 06 10:08:07 np0005548788.localdomain systemd[1]: Started User Manager for UID 1002.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Reached target Main User Target.
Dec 06 10:08:07 np0005548788.localdomain systemd[298457]: Startup finished in 157ms.
Dec 06 10:08:07 np0005548788.localdomain systemd[1]: Started Session 68 of User ceph-admin.
Dec 06 10:08:07 np0005548788.localdomain sshd[298453]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/1889957737' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: Activating manager daemon np0005548790.kvkfyr
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: osdmap e91: 6 total, 6 up, 6 in
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: mgrmap e26: np0005548790.kvkfyr(active, starting, since 0.0602071s), standbys: np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: Manager daemon np0005548790.kvkfyr is now available
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"}]': finished
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"}]': finished
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:08:07 np0005548788.localdomain sudo[298474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:07 np0005548788.localdomain sudo[298474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:07 np0005548788.localdomain sudo[298474]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:07 np0005548788.localdomain sudo[298492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:08:07 np0005548788.localdomain sudo[298492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:08 np0005548788.localdomain ceph-mon[293643]: removing stray HostCache host record np0005548786.localdomain.devices.0
Dec 06 10:08:08 np0005548788.localdomain ceph-mon[293643]: mgrmap e27: np0005548790.kvkfyr(active, since 1.07281s), standbys: np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:08 np0005548788.localdomain podman[298580]: 2025-12-06 10:08:08.2165864 +0000 UTC m=+0.088434869 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, version=7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.buildah.version=1.41.4, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main)
Dec 06 10:08:08 np0005548788.localdomain podman[298580]: 2025-12-06 10:08:08.302364396 +0000 UTC m=+0.174212875 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1763362218, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:08:08 np0005548788.localdomain podman[298612]: 2025-12-06 10:08:08.454115001 +0000 UTC m=+0.084727786 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:08:08 np0005548788.localdomain podman[298612]: 2025-12-06 10:08:08.460132255 +0000 UTC m=+0.090745050 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:08:08 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:08:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:08:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:08:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:08:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:08:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:08:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:08:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:08:08 np0005548788.localdomain sudo[298492]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:09 np0005548788.localdomain sudo[298715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:09 np0005548788.localdomain sudo[298715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:09 np0005548788.localdomain sudo[298715]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:09 np0005548788.localdomain sudo[298733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:08:09 np0005548788.localdomain sudo[298733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='client.26943 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:07] ENGINE Bus STARTING
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:07] ENGINE Serving on https://172.18.0.108:7150
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:07] ENGINE Client ('172.18.0.108', 58740) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:07] ENGINE Serving on http://172.18.0.108:8765
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:07] ENGINE Bus STARTED
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='client.44342 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: Cluster is now healthy
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548788.localdomain sudo[298733]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:09 np0005548788.localdomain sudo[298783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:09 np0005548788.localdomain sudo[298783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:09 np0005548788.localdomain sudo[298783]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548788.localdomain sudo[298801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:08:10 np0005548788.localdomain sudo[298801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548788.localdomain ceph-mon[293643]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:10 np0005548788.localdomain sudo[298801]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548788.localdomain sudo[298839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:10 np0005548788.localdomain sudo[298839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548788.localdomain sudo[298839]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548788.localdomain sudo[298857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:10 np0005548788.localdomain sudo[298857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548788.localdomain sudo[298857]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548788.localdomain sudo[298875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:10 np0005548788.localdomain sudo[298875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548788.localdomain sudo[298875]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548788.localdomain sudo[298893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:10 np0005548788.localdomain sudo[298893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548788.localdomain sudo[298893]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548788.localdomain sudo[298911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:10 np0005548788.localdomain sudo[298911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[298911]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[298945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:11 np0005548788.localdomain sudo[298945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[298945]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[298963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:11 np0005548788.localdomain sudo[298963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[298963]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: mgrmap e28: np0005548790.kvkfyr(active, since 3s), standbys: np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='client.54127 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Saving service mon spec with placement label:mon
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:11 np0005548788.localdomain sudo[298981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548788.localdomain sudo[298981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[298981]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[298999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:11 np0005548788.localdomain sudo[298999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[298999]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[299017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:11 np0005548788.localdomain sudo[299017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[299017]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[299035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:11 np0005548788.localdomain sudo[299035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[299035]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[299053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:11 np0005548788.localdomain sudo[299053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[299053]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:11 np0005548788.localdomain sudo[299071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:11 np0005548788.localdomain sudo[299071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[299071]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[299105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:11 np0005548788.localdomain sudo[299105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[299105]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[299123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:11 np0005548788.localdomain sudo[299123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548788.localdomain sudo[299123]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548788.localdomain sudo[299141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:11 np0005548788.localdomain sudo[299141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299141]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:12 np0005548788.localdomain sudo[299159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299159]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:12 np0005548788.localdomain sudo[299177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299177]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548788.localdomain sudo[299195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299195]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:12 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:12 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:12 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548788.localdomain ceph-mon[293643]: from='client.44354 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548789", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:12 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548788.localdomain sudo[299213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:12 np0005548788.localdomain sudo[299213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299213]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548788.localdomain sudo[299231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299231]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548788.localdomain sudo[299265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299265]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548788.localdomain sudo[299283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299283]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548788.localdomain sudo[299301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299301]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:12 np0005548788.localdomain sudo[299319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299319]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:12 np0005548788.localdomain sudo[299337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299337]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548788.localdomain sudo[299355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548788.localdomain sudo[299355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548788.localdomain sudo[299355]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548788.localdomain sudo[299373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:13 np0005548788.localdomain sudo[299373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548788.localdomain sudo[299373]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548788.localdomain sudo[299391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:13 np0005548788.localdomain sudo[299391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548788.localdomain sudo[299391]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548788.localdomain sudo[299425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:13 np0005548788.localdomain sudo[299425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548788.localdomain sudo[299425]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548788.localdomain sudo[299443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:13 np0005548788.localdomain sudo[299443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548788.localdomain sudo[299443]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548788.localdomain sudo[299461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:13 np0005548788.localdomain sudo[299461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548788.localdomain sudo[299461]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548788.localdomain sudo[299479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:13 np0005548788.localdomain sudo[299479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548788.localdomain sudo[299479]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e12 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 06 10:08:13 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/3734042444' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/3734042444' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 0 B/s wr, 19 op/s
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:16 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:08:16 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:08:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:16 np0005548788.localdomain sudo[299497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:16 np0005548788.localdomain sudo[299497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:16 np0005548788.localdomain sudo[299497]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:16 np0005548788.localdomain sudo[299515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:16 np0005548788.localdomain sudo[299515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:16 np0005548788.localdomain podman[299551]: 
Dec 06 10:08:16 np0005548788.localdomain podman[299551]: 2025-12-06 10:08:16.92314632 +0000 UTC m=+0.061881613 container create 47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_feynman, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7)
Dec 06 10:08:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790.scope.
Dec 06 10:08:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:16 np0005548788.localdomain podman[299551]: 2025-12-06 10:08:16.89388032 +0000 UTC m=+0.032615653 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:17 np0005548788.localdomain podman[299551]: 2025-12-06 10:08:17.002058615 +0000 UTC m=+0.140793908 container init 47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_feynman, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, vcs-type=git)
Dec 06 10:08:17 np0005548788.localdomain podman[299551]: 2025-12-06 10:08:17.012893278 +0000 UTC m=+0.151628581 container start 47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_feynman, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, release=1763362218, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:08:17 np0005548788.localdomain podman[299551]: 2025-12-06 10:08:17.013104785 +0000 UTC m=+0.151840078 container attach 47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_feynman, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:17 np0005548788.localdomain suspicious_feynman[299566]: 167 167
Dec 06 10:08:17 np0005548788.localdomain systemd[1]: libpod-47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790.scope: Deactivated successfully.
Dec 06 10:08:17 np0005548788.localdomain podman[299551]: 2025-12-06 10:08:17.01784243 +0000 UTC m=+0.156577753 container died 47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_feynman, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:08:17 np0005548788.localdomain podman[299571]: 2025-12-06 10:08:17.105381201 +0000 UTC m=+0.079995720 container remove 47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_feynman, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, ceph=True, version=7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:08:17 np0005548788.localdomain systemd[1]: libpod-conmon-47c61329034541b2357679ed846ebe88f82da0516ba747d587324214122a0790.scope: Deactivated successfully.
Dec 06 10:08:17 np0005548788.localdomain sudo[299515]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:08:17 np0005548788.localdomain sudo[299588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:17 np0005548788.localdomain sudo[299588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:17 np0005548788.localdomain sudo[299588]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:17 np0005548788.localdomain sudo[299606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:17 np0005548788.localdomain sudo[299606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:08:17 np0005548788.localdomain podman[299624]: 2025-12-06 10:08:17.47268169 +0000 UTC m=+0.082303032 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:08:17 np0005548788.localdomain podman[299624]: 2025-12-06 10:08:17.538864653 +0000 UTC m=+0.148485975 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 06 10:08:17 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:08:17 np0005548788.localdomain podman[299663]: 
Dec 06 10:08:17 np0005548788.localdomain podman[299663]: 2025-12-06 10:08:17.825957207 +0000 UTC m=+0.078724230 container create 85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_volhard, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1763362218, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, build-date=2025-11-26T19:44:28Z, ceph=True, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:08:17 np0005548788.localdomain systemd[1]: Started libpod-conmon-85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed.scope.
Dec 06 10:08:17 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:17 np0005548788.localdomain podman[299663]: 2025-12-06 10:08:17.792102447 +0000 UTC m=+0.044869510 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:17 np0005548788.localdomain podman[299663]: 2025-12-06 10:08:17.897820386 +0000 UTC m=+0.150587399 container init 85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_volhard, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph)
Dec 06 10:08:17 np0005548788.localdomain podman[299663]: 2025-12-06 10:08:17.906807902 +0000 UTC m=+0.159574915 container start 85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_volhard, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.openshift.expose-services=, version=7, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, io.openshift.tags=rhceph ceph)
Dec 06 10:08:17 np0005548788.localdomain podman[299663]: 2025-12-06 10:08:17.907008058 +0000 UTC m=+0.159775081 container attach 85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_volhard, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True)
Dec 06 10:08:17 np0005548788.localdomain ecstatic_volhard[299678]: 167 167
Dec 06 10:08:17 np0005548788.localdomain systemd[1]: libpod-85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed.scope: Deactivated successfully.
Dec 06 10:08:17 np0005548788.localdomain podman[299663]: 2025-12-06 10:08:17.911823427 +0000 UTC m=+0.164590490 container died 85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_volhard, vcs-type=git, release=1763362218, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, io.buildah.version=1.41.4, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b99574702bbe39f9c43d8d6a3590d34751c3b59175dbe98351c8ea46d84d0430-merged.mount: Deactivated successfully.
Dec 06 10:08:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-385c1f9dcad5cfe11e3fbb60b7516ec08194e877d5e4d1292665762ab2a97763-merged.mount: Deactivated successfully.
Dec 06 10:08:18 np0005548788.localdomain podman[299683]: 2025-12-06 10:08:18.011971524 +0000 UTC m=+0.088699427 container remove 85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_volhard, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, ceph=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:18 np0005548788.localdomain systemd[1]: libpod-conmon-85464417100cd76788c46a3d10b319a53047925b73fc48686cec008841bb35ed.scope: Deactivated successfully.
Dec 06 10:08:18 np0005548788.localdomain sudo[299606]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:18 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:08:18 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:08:18 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:08:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:18 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:08:18 np0005548788.localdomain sudo[299706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:18 np0005548788.localdomain sudo[299706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:18 np0005548788.localdomain sudo[299706]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:18 np0005548788.localdomain sudo[299724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:18 np0005548788.localdomain sudo[299724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:18 np0005548788.localdomain podman[299759]: 
Dec 06 10:08:18 np0005548788.localdomain podman[299759]: 2025-12-06 10:08:18.891159875 +0000 UTC m=+0.090777640 container create 093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hopper, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, RELEASE=main, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public)
Dec 06 10:08:18 np0005548788.localdomain systemd[1]: Started libpod-conmon-093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06.scope.
Dec 06 10:08:18 np0005548788.localdomain podman[299759]: 2025-12-06 10:08:18.849716551 +0000 UTC m=+0.049334326 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:18 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:18 np0005548788.localdomain podman[299759]: 2025-12-06 10:08:18.962693444 +0000 UTC m=+0.162311189 container init 093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hopper, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, distribution-scope=public, architecture=x86_64, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:08:18 np0005548788.localdomain systemd[1]: tmp-crun.aahlON.mount: Deactivated successfully.
Dec 06 10:08:18 np0005548788.localdomain podman[299759]: 2025-12-06 10:08:18.978905713 +0000 UTC m=+0.178523448 container start 093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hopper, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:08:18 np0005548788.localdomain podman[299759]: 2025-12-06 10:08:18.979169331 +0000 UTC m=+0.178787106 container attach 093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hopper, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, release=1763362218)
Dec 06 10:08:18 np0005548788.localdomain strange_hopper[299776]: 167 167
Dec 06 10:08:18 np0005548788.localdomain systemd[1]: libpod-093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06.scope: Deactivated successfully.
Dec 06 10:08:18 np0005548788.localdomain podman[299759]: 2025-12-06 10:08:18.982700069 +0000 UTC m=+0.182317804 container died 093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hopper, RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-type=git, com.redhat.component=rhceph-container, release=1763362218, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc.)
Dec 06 10:08:19 np0005548788.localdomain podman[299781]: 2025-12-06 10:08:19.069353302 +0000 UTC m=+0.078227705 container remove 093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_hopper, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1763362218, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, architecture=x86_64, RELEASE=main)
Dec 06 10:08:19 np0005548788.localdomain systemd[1]: libpod-conmon-093bc7631d400377ed3da41c8ae7ea8493510ec15af483183a4f8fc4c94f1a06.scope: Deactivated successfully.
Dec 06 10:08:19 np0005548788.localdomain sudo[299724]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:19 np0005548788.localdomain ceph-mon[293643]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:08:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:19 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:08:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:08:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:19 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:08:19 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/327302380' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 06 10:08:19 np0005548788.localdomain sudo[299804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:19 np0005548788.localdomain sudo[299804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:19 np0005548788.localdomain sudo[299804]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:19 np0005548788.localdomain sudo[299822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:19 np0005548788.localdomain sudo[299822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:08:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:08:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:08:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:08:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:08:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18216 "" "Go-http-client/1.1"
Dec 06 10:08:19 np0005548788.localdomain podman[299856]: 
Dec 06 10:08:19 np0005548788.localdomain podman[299856]: 2025-12-06 10:08:19.943019984 +0000 UTC m=+0.090197864 container create bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_boyd, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, release=1763362218, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, build-date=2025-11-26T19:44:28Z, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f0e954b230e332bc4f9f815fe43f5b0dc7ef6adf29e93a67d0ae2a5dc6843a83-merged.mount: Deactivated successfully.
Dec 06 10:08:19 np0005548788.localdomain systemd[1]: Started libpod-conmon-bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6.scope.
Dec 06 10:08:19 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:19 np0005548788.localdomain podman[299856]: 2025-12-06 10:08:19.898545667 +0000 UTC m=+0.045723537 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:20 np0005548788.localdomain podman[299856]: 2025-12-06 10:08:20.011991103 +0000 UTC m=+0.159168983 container init bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_boyd, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.component=rhceph-container)
Dec 06 10:08:20 np0005548788.localdomain podman[299856]: 2025-12-06 10:08:20.027445259 +0000 UTC m=+0.174623129 container start bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_boyd, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_CLEAN=True)
Dec 06 10:08:20 np0005548788.localdomain podman[299856]: 2025-12-06 10:08:20.027784829 +0000 UTC m=+0.174962709 container attach bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_boyd, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, ceph=True, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=)
Dec 06 10:08:20 np0005548788.localdomain xenodochial_boyd[299872]: 167 167
Dec 06 10:08:20 np0005548788.localdomain systemd[1]: libpod-bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6.scope: Deactivated successfully.
Dec 06 10:08:20 np0005548788.localdomain podman[299856]: 2025-12-06 10:08:20.03207492 +0000 UTC m=+0.179252800 container died bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_boyd, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, version=7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main, release=1763362218)
Dec 06 10:08:20 np0005548788.localdomain podman[299877]: 2025-12-06 10:08:20.126134612 +0000 UTC m=+0.085808639 container remove bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_boyd, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, release=1763362218, RELEASE=main, io.openshift.expose-services=)
Dec 06 10:08:20 np0005548788.localdomain systemd[1]: libpod-conmon-bb9eade4e8fdfbfb28857d728b26838b56c2fc9aa640764ac4103a8f98cd6df6.scope: Deactivated successfully.
Dec 06 10:08:20 np0005548788.localdomain sudo[299822]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:20 np0005548788.localdomain sudo[299893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:20 np0005548788.localdomain sudo[299893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:20 np0005548788.localdomain sudo[299893]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:20 np0005548788.localdomain sudo[299911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:20 np0005548788.localdomain sudo[299911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e12 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/2304971504' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e92 e92: 6 total, 6 up, 6 in
Dec 06 10:08:20 np0005548788.localdomain sshd[298453]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:08:20 np0005548788.localdomain systemd-logind[765]: Session 68 logged out. Waiting for processes to exit.
Dec 06 10:08:20 np0005548788.localdomain sshd[299942]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:20 np0005548788.localdomain sshd[299942]: Accepted publickey for ceph-admin from 192.168.122.107 port 47232 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:08:20 np0005548788.localdomain systemd-logind[765]: New session 70 of user ceph-admin.
Dec 06 10:08:20 np0005548788.localdomain podman[299948]: 
Dec 06 10:08:20 np0005548788.localdomain systemd[1]: Started Session 70 of User ceph-admin.
Dec 06 10:08:20 np0005548788.localdomain sshd[299942]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:08:20 np0005548788.localdomain podman[299948]: 2025-12-06 10:08:20.918478314 +0000 UTC m=+0.073630104 container create ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_archimedes, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e6c9640532939ac29f6ab20f3b251b921c8ba4e98e312ca211badcc35417350e-merged.mount: Deactivated successfully.
Dec 06 10:08:20 np0005548788.localdomain systemd[1]: Started libpod-conmon-ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e.scope.
Dec 06 10:08:20 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:20 np0005548788.localdomain podman[299948]: 2025-12-06 10:08:20.883131667 +0000 UTC m=+0.038283467 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:20 np0005548788.localdomain podman[299948]: 2025-12-06 10:08:20.991128887 +0000 UTC m=+0.146280677 container init ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_archimedes, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1763362218, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:08:21 np0005548788.localdomain podman[299948]: 2025-12-06 10:08:21.003576909 +0000 UTC m=+0.158728699 container start ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_archimedes, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:08:21 np0005548788.localdomain podman[299948]: 2025-12-06 10:08:21.004090365 +0000 UTC m=+0.159242205 container attach ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_archimedes, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.41.4, version=7, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:21 np0005548788.localdomain infallible_archimedes[299966]: 167 167
Dec 06 10:08:21 np0005548788.localdomain systemd[1]: libpod-ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e.scope: Deactivated successfully.
Dec 06 10:08:21 np0005548788.localdomain podman[299948]: 2025-12-06 10:08:21.00914717 +0000 UTC m=+0.164298990 container died ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_archimedes, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:21 np0005548788.localdomain sudo[299969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:21 np0005548788.localdomain sudo[299969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:21 np0005548788.localdomain sudo[299969]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:21 np0005548788.localdomain podman[299987]: 2025-12-06 10:08:21.105794801 +0000 UTC m=+0.084606381 container remove ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_archimedes, name=rhceph, RELEASE=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Dec 06 10:08:21 np0005548788.localdomain systemd[1]: libpod-conmon-ba9e38e6b892bc96a0ab46a48b2f515e593678532f4c3137dbfe5ee6c7996d2e.scope: Deactivated successfully.
Dec 06 10:08:21 np0005548788.localdomain sudo[300001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:08:21 np0005548788.localdomain sudo[300001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:21 np0005548788.localdomain sudo[299911]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:21 np0005548788.localdomain systemd[1]: session-68.scope: Deactivated successfully.
Dec 06 10:08:21 np0005548788.localdomain systemd[1]: session-68.scope: Consumed 9.996s CPU time.
Dec 06 10:08:21 np0005548788.localdomain systemd-logind[765]: Removed session 68.
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/2304971504' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: Activating manager daemon np0005548789.mzhmje
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: mgrmap e29: np0005548789.mzhmje(active, starting, since 0.0455985s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: Manager daemon np0005548789.mzhmje is now available
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/trash_purge_schedule"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/trash_purge_schedule"} : dispatch
Dec 06 10:08:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c46d32e50e690193029191c8e49dd7de2fec49790d7ba074d8dbd85559442687-merged.mount: Deactivated successfully.
Dec 06 10:08:21 np0005548788.localdomain podman[300092]: 2025-12-06 10:08:21.974621354 +0000 UTC m=+0.080806185 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, version=7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1763362218, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:08:22 np0005548788.localdomain podman[300092]: 2025-12-06 10:08:22.087601656 +0000 UTC m=+0.193786487 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, version=7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-26T19:44:28Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, release=1763362218, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:08:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:08:22 np0005548788.localdomain ceph-mon[293643]: mgrmap e30: np0005548789.mzhmje(active, since 1.06844s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:22 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:21] ENGINE Bus STARTING
Dec 06 10:08:22 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:21] ENGINE Serving on https://172.18.0.107:7150
Dec 06 10:08:22 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:21] ENGINE Client ('172.18.0.107', 48298) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:08:22 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:21] ENGINE Serving on http://172.18.0.107:8765
Dec 06 10:08:22 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:08:21] ENGINE Bus STARTED
Dec 06 10:08:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:08:22 np0005548788.localdomain podman[300180]: 2025-12-06 10:08:22.518252062 +0000 UTC m=+0.089713608 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:08:22 np0005548788.localdomain podman[300180]: 2025-12-06 10:08:22.53055162 +0000 UTC m=+0.102013106 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:08:22 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:08:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:08:22 np0005548788.localdomain podman[300212]: 2025-12-06 10:08:22.626084526 +0000 UTC m=+0.098000473 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:08:22 np0005548788.localdomain podman[300212]: 2025-12-06 10:08:22.66753702 +0000 UTC m=+0.139452897 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:08:22 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:08:22 np0005548788.localdomain sudo[300001]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:22 np0005548788.localdomain podman[300244]: 2025-12-06 10:08:22.725953395 +0000 UTC m=+0.089143201 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc.)
Dec 06 10:08:22 np0005548788.localdomain podman[300244]: 2025-12-06 10:08:22.764527831 +0000 UTC m=+0.127717637 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Dec 06 10:08:22 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:08:22 np0005548788.localdomain sudo[300272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:22 np0005548788.localdomain sudo[300272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:22 np0005548788.localdomain sudo[300272]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:22 np0005548788.localdomain sudo[300290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:08:22 np0005548788.localdomain sudo[300290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:22 np0005548788.localdomain systemd[1]: tmp-crun.YzNvMr.mount: Deactivated successfully.
Dec 06 10:08:23 np0005548788.localdomain sudo[300290]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: Cluster is now healthy
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548788.localdomain sudo[300340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:23 np0005548788.localdomain sudo[300340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:23 np0005548788.localdomain sudo[300340]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:23 np0005548788.localdomain sudo[300358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:08:23 np0005548788.localdomain sudo[300358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548788.localdomain sudo[300358]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548788.localdomain sudo[300394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:24 np0005548788.localdomain sudo[300394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548788.localdomain sudo[300394]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548788.localdomain sudo[300412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:24 np0005548788.localdomain sudo[300412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548788.localdomain sudo[300412]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548788.localdomain sudo[300430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:24 np0005548788.localdomain sudo[300430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548788.localdomain sudo[300430]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548788.localdomain sudo[300448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:24 np0005548788.localdomain sudo[300448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548788.localdomain sudo[300448]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548788.localdomain sudo[300466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:24 np0005548788.localdomain sudo[300466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548788.localdomain sudo[300466]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548788.localdomain sudo[300500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:24 np0005548788.localdomain sudo[300500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548788.localdomain sudo[300500]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548788.localdomain sudo[300518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:24 np0005548788.localdomain sudo[300518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548788.localdomain sudo[300518]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: mgrmap e31: np0005548789.mzhmje(active, since 3s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:25 np0005548788.localdomain sudo[300536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain sudo[300536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300536]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300554]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:25 np0005548788.localdomain sudo[300554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300554]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:25 np0005548788.localdomain sudo[300572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300572]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:25 np0005548788.localdomain sudo[300590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300590]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:25 np0005548788.localdomain sudo[300608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300608]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:25 np0005548788.localdomain sudo[300626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300626]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:25 np0005548788.localdomain sudo[300660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300660]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:25 np0005548788.localdomain sudo[300678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300678]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain sudo[300696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300696]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:25 np0005548788.localdomain sudo[300714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300714]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:25 np0005548788.localdomain sudo[300732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300732]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain sudo[300750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:25 np0005548788.localdomain sudo[300750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548788.localdomain sudo[300750]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Standby manager daemon np0005548790.kvkfyr started
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548788.localdomain sudo[300768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:26 np0005548788.localdomain sudo[300768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300768]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548788.localdomain sudo[300786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300786]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548788.localdomain sudo[300820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300820]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548788.localdomain sudo[300838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300838]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548788.localdomain sudo[300856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300856]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:26 np0005548788.localdomain sudo[300874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300874]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:26 np0005548788.localdomain sudo[300892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300892]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548788.localdomain sudo[300910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300910]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:26 np0005548788.localdomain sudo[300928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:26 np0005548788.localdomain sudo[300928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300928]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548788.localdomain sudo[300946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300946]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548788.localdomain sudo[300980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300980]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548788.localdomain sudo[300998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548788.localdomain sudo[300998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548788.localdomain sudo[300998]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:27 np0005548788.localdomain sudo[301016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548788.localdomain sudo[301016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:27 np0005548788.localdomain sudo[301016]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: mgrmap e32: np0005548789.mzhmje(active, since 5s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 23 op/s
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1224196971' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:27 np0005548788.localdomain sudo[301034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:27 np0005548788.localdomain sudo[301034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:27 np0005548788.localdomain sudo[301034]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:27 np0005548788.localdomain sudo[301052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:27 np0005548788.localdomain sudo[301052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:27 np0005548788.localdomain sudo[301052]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:27 np0005548788.localdomain sudo[301070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:27 np0005548788.localdomain sudo[301070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:28 np0005548788.localdomain podman[301104]: 
Dec 06 10:08:28 np0005548788.localdomain podman[301104]: 2025-12-06 10:08:28.231164344 +0000 UTC m=+0.077486981 container create 1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wilson, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, version=7, release=1763362218, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:08:28 np0005548788.localdomain systemd[1]: Started libpod-conmon-1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60.scope.
Dec 06 10:08:28 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:28 np0005548788.localdomain podman[301104]: 2025-12-06 10:08:28.19945845 +0000 UTC m=+0.045781127 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:28 np0005548788.localdomain podman[301104]: 2025-12-06 10:08:28.302233309 +0000 UTC m=+0.148555946 container init 1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wilson, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=1763362218, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=7, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:08:28 np0005548788.localdomain podman[301104]: 2025-12-06 10:08:28.314038242 +0000 UTC m=+0.160360869 container start 1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wilson, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1763362218, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 06 10:08:28 np0005548788.localdomain podman[301104]: 2025-12-06 10:08:28.314271309 +0000 UTC m=+0.160593966 container attach 1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wilson, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, name=rhceph, ceph=True, com.redhat.component=rhceph-container, release=1763362218, build-date=2025-11-26T19:44:28Z)
Dec 06 10:08:28 np0005548788.localdomain busy_wilson[301119]: 167 167
Dec 06 10:08:28 np0005548788.localdomain systemd[1]: libpod-1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60.scope: Deactivated successfully.
Dec 06 10:08:28 np0005548788.localdomain podman[301104]: 2025-12-06 10:08:28.31855109 +0000 UTC m=+0.164873777 container died 1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wilson, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, architecture=x86_64, version=7, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:08:28 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3249850813' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:28 np0005548788.localdomain podman[301124]: 2025-12-06 10:08:28.416601833 +0000 UTC m=+0.089308674 container remove 1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_wilson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, version=7, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7)
Dec 06 10:08:28 np0005548788.localdomain systemd[1]: libpod-conmon-1dea58a021a70ac34a18636ea53d05bcba4263ca146bb01b64a3e6b6f4825f60.scope: Deactivated successfully.
Dec 06 10:08:28 np0005548788.localdomain sudo[301070]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:28 np0005548788.localdomain sudo[301142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:28 np0005548788.localdomain sudo[301142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:28 np0005548788.localdomain sudo[301142]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:28 np0005548788.localdomain sudo[301160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:28 np0005548788.localdomain sudo[301160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:29 np0005548788.localdomain podman[301195]: 
Dec 06 10:08:29 np0005548788.localdomain podman[301195]: 2025-12-06 10:08:29.149078696 +0000 UTC m=+0.074399588 container create 43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_nash, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, version=7, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:29 np0005548788.localdomain systemd[1]: Started libpod-conmon-43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a.scope.
Dec 06 10:08:29 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:29 np0005548788.localdomain podman[301195]: 2025-12-06 10:08:29.216136057 +0000 UTC m=+0.141456959 container init 43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_nash, io.openshift.expose-services=, ceph=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, release=1763362218, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:08:29 np0005548788.localdomain podman[301195]: 2025-12-06 10:08:29.119753385 +0000 UTC m=+0.045074317 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:29 np0005548788.localdomain podman[301195]: 2025-12-06 10:08:29.225172145 +0000 UTC m=+0.150493037 container start 43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_nash, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1763362218, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7)
Dec 06 10:08:29 np0005548788.localdomain podman[301195]: 2025-12-06 10:08:29.225457354 +0000 UTC m=+0.150778256 container attach 43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_nash, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, release=1763362218, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_CLEAN=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:29 np0005548788.localdomain distracted_nash[301210]: 167 167
Dec 06 10:08:29 np0005548788.localdomain systemd[1]: libpod-43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a.scope: Deactivated successfully.
Dec 06 10:08:29 np0005548788.localdomain podman[301195]: 2025-12-06 10:08:29.228354933 +0000 UTC m=+0.153675915 container died 43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_nash, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1763362218, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c7b4bad32c8ef2b15d85ba3dc17197033ec0761deca5142ae44ec07cee198a08-merged.mount: Deactivated successfully.
Dec 06 10:08:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-7d5d208cf8469cb094cf6e741ef719d9dc9546a27b4ac95f6766f49441a8b90b-merged.mount: Deactivated successfully.
Dec 06 10:08:29 np0005548788.localdomain podman[301215]: 2025-12-06 10:08:29.328441779 +0000 UTC m=+0.090968437 container remove 43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_nash, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, ceph=True)
Dec 06 10:08:29 np0005548788.localdomain systemd[1]: libpod-conmon-43cdfe1ad3f99ec3c8fc44e42121bdba1bf540219dbec1d9fa35e2014409167a.scope: Deactivated successfully.
Dec 06 10:08:29 np0005548788.localdomain sudo[301160]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.008 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.008 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.009 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.009 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.028 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.029 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:30.029 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:08:30 np0005548788.localdomain ceph-mon[293643]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:08:30 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:08:30 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:08:30 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:30 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:30 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:08:30 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:31.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:32.017 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:32.017 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:32.018 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:08:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:33.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:33.019 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:33 np0005548788.localdomain podman[301233]: 2025-12-06 10:08:33.019029027 +0000 UTC m=+0.088794409 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec 06 10:08:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:33.019 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:08:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:33.020 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:08:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:33.032 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:08:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:33.032 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:33 np0005548788.localdomain podman[301233]: 2025-12-06 10:08:33.057039356 +0000 UTC m=+0.126804788 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:08:33 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: from='client.44410 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.025 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.026 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.026 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.026 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.027 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3333674039' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e12 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:08:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/109628701' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.792 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.765s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.979 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.981 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12380MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.982 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:34.982 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.114 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.115 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.202 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.264 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.265 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.285 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.337 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE4A,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.357 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: from='client.44426 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548787", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/109628701' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2265627899' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e12 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:08:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2689790601' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.799 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.805 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.825 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.828 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:08:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:08:35.829 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:36 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x5599fb291600 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Dec 06 10:08:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@2(peon) e13  my rank is now 1 (was 2)
Dec 06 10:08:36 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:08:36 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:08:36 np0005548788.localdomain ceph-mgr[286998]: --2- 172.18.0.106:0/3380714700 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x559a04e5c800 0x559a04eaeb00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Dec 06 10:08:36 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 06 10:08:36 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Dec 06 10:08:36 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x5599fc03a000 mon_map magic: 0 from mon.1 v2:172.18.0.103:3300/0
Dec 06 10:08:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:08:37 np0005548788.localdomain systemd[1]: tmp-crun.avJaN5.mount: Deactivated successfully.
Dec 06 10:08:37 np0005548788.localdomain podman[301296]: 2025-12-06 10:08:37.283755111 +0000 UTC m=+0.104854834 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:08:37 np0005548788.localdomain podman[301296]: 2025-12-06 10:08:37.298635738 +0000 UTC m=+0.119735451 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:08:37 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:08:38 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:08:38 np0005548788.localdomain ceph-mon[293643]: paxos.1).electionLogic(56) init, last seen epoch 56
Dec 06 10:08:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:08:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:08:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:08:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:08:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:08:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:08:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:08:39 np0005548788.localdomain podman[301319]: 2025-12-06 10:08:39.259444952 +0000 UTC m=+0.084159507 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:08:39 np0005548788.localdomain podman[301319]: 2025-12-06 10:08:39.264189138 +0000 UTC m=+0.088903683 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:08:39 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:08:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:43 np0005548788.localdomain ceph-mds[285743]: mds.beacon.mds.np0005548788.erzujf missed beacon ack from the monitors
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: paxos.1).electionLogic(57) init, last seen epoch 57, mid-election, bumping
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.367620) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015723367679, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2907, "num_deletes": 256, "total_data_size": 11893605, "memory_usage": 12632632, "flush_reason": "Manual Compaction"}
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='client.34469 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548787"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Remove daemons mon.np0005548787
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Safe to remove mon.np0005548787: new quorum should be ['np0005548790', 'np0005548788', 'np0005548789'] (from ['np0005548790', 'np0005548788', 'np0005548789'])
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Removing monitor np0005548787 from monmap...
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon rm", "name": "np0005548787"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Removing daemon mon.np0005548787 from np0005548787.localdomain -- ports []
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548789 calling monitor election
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 is new leader, mons np0005548790,np0005548789 in quorum (ranks 0,2)
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: monmap epoch 13
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:08:36.308855+0000
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mgrmap e32: np0005548789.mzhmje(active, since 20s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1/3 mons down, quorum np0005548790,np0005548789 (MON_DOWN)
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005548790,np0005548789
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005548790,np0005548789
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]:     mon.np0005548788 (rank 1) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2722608319' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2722608319' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(peon) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015723402759, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 7299560, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14539, "largest_seqno": 17441, "table_properties": {"data_size": 7287588, "index_size": 7453, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3461, "raw_key_size": 31474, "raw_average_key_size": 22, "raw_value_size": 7261119, "raw_average_value_size": 5257, "num_data_blocks": 321, "num_entries": 1381, "num_filter_entries": 1381, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 1765015623, "file_creation_time": 1765015723, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 35203 microseconds, and 14400 cpu microseconds.
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.402816) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 7299560 bytes OK
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.402845) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.404682) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.404708) EVENT_LOG_v1 {"time_micros": 1765015723404702, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.404732) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 11879348, prev total WAL file size 11882440, number of live WAL files 2.
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.407384) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(7128KB)], [24(12MB)]
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015723407649, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 20722984, "oldest_snapshot_seqno": -1}
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11133 keys, 17497576 bytes, temperature: kUnknown
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015723520700, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 17497576, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17432458, "index_size": 36217, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 297927, "raw_average_key_size": 26, "raw_value_size": 17241014, "raw_average_value_size": 1548, "num_data_blocks": 1389, "num_entries": 11133, "num_filter_entries": 11133, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015723, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.521094) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 17497576 bytes
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.522718) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.1 rd, 154.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(7.0, 12.8 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(5.2) write-amplify(2.4) OK, records in: 11684, records dropped: 551 output_compression: NoCompression
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.522751) EVENT_LOG_v1 {"time_micros": 1765015723522735, "job": 12, "event": "compaction_finished", "compaction_time_micros": 113154, "compaction_time_cpu_micros": 50624, "output_level": 6, "num_output_files": 1, "total_output_size": 17497576, "num_input_records": 11684, "num_output_records": 11133, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015723524340, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015723526773, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.407226) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.526877) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.526887) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.526891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.526895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:43.526899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2689790601' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 calling monitor election
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 is new leader, mons np0005548790,np0005548788,np0005548789 in quorum (ranks 0,1,2)
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: monmap epoch 13
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:08:36.308855+0000
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: mgrmap e32: np0005548789.mzhmje(active, since 22s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005548790,np0005548789)
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:45 np0005548788.localdomain ceph-mon[293643]: from='client.44452 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548787.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:45 np0005548788.localdomain ceph-mon[293643]: Removed label mgr from host np0005548787.localdomain
Dec 06 10:08:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:08:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:08:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:08:47.430 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:08:47.431 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:08:47.431 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:08:48 np0005548788.localdomain podman[301338]: 2025-12-06 10:08:48.26329834 +0000 UTC m=+0.086920073 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:08:48 np0005548788.localdomain podman[301338]: 2025-12-06 10:08:48.303462474 +0000 UTC m=+0.127084207 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:08:48 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:08:48 np0005548788.localdomain ceph-mon[293643]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:48 np0005548788.localdomain ceph-mon[293643]: from='client.44458 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548787.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:48 np0005548788.localdomain ceph-mon[293643]: Removed label _admin from host np0005548787.localdomain
Dec 06 10:08:48 np0005548788.localdomain sudo[301364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:48 np0005548788.localdomain sudo[301364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:48 np0005548788.localdomain sudo[301364]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:48 np0005548788.localdomain sudo[301382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:48 np0005548788.localdomain sudo[301382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:48 np0005548788.localdomain sudo[301382]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:48 np0005548788.localdomain sudo[301400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:48 np0005548788.localdomain sudo[301400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:48 np0005548788.localdomain sudo[301400]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:48 np0005548788.localdomain sudo[301418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:49 np0005548788.localdomain sudo[301418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301418]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain sudo[301436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:49 np0005548788.localdomain sudo[301436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301436]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain sudo[301470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:49 np0005548788.localdomain sudo[301470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301470]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain sudo[301488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:49 np0005548788.localdomain sudo[301488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301488]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain sudo[301506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:08:49 np0005548788.localdomain sudo[301506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301506]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain sudo[301524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:49 np0005548788.localdomain sudo[301524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301524]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain sudo[301542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:49 np0005548788.localdomain sudo[301542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301542]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: Removing np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: Removing np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: Removing np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:08:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:08:49 np0005548788.localdomain sudo[301560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:49 np0005548788.localdomain sudo[301560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301560]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:08:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:08:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:08:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18215 "" "Go-http-client/1.1"
Dec 06 10:08:49 np0005548788.localdomain sudo[301578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:49 np0005548788.localdomain sudo[301578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301578]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain sudo[301596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:49 np0005548788.localdomain sudo[301596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301596]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548788.localdomain sudo[301630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:49 np0005548788.localdomain sudo[301630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548788.localdomain sudo[301630]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:50 np0005548788.localdomain sudo[301648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:50 np0005548788.localdomain sudo[301648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:50 np0005548788.localdomain sudo[301648]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:50 np0005548788.localdomain sudo[301666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:50 np0005548788.localdomain sudo[301666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:50 np0005548788.localdomain sudo[301666]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: Removing daemon mgr.np0005548787.umwsra from np0005548787.localdomain -- ports [8765]
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:51.958191) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015731958285, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 582, "num_deletes": 250, "total_data_size": 652276, "memory_usage": 665112, "flush_reason": "Manual Compaction"}
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015731964840, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 397197, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17446, "largest_seqno": 18023, "table_properties": {"data_size": 393994, "index_size": 1122, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7875, "raw_average_key_size": 19, "raw_value_size": 387195, "raw_average_value_size": 949, "num_data_blocks": 44, "num_entries": 408, "num_filter_entries": 408, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015723, "oldest_key_time": 1765015723, "file_creation_time": 1765015731, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 6654 microseconds, and 2023 cpu microseconds.
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:51.964887) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 397197 bytes OK
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:51.964907) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:51.966781) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:51.966803) EVENT_LOG_v1 {"time_micros": 1765015731966796, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:51.966820) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 648732, prev total WAL file size 648732, number of live WAL files 2.
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:51.967461) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323836' seq:72057594037927935, type:22 .. '6B760031353337' seq:0, type:0; will stop at (end)
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(387KB)], [27(16MB)]
Dec 06 10:08:51 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015731967521, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17894773, "oldest_snapshot_seqno": -1}
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11016 keys, 16890002 bytes, temperature: kUnknown
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015732057805, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 16890002, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16826591, "index_size": 34766, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27589, "raw_key_size": 297268, "raw_average_key_size": 26, "raw_value_size": 16637955, "raw_average_value_size": 1510, "num_data_blocks": 1311, "num_entries": 11016, "num_filter_entries": 11016, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015731, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:52.058124) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 16890002 bytes
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:52.060595) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 198.0 rd, 186.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 16.7 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(87.6) write-amplify(42.5) OK, records in: 11541, records dropped: 525 output_compression: NoCompression
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:52.060629) EVENT_LOG_v1 {"time_micros": 1765015732060614, "job": 14, "event": "compaction_finished", "compaction_time_micros": 90394, "compaction_time_cpu_micros": 46574, "output_level": 6, "num_output_files": 1, "total_output_size": 16890002, "num_input_records": 11541, "num_output_records": 11016, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015732060967, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015732063717, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:51.967184) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:52.063866) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:52.063872) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:52.063875) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:52.063878) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:08:52.063881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548788.localdomain sudo[301684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:52 np0005548788.localdomain sudo[301684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:08:52 np0005548788.localdomain sudo[301684]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:52 np0005548788.localdomain podman[301702]: 2025-12-06 10:08:52.671845643 +0000 UTC m=+0.084238140 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute)
Dec 06 10:08:52 np0005548788.localdomain podman[301702]: 2025-12-06 10:08:52.687633617 +0000 UTC m=+0.100026084 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:08:52 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:08:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:08:52 np0005548788.localdomain systemd[1]: tmp-crun.tHW5pB.mount: Deactivated successfully.
Dec 06 10:08:52 np0005548788.localdomain podman[301722]: 2025-12-06 10:08:52.832342465 +0000 UTC m=+0.103731069 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:08:52 np0005548788.localdomain podman[301722]: 2025-12-06 10:08:52.846766158 +0000 UTC m=+0.118154782 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:08:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:08:52 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:08:52 np0005548788.localdomain podman[301745]: 2025-12-06 10:08:52.938735465 +0000 UTC m=+0.081358161 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth rm", "entity": "mgr.np0005548787.umwsra"} : dispatch
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005548787.umwsra"}]': finished
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:52 np0005548788.localdomain podman[301745]: 2025-12-06 10:08:52.976727823 +0000 UTC m=+0.119350599 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, 
architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter)
Dec 06 10:08:52 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:08:53 np0005548788.localdomain ceph-mon[293643]: Removing key for mgr.np0005548787.umwsra
Dec 06 10:08:54 np0005548788.localdomain sudo[301766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:54 np0005548788.localdomain sudo[301766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:54 np0005548788.localdomain sudo[301766]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:55 np0005548788.localdomain sudo[301784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:55 np0005548788.localdomain sudo[301784]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:55 np0005548788.localdomain sudo[301784]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:55 np0005548788.localdomain sudo[301802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:55 np0005548788.localdomain sudo[301802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:55 np0005548788.localdomain podman[301838]: 
Dec 06 10:08:55 np0005548788.localdomain podman[301838]: 2025-12-06 10:08:55.935032005 +0000 UTC m=+0.076656158 container create 2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_buck, vcs-type=git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, release=1763362218)
Dec 06 10:08:55 np0005548788.localdomain systemd[1]: Started libpod-conmon-2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9.scope.
Dec 06 10:08:56 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:56 np0005548788.localdomain podman[301838]: 2025-12-06 10:08:55.903145024 +0000 UTC m=+0.044769187 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:56 np0005548788.localdomain podman[301838]: 2025-12-06 10:08:56.016111567 +0000 UTC m=+0.157735720 container init 2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_buck, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, architecture=x86_64, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.expose-services=, ceph=True, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.tags=rhceph ceph)
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:56 np0005548788.localdomain podman[301838]: 2025-12-06 10:08:56.032904723 +0000 UTC m=+0.174528876 container start 2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_buck, io.buildah.version=1.41.4, release=1763362218, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, version=7, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:56 np0005548788.localdomain podman[301838]: 2025-12-06 10:08:56.034282115 +0000 UTC m=+0.175906308 container attach 2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_buck, name=rhceph, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1763362218, RELEASE=main, CEPH_POINT_RELEASE=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public)
Dec 06 10:08:56 np0005548788.localdomain goofy_buck[301853]: 167 167
Dec 06 10:08:56 np0005548788.localdomain systemd[1]: libpod-2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9.scope: Deactivated successfully.
Dec 06 10:08:56 np0005548788.localdomain podman[301838]: 2025-12-06 10:08:56.036318088 +0000 UTC m=+0.177942271 container died 2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_buck, io.buildah.version=1.41.4, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main)
Dec 06 10:08:56 np0005548788.localdomain podman[301858]: 2025-12-06 10:08:56.142891743 +0000 UTC m=+0.091628027 container remove 2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_buck, RELEASE=main, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218)
Dec 06 10:08:56 np0005548788.localdomain systemd[1]: libpod-conmon-2d656cdd7caaaad1dde6b625211a77b090de76a27fb0dbd6be614551ec28cac9.scope: Deactivated successfully.
Dec 06 10:08:56 np0005548788.localdomain sudo[301802]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:56 np0005548788.localdomain sudo[301875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:56 np0005548788.localdomain sudo[301875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:56 np0005548788.localdomain sudo[301875]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:56 np0005548788.localdomain sudo[301893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:56 np0005548788.localdomain sudo[301893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:56 np0005548788.localdomain podman[301929]: 
Dec 06 10:08:56 np0005548788.localdomain podman[301929]: 2025-12-06 10:08:56.872953121 +0000 UTC m=+0.079965399 container create 8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_albattani, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1763362218, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc.)
Dec 06 10:08:56 np0005548788.localdomain systemd[1]: Started libpod-conmon-8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8.scope.
Dec 06 10:08:56 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:56 np0005548788.localdomain podman[301929]: 2025-12-06 10:08:56.940635702 +0000 UTC m=+0.147647960 container init 8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_albattani, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, name=rhceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, ceph=True)
Dec 06 10:08:56 np0005548788.localdomain podman[301929]: 2025-12-06 10:08:56.842222496 +0000 UTC m=+0.049234834 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-577f793e59cde9f9c214cae7c9fb10ba971e9cb379cf75fe49fa4ed9c3c3e935-merged.mount: Deactivated successfully.
Dec 06 10:08:56 np0005548788.localdomain crazy_albattani[301944]: 167 167
Dec 06 10:08:56 np0005548788.localdomain systemd[1]: libpod-8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8.scope: Deactivated successfully.
Dec 06 10:08:56 np0005548788.localdomain podman[301929]: 2025-12-06 10:08:56.957777798 +0000 UTC m=+0.164790036 container start 8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_albattani, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., ceph=True, build-date=2025-11-26T19:44:28Z)
Dec 06 10:08:56 np0005548788.localdomain podman[301929]: 2025-12-06 10:08:56.958114808 +0000 UTC m=+0.165127106 container attach 8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_albattani, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-11-26T19:44:28Z, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4)
Dec 06 10:08:56 np0005548788.localdomain podman[301929]: 2025-12-06 10:08:56.959838681 +0000 UTC m=+0.166850979 container died 8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_albattani, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Dec 06 10:08:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e6af7e96d0d30765dac66a7dfe1e243bd67618ff63f143da47188c5ba9dfd271-merged.mount: Deactivated successfully.
Dec 06 10:08:57 np0005548788.localdomain podman[301949]: 2025-12-06 10:08:57.040950794 +0000 UTC m=+0.074019406 container remove 8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_albattani, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, name=rhceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, version=7, vcs-type=git, GIT_CLEAN=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:08:57 np0005548788.localdomain systemd[1]: libpod-conmon-8385aa85f00957755faa1c7865de5af080aa12b142e33ffc5a0a9acf2b8618d8.scope: Deactivated successfully.
Dec 06 10:08:57 np0005548788.localdomain sudo[301893]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: from='client.54257 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548787.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: Added label _no_schedule to host np0005548787.localdomain
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548787.localdomain
Dec 06 10:08:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:57 np0005548788.localdomain sudo[301973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:57 np0005548788.localdomain sudo[301973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:57 np0005548788.localdomain sudo[301973]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:57 np0005548788.localdomain sudo[301991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:57 np0005548788.localdomain sudo[301991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:57 np0005548788.localdomain podman[302026]: 
Dec 06 10:08:57 np0005548788.localdomain podman[302026]: 2025-12-06 10:08:57.864759753 +0000 UTC m=+0.075460900 container create 11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_merkle, ceph=True, vcs-type=git, release=1763362218, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64)
Dec 06 10:08:57 np0005548788.localdomain systemd[1]: Started libpod-conmon-11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709.scope.
Dec 06 10:08:57 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:57 np0005548788.localdomain podman[302026]: 2025-12-06 10:08:57.924364666 +0000 UTC m=+0.135065813 container init 11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_merkle, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, build-date=2025-11-26T19:44:28Z, version=7, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:57 np0005548788.localdomain podman[302026]: 2025-12-06 10:08:57.932702712 +0000 UTC m=+0.143403859 container start 11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_merkle, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, version=7, io.buildah.version=1.41.4)
Dec 06 10:08:57 np0005548788.localdomain podman[302026]: 2025-12-06 10:08:57.933000871 +0000 UTC m=+0.143702058 container attach 11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_merkle, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1763362218, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True)
Dec 06 10:08:57 np0005548788.localdomain wizardly_merkle[302041]: 167 167
Dec 06 10:08:57 np0005548788.localdomain systemd[1]: libpod-11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709.scope: Deactivated successfully.
Dec 06 10:08:57 np0005548788.localdomain podman[302026]: 2025-12-06 10:08:57.835076341 +0000 UTC m=+0.045777518 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:57 np0005548788.localdomain podman[302026]: 2025-12-06 10:08:57.935092205 +0000 UTC m=+0.145793362 container died 11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_merkle, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, release=1763362218, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64)
Dec 06 10:08:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-8ba209c952e4f416a8f9c8e4ecec229708e1259c3820f59eff232c16de890909-merged.mount: Deactivated successfully.
Dec 06 10:08:58 np0005548788.localdomain podman[302046]: 2025-12-06 10:08:58.031823208 +0000 UTC m=+0.088452320 container remove 11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_merkle, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1763362218, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Dec 06 10:08:58 np0005548788.localdomain systemd[1]: libpod-conmon-11bb97e5d28a338b9f8f429508e07a3ec4c22c846c41ca2a0300276b13466709.scope: Deactivated successfully.
Dec 06 10:08:58 np0005548788.localdomain sudo[301991]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:58 np0005548788.localdomain sudo[302067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:58 np0005548788.localdomain sudo[302067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:58 np0005548788.localdomain sudo[302067]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:58 np0005548788.localdomain sudo[302085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:58 np0005548788.localdomain sudo[302085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:58 np0005548788.localdomain podman[302120]: 
Dec 06 10:08:58 np0005548788.localdomain podman[302120]: 2025-12-06 10:08:58.850299293 +0000 UTC m=+0.073762218 container create 53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_williamson, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4)
Dec 06 10:08:58 np0005548788.localdomain systemd[1]: Started libpod-conmon-53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589.scope.
Dec 06 10:08:58 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:58 np0005548788.localdomain podman[302120]: 2025-12-06 10:08:58.909927196 +0000 UTC m=+0.133390121 container init 53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_williamson, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, name=rhceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7)
Dec 06 10:08:58 np0005548788.localdomain nostalgic_williamson[302135]: 167 167
Dec 06 10:08:58 np0005548788.localdomain podman[302120]: 2025-12-06 10:08:58.820763716 +0000 UTC m=+0.044226691 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:58 np0005548788.localdomain systemd[1]: libpod-53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589.scope: Deactivated successfully.
Dec 06 10:08:58 np0005548788.localdomain podman[302120]: 2025-12-06 10:08:58.920242483 +0000 UTC m=+0.143705438 container start 53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_williamson, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:58 np0005548788.localdomain podman[302120]: 2025-12-06 10:08:58.920593473 +0000 UTC m=+0.144056428 container attach 53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_williamson, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., ceph=True, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-11-26T19:44:28Z, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:08:58 np0005548788.localdomain podman[302120]: 2025-12-06 10:08:58.922921725 +0000 UTC m=+0.146384650 container died 53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_williamson, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, RELEASE=main, com.redhat.component=rhceph-container)
Dec 06 10:08:58 np0005548788.localdomain systemd[1]: tmp-crun.qvHwOV.mount: Deactivated successfully.
Dec 06 10:08:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-1b513839a8e326c27a3bfd880f33b17c988e3b8622e518804b9f7d982ac893c6-merged.mount: Deactivated successfully.
Dec 06 10:08:58 np0005548788.localdomain podman[302140]: 2025-12-06 10:08:58.993551386 +0000 UTC m=+0.062784911 container remove 53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_williamson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, name=rhceph, ceph=True, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7)
Dec 06 10:08:58 np0005548788.localdomain systemd[1]: libpod-conmon-53ec06f7824b8fe895df8e8140076122dcfbd081966e4f6ea81537a038393589.scope: Deactivated successfully.
Dec 06 10:08:59 np0005548788.localdomain sudo[302085]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:59 np0005548788.localdomain sudo[302157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:59 np0005548788.localdomain sudo[302157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:59 np0005548788.localdomain sudo[302157]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:59 np0005548788.localdomain sudo[302175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:59 np0005548788.localdomain sudo[302175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:59 np0005548788.localdomain ceph-mon[293643]: from='client.44467 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548787.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:59 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:08:59 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:08:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:59 np0005548788.localdomain podman[302210]: 
Dec 06 10:08:59 np0005548788.localdomain podman[302210]: 2025-12-06 10:08:59.697383848 +0000 UTC m=+0.075338556 container create 3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_morse, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.41.4, ceph=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, release=1763362218, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:08:59 np0005548788.localdomain systemd[1]: Started libpod-conmon-3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c.scope.
Dec 06 10:08:59 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:59 np0005548788.localdomain podman[302210]: 2025-12-06 10:08:59.758135435 +0000 UTC m=+0.136090133 container init 3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_morse, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, version=7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.41.4, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218)
Dec 06 10:08:59 np0005548788.localdomain podman[302210]: 2025-12-06 10:08:59.666630333 +0000 UTC m=+0.044585071 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:59 np0005548788.localdomain podman[302210]: 2025-12-06 10:08:59.766537323 +0000 UTC m=+0.144492021 container start 3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_morse, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:08:59 np0005548788.localdomain inspiring_morse[302225]: 167 167
Dec 06 10:08:59 np0005548788.localdomain systemd[1]: libpod-3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c.scope: Deactivated successfully.
Dec 06 10:08:59 np0005548788.localdomain podman[302210]: 2025-12-06 10:08:59.766820582 +0000 UTC m=+0.144775330 container attach 3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_morse, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_CLEAN=True, ceph=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:08:59 np0005548788.localdomain podman[302210]: 2025-12-06 10:08:59.772383663 +0000 UTC m=+0.150338371 container died 3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_morse, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 06 10:08:59 np0005548788.localdomain podman[302230]: 2025-12-06 10:08:59.860980726 +0000 UTC m=+0.082186167 container remove 3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_morse, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-26T19:44:28Z, version=7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container)
Dec 06 10:08:59 np0005548788.localdomain systemd[1]: libpod-conmon-3aa23efe80e63dc9b81416e5e19a0ae247d0340fe528eb16072f65f3ef72fd8c.scope: Deactivated successfully.
Dec 06 10:08:59 np0005548788.localdomain sudo[302175]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-186994ff8eb5622a9b0f06ca3e73bcc96d94418b6b446a9cd624d95c4e7e8fe9-merged.mount: Deactivated successfully.
Dec 06 10:09:00 np0005548788.localdomain sudo[302247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:00 np0005548788.localdomain sudo[302247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:00 np0005548788.localdomain sudo[302247]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:00 np0005548788.localdomain sudo[302265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:00 np0005548788.localdomain sudo[302265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:00 np0005548788.localdomain podman[302299]: 
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548787.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain"} : dispatch
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain"}]': finished
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: Removed host np0005548787.localdomain
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:09:00 np0005548788.localdomain podman[302299]: 2025-12-06 10:09:00.580597953 +0000 UTC m=+0.082082133 container create 47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wilbur, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7)
Dec 06 10:09:00 np0005548788.localdomain systemd[1]: Started libpod-conmon-47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38.scope.
Dec 06 10:09:00 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:00 np0005548788.localdomain podman[302299]: 2025-12-06 10:09:00.544521944 +0000 UTC m=+0.046006134 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:00 np0005548788.localdomain podman[302299]: 2025-12-06 10:09:00.644826227 +0000 UTC m=+0.146310397 container init 47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wilbur, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-type=git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, name=rhceph, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:09:00 np0005548788.localdomain podman[302299]: 2025-12-06 10:09:00.698343382 +0000 UTC m=+0.199827552 container start 47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wilbur, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=)
Dec 06 10:09:00 np0005548788.localdomain podman[302299]: 2025-12-06 10:09:00.698798216 +0000 UTC m=+0.200282426 container attach 47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wilbur, name=rhceph, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, distribution-scope=public)
Dec 06 10:09:00 np0005548788.localdomain crazy_wilbur[302314]: 167 167
Dec 06 10:09:00 np0005548788.localdomain systemd[1]: libpod-47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38.scope: Deactivated successfully.
Dec 06 10:09:00 np0005548788.localdomain podman[302299]: 2025-12-06 10:09:00.701497429 +0000 UTC m=+0.202981619 container died 47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wilbur, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.buildah.version=1.41.4, RELEASE=main, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.)
Dec 06 10:09:00 np0005548788.localdomain podman[302319]: 2025-12-06 10:09:00.802639007 +0000 UTC m=+0.087259723 container remove 47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wilbur, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vcs-type=git, RELEASE=main, GIT_BRANCH=main)
Dec 06 10:09:00 np0005548788.localdomain systemd[1]: libpod-conmon-47699e05219b3531423f416abbd3f1e3bdf722e89fd8579795456b8464fdea38.scope: Deactivated successfully.
Dec 06 10:09:00 np0005548788.localdomain sudo[302265]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-73701a938416927afa7ffe1e7396e06fbe44134248c61f397272c7dcd00bc1fb-merged.mount: Deactivated successfully.
Dec 06 10:09:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:09:03 np0005548788.localdomain systemd[1]: tmp-crun.DkOFVF.mount: Deactivated successfully.
Dec 06 10:09:03 np0005548788.localdomain podman[302337]: 2025-12-06 10:09:03.289370175 +0000 UTC m=+0.096022582 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:09:03 np0005548788.localdomain podman[302337]: 2025-12-06 10:09:03.332686006 +0000 UTC m=+0.139338373 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:09:03 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:04 np0005548788.localdomain ceph-mon[293643]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:04 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:05 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:05 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:09:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:09:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:06 np0005548788.localdomain ceph-mon[293643]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:06 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:09:06 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:09:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:07 np0005548788.localdomain sudo[302356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:07 np0005548788.localdomain sudo[302356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:07 np0005548788.localdomain sudo[302356]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.496 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:09:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548788.localdomain sudo[302374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:07 np0005548788.localdomain sudo[302374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:09:07 np0005548788.localdomain sudo[302374]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:07 np0005548788.localdomain podman[302392]: 2025-12-06 10:09:07.784461099 +0000 UTC m=+0.082812666 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:09:07 np0005548788.localdomain podman[302392]: 2025-12-06 10:09:07.793873729 +0000 UTC m=+0.092225296 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:09:07 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:09:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:09:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:09:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:09:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:09:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:09:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:09:08 np0005548788.localdomain ceph-mon[293643]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:08 np0005548788.localdomain ceph-mon[293643]: from='client.44476 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:08 np0005548788.localdomain ceph-mon[293643]: Saving service mon spec with placement label:mon
Dec 06 10:09:10 np0005548788.localdomain ceph-mon[293643]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:09:10 np0005548788.localdomain podman[302415]: 2025-12-06 10:09:10.242850646 +0000 UTC m=+0.069416025 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:09:10 np0005548788.localdomain podman[302415]: 2025-12-06 10:09:10.278667717 +0000 UTC m=+0.105233116 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:09:10 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:09:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:12 np0005548788.localdomain ceph-mon[293643]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:12 np0005548788.localdomain ceph-mon[293643]: from='client.44482 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548790", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x559a04fd2000 mon_map magic: 0 from mon.1 v2:172.18.0.103:3300/0
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@1(peon) e14  my rank is now 0 (was 1)
Dec 06 10:09:13 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:09:13 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:09:13 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x559a04fd22c0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548788"} v 0)
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548789"} v 0)
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: paxos.0).electionLogic(62) init, last seen epoch 62
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 is new leader, mons np0005548788,np0005548789 in quorum (ranks 0,1)
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : monmap epoch 14
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : last_changed 2025-12-06T10:09:13.351903+0000
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : election_strategy: 1
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e32: np0005548789.mzhmje(active, since 53s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] :     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] :     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: from='client.44488 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548790"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: Remove daemons mon.np0005548790
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: Safe to remove mon.np0005548790: new quorum should be ['np0005548788', 'np0005548789'] (from ['np0005548788', 'np0005548789'])
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: Removing monitor np0005548790 from monmap...
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: Removing daemon mon.np0005548790 from np0005548790.localdomain -- ports []
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548789 calling monitor election
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 calling monitor election
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 is new leader, mons np0005548788,np0005548789 in quorum (ranks 0,1)
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: monmap epoch 14
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:09:13.351903+0000
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: mgrmap e32: np0005548789.mzhmje(active, since 53s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:13 np0005548788.localdomain sudo[302434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:09:13 np0005548788.localdomain sudo[302434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548788.localdomain sudo[302434]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548788.localdomain sudo[302452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:09:13 np0005548788.localdomain sudo[302452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548788.localdomain sudo[302452]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548788.localdomain sudo[302470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:13 np0005548788.localdomain sudo[302470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548788.localdomain sudo[302470]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548788.localdomain sudo[302488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:13 np0005548788.localdomain sudo[302488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548788.localdomain sudo[302488]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548788.localdomain sudo[302506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:13 np0005548788.localdomain sudo[302506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548788.localdomain sudo[302506]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548788.localdomain sudo[302540]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:13 np0005548788.localdomain sudo[302540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548788.localdomain sudo[302540]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain sudo[302558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:14 np0005548788.localdomain sudo[302558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302558]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain sudo[302576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:09:14 np0005548788.localdomain sudo[302576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302576]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain sudo[302594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:14 np0005548788.localdomain sudo[302594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302594]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain sudo[302612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:14 np0005548788.localdomain sudo[302612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302612]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain sudo[302630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:14 np0005548788.localdomain sudo[302630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302630]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain sudo[302648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:14 np0005548788.localdomain sudo[302648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302648]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:14 np0005548788.localdomain sudo[302666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:14 np0005548788.localdomain sudo[302666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302666]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:14 np0005548788.localdomain sudo[302700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:14 np0005548788.localdomain sudo[302700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302700]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain sudo[302718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:14 np0005548788.localdomain sudo[302718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302718]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain sudo[302736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:14 np0005548788.localdomain sudo[302736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548788.localdomain sudo[302736]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:15 np0005548788.localdomain sudo[302754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:15 np0005548788.localdomain sudo[302754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:15 np0005548788.localdomain sudo[302754]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:15 np0005548788.localdomain sudo[302772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:15 np0005548788.localdomain sudo[302772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:15 np0005548788.localdomain sudo[302772]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:15 np0005548788.localdomain sudo[302790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:15 np0005548788.localdomain sudo[302790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:15.660 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548788.localdomain podman[302825]: 
Dec 06 10:09:15 np0005548788.localdomain podman[302825]: 2025-12-06 10:09:15.90534995 +0000 UTC m=+0.075671177 container create 28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_golick, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, release=1763362218, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:15 np0005548788.localdomain systemd[1]: Started libpod-conmon-28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34.scope.
Dec 06 10:09:15 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:15 np0005548788.localdomain podman[302825]: 2025-12-06 10:09:15.873375207 +0000 UTC m=+0.043696464 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:15 np0005548788.localdomain podman[302825]: 2025-12-06 10:09:15.980502649 +0000 UTC m=+0.150823886 container init 28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_golick, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, release=1763362218, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64)
Dec 06 10:09:15 np0005548788.localdomain podman[302825]: 2025-12-06 10:09:15.990396994 +0000 UTC m=+0.160718221 container start 28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_golick, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, GIT_CLEAN=True, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7)
Dec 06 10:09:15 np0005548788.localdomain podman[302825]: 2025-12-06 10:09:15.990662532 +0000 UTC m=+0.160983769 container attach 28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_golick, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, release=1763362218, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, ceph=True)
Dec 06 10:09:15 np0005548788.localdomain zen_golick[302841]: 167 167
Dec 06 10:09:15 np0005548788.localdomain systemd[1]: libpod-28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34.scope: Deactivated successfully.
Dec 06 10:09:15 np0005548788.localdomain podman[302825]: 2025-12-06 10:09:15.995074227 +0000 UTC m=+0.165395454 container died 28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_golick, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, ceph=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-78b09030b326b34ada5e7a26316ad7c2a5bcdf50025bf87a3f3a791fd545511f-merged.mount: Deactivated successfully.
Dec 06 10:09:16 np0005548788.localdomain podman[302846]: 2025-12-06 10:09:16.09439534 +0000 UTC m=+0.089874993 container remove 28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_golick, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z)
Dec 06 10:09:16 np0005548788.localdomain systemd[1]: libpod-conmon-28a46deec103023f27f0b23acf22048e03732ec36ec54654b6433bb0321afc34.scope: Deactivated successfully.
Dec 06 10:09:16 np0005548788.localdomain sudo[302790]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:16 np0005548788.localdomain sudo[302860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:16 np0005548788.localdomain sudo[302860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:16 np0005548788.localdomain sudo[302860]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:16 np0005548788.localdomain sudo[302878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:16 np0005548788.localdomain sudo[302878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:16 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:16 np0005548788.localdomain podman[302914]: 
Dec 06 10:09:16 np0005548788.localdomain podman[302914]: 2025-12-06 10:09:16.845079082 +0000 UTC m=+0.084274521 container create 726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_moore, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7)
Dec 06 10:09:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9.scope.
Dec 06 10:09:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:16 np0005548788.localdomain podman[302914]: 2025-12-06 10:09:16.907962074 +0000 UTC m=+0.147157453 container init 726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_moore, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, version=7, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 06 10:09:16 np0005548788.localdomain podman[302914]: 2025-12-06 10:09:16.812259543 +0000 UTC m=+0.051455012 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:16 np0005548788.localdomain podman[302914]: 2025-12-06 10:09:16.91561728 +0000 UTC m=+0.154812679 container start 726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_moore, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, release=1763362218, name=rhceph, ceph=True)
Dec 06 10:09:16 np0005548788.localdomain podman[302914]: 2025-12-06 10:09:16.915805976 +0000 UTC m=+0.155001385 container attach 726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_moore, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.expose-services=, name=rhceph, architecture=x86_64, version=7, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7)
Dec 06 10:09:16 np0005548788.localdomain quizzical_moore[302929]: 167 167
Dec 06 10:09:16 np0005548788.localdomain podman[302914]: 2025-12-06 10:09:16.920001884 +0000 UTC m=+0.159197353 container died 726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_moore, version=7, release=1763362218, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:09:16 np0005548788.localdomain systemd[1]: libpod-726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9.scope: Deactivated successfully.
Dec 06 10:09:16 np0005548788.localdomain systemd[1]: tmp-crun.nhlDoj.mount: Deactivated successfully.
Dec 06 10:09:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-952391e8c4200dae07dd844a040764126229541289766ed291999746b9da024f-merged.mount: Deactivated successfully.
Dec 06 10:09:17 np0005548788.localdomain podman[302934]: 2025-12-06 10:09:17.008164804 +0000 UTC m=+0.080781434 container remove 726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_moore, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public)
Dec 06 10:09:17 np0005548788.localdomain systemd[1]: libpod-conmon-726adc51eb510b60252ed7d9afafa2eeca0c737d4c67efe1ed41ca60b2ff50f9.scope: Deactivated successfully.
Dec 06 10:09:17 np0005548788.localdomain sudo[302878]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:17 np0005548788.localdomain sudo[302957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:17 np0005548788.localdomain sudo[302957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:17 np0005548788.localdomain sudo[302957]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:17 np0005548788.localdomain sudo[302975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:17 np0005548788.localdomain sudo[302975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:17 np0005548788.localdomain podman[303010]: 
Dec 06 10:09:17 np0005548788.localdomain podman[303010]: 2025-12-06 10:09:17.77623412 +0000 UTC m=+0.080207896 container create 426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_almeida, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, name=rhceph, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=)
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:17 np0005548788.localdomain systemd[1]: Started libpod-conmon-426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d.scope.
Dec 06 10:09:17 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:17 np0005548788.localdomain podman[303010]: 2025-12-06 10:09:17.841957691 +0000 UTC m=+0.145931437 container init 426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_almeida, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, io.openshift.expose-services=)
Dec 06 10:09:17 np0005548788.localdomain podman[303010]: 2025-12-06 10:09:17.743663049 +0000 UTC m=+0.047636845 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:17 np0005548788.localdomain podman[303010]: 2025-12-06 10:09:17.852078301 +0000 UTC m=+0.156052047 container start 426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_almeida, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1763362218, RELEASE=main, io.openshift.expose-services=, ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, version=7)
Dec 06 10:09:17 np0005548788.localdomain podman[303010]: 2025-12-06 10:09:17.852409231 +0000 UTC m=+0.156382967 container attach 426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_almeida, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., name=rhceph)
Dec 06 10:09:17 np0005548788.localdomain strange_almeida[303025]: 167 167
Dec 06 10:09:17 np0005548788.localdomain systemd[1]: libpod-426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d.scope: Deactivated successfully.
Dec 06 10:09:17 np0005548788.localdomain podman[303010]: 2025-12-06 10:09:17.855542067 +0000 UTC m=+0.159515823 container died 426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_almeida, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Dec 06 10:09:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bd71953525b275b4e9c452e85d0147dfcc8924eac57a9a1612384122132db5db-merged.mount: Deactivated successfully.
Dec 06 10:09:17 np0005548788.localdomain podman[303031]: 2025-12-06 10:09:17.963887638 +0000 UTC m=+0.094315850 container remove 426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_almeida, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, release=1763362218, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, ceph=True, io.buildah.version=1.41.4, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:09:17 np0005548788.localdomain systemd[1]: libpod-conmon-426a3a308419a8794451b201aee13729109e2c4f5c4357aac3f33a7041b8cb1d.scope: Deactivated successfully.
Dec 06 10:09:18 np0005548788.localdomain sudo[302975]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:18 np0005548788.localdomain sudo[303054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:18 np0005548788.localdomain sudo[303054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:18 np0005548788.localdomain sudo[303054]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:18 np0005548788.localdomain sudo[303072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:18 np0005548788.localdomain sudo[303072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:18 np0005548788.localdomain podman[303113]: 
Dec 06 10:09:18 np0005548788.localdomain podman[303105]: 2025-12-06 10:09:18.81935935 +0000 UTC m=+0.102991286 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:09:18 np0005548788.localdomain podman[303113]: 2025-12-06 10:09:18.826251062 +0000 UTC m=+0.089620676 container create 35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_lederberg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, release=1763362218, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, version=7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True)
Dec 06 10:09:18 np0005548788.localdomain podman[303113]: 2025-12-06 10:09:18.792084672 +0000 UTC m=+0.055454336 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:18 np0005548788.localdomain systemd[1]: Started libpod-conmon-35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889.scope.
Dec 06 10:09:18 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:18 np0005548788.localdomain podman[303105]: 2025-12-06 10:09:18.935146179 +0000 UTC m=+0.218778105 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:09:18 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:09:18 np0005548788.localdomain podman[303113]: 2025-12-06 10:09:18.950666135 +0000 UTC m=+0.214035739 container init 35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_lederberg, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1763362218, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, ceph=True, io.openshift.expose-services=, name=rhceph)
Dec 06 10:09:18 np0005548788.localdomain podman[303113]: 2025-12-06 10:09:18.960846769 +0000 UTC m=+0.224216373 container start 35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_lederberg, build-date=2025-11-26T19:44:28Z, release=1763362218, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container)
Dec 06 10:09:18 np0005548788.localdomain podman[303113]: 2025-12-06 10:09:18.961255291 +0000 UTC m=+0.224624965 container attach 35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_lederberg, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1763362218, distribution-scope=public, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:09:18 np0005548788.localdomain eloquent_lederberg[303147]: 167 167
Dec 06 10:09:18 np0005548788.localdomain systemd[1]: libpod-35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889.scope: Deactivated successfully.
Dec 06 10:09:18 np0005548788.localdomain podman[303113]: 2025-12-06 10:09:18.966732389 +0000 UTC m=+0.230102033 container died 35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_lederberg, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:09:19 np0005548788.localdomain podman[303152]: 2025-12-06 10:09:19.062180973 +0000 UTC m=+0.085888721 container remove 35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_lederberg, com.redhat.component=rhceph-container, release=1763362218, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., ceph=True)
Dec 06 10:09:19 np0005548788.localdomain systemd[1]: libpod-conmon-35b32a52563a3b36fe6959e8fa21ffef525902c6b1056ff27f46d79280e48889.scope: Deactivated successfully.
Dec 06 10:09:19 np0005548788.localdomain sudo[303072]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:19 np0005548788.localdomain sudo[303168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:19 np0005548788.localdomain sudo[303168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:19 np0005548788.localdomain sudo[303168]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:19 np0005548788.localdomain sudo[303186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:19 np0005548788.localdomain sudo[303186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:09:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:09:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:09:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:09:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:09:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18224 "" "Go-http-client/1.1"
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:19 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:19 np0005548788.localdomain podman[303221]: 
Dec 06 10:09:19 np0005548788.localdomain podman[303221]: 2025-12-06 10:09:19.849375067 +0000 UTC m=+0.077019538 container create d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_elbakyan, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:19 np0005548788.localdomain systemd[1]: Started libpod-conmon-d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be.scope.
Dec 06 10:09:19 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:19 np0005548788.localdomain podman[303221]: 2025-12-06 10:09:19.91682424 +0000 UTC m=+0.144468711 container init d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_elbakyan, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, name=rhceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 06 10:09:19 np0005548788.localdomain podman[303221]: 2025-12-06 10:09:19.818426876 +0000 UTC m=+0.046071367 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:19 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a5d5890774948878291c36f761ec7ffa3c1d411b57012c7a4904aa58bf94f266-merged.mount: Deactivated successfully.
Dec 06 10:09:19 np0005548788.localdomain podman[303221]: 2025-12-06 10:09:19.927916251 +0000 UTC m=+0.155560722 container start d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_elbakyan, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vcs-type=git, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=rhceph ceph)
Dec 06 10:09:19 np0005548788.localdomain systemd[1]: tmp-crun.Q7W4hg.mount: Deactivated successfully.
Dec 06 10:09:19 np0005548788.localdomain podman[303221]: 2025-12-06 10:09:19.928169448 +0000 UTC m=+0.155813929 container attach d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_elbakyan, release=1763362218, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., name=rhceph)
Dec 06 10:09:19 np0005548788.localdomain dreamy_elbakyan[303237]: 167 167
Dec 06 10:09:19 np0005548788.localdomain systemd[1]: libpod-d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be.scope: Deactivated successfully.
Dec 06 10:09:19 np0005548788.localdomain podman[303221]: 2025-12-06 10:09:19.932537823 +0000 UTC m=+0.160182314 container died d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_elbakyan, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:09:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-00a384976e47c43a6cbbc23f89a8d156c4732e53527de035fb6656d88a4e29b9-merged.mount: Deactivated successfully.
Dec 06 10:09:20 np0005548788.localdomain podman[303242]: 2025-12-06 10:09:20.02972762 +0000 UTC m=+0.085847699 container remove d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_elbakyan, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:09:20 np0005548788.localdomain systemd[1]: libpod-conmon-d3ec7f480c59691f9559437f3814853b562add8b457fd89d550601549d47f2be.scope: Deactivated successfully.
Dec 06 10:09:20 np0005548788.localdomain sudo[303186]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:09:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 06 10:09:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:09:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:09:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:09:23 np0005548788.localdomain podman[303259]: 2025-12-06 10:09:23.276916521 +0000 UTC m=+0.102617526 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:09:23 np0005548788.localdomain podman[303259]: 2025-12-06 10:09:23.284948118 +0000 UTC m=+0.110649173 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:09:23 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:09:23 np0005548788.localdomain podman[303260]: 2025-12-06 10:09:23.337054519 +0000 UTC m=+0.158466202 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Dec 06 10:09:23 np0005548788.localdomain podman[303260]: 2025-12-06 10:09:23.375547502 +0000 UTC m=+0.196959225 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Dec 06 10:09:23 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:09:23 np0005548788.localdomain podman[303258]: 2025-12-06 10:09:23.399304252 +0000 UTC m=+0.225082460 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Dec 06 10:09:23 np0005548788.localdomain podman[303258]: 2025-12-06 10:09:23.407017549 +0000 UTC m=+0.232795717 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3)
Dec 06 10:09:23 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:23 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:09:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:09:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:27 np0005548788.localdomain ceph-mon[293643]: from='client.54285 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005548790.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:27 np0005548788.localdomain ceph-mon[293643]: Deploying daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:09:27 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:09:27 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:09:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 06 10:09:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 06 10:09:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:28 np0005548788.localdomain ceph-mon[293643]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:29.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:29.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).monmap v14 adding/updating np0005548790 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:29 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x559a04fd2420 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548788"} v 0)
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: paxos.0).electionLogic(64) init, last seen epoch 64
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548789"} v 0)
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:30 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:31.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:31 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:32.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:32 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:33.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:33.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:33.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:33 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:34.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:09:34 np0005548788.localdomain podman[303322]: 2025-12-06 10:09:34.261783494 +0000 UTC m=+0.083950581 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:09:34 np0005548788.localdomain podman[303322]: 2025-12-06 10:09:34.282404537 +0000 UTC m=+0.104571624 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:09:34 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 is new leader, mons np0005548788,np0005548789 in quorum (ranks 0,1)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : monmap epoch 15
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : last_changed 2025-12-06T10:09:29.475464+0000
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : election_strategy: 1
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548790
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e32: np0005548789.mzhmje(active, since 74s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005548788,np0005548789 (MON_DOWN)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005548788,np0005548789
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] :     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] :     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005548788,np0005548789
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] :     mon.np0005548790 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 calling monitor election
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548789 calling monitor election
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 is new leader, mons np0005548788,np0005548789 in quorum (ranks 0,1)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: monmap epoch 15
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:09:29.475464+0000
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548790
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: mgrmap e32: np0005548789.mzhmje(active, since 74s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1/3 mons down, quorum np0005548788,np0005548789 (MON_DOWN)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005548788,np0005548789
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005548788,np0005548789
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]:     mon.np0005548790 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:35.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:35.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:09:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:35.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:09:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:35.025 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/862062483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:35 np0005548788.localdomain sudo[303341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/862062483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:35 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:35 np0005548788.localdomain sudo[303341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:35 np0005548788.localdomain sudo[303341]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:35 np0005548788.localdomain sudo[303359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:09:35 np0005548788.localdomain sudo[303359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.029 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.030 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.030 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.031 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.031 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:09:36 np0005548788.localdomain sudo[303359]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/159614946' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.498 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 calling monitor election
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: paxos.0).electionLogic(66) init, last seen epoch 66
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : mon.np0005548788 is new leader, mons np0005548788,np0005548789,np0005548790 in quorum (ranks 0,1,2)
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : monmap epoch 15
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : last_changed 2025-12-06T10:09:29.475464+0000
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : election_strategy: 1
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548790
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e32: np0005548789.mzhmje(active, since 76s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005548788,np0005548789)
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] :     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] :     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548790 calling monitor election
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548789 calling monitor election
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 calling monitor election
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788 is new leader, mons np0005548788,np0005548789,np0005548790 in quorum (ranks 0,1,2)
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: monmap epoch 15
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: last_changed 2025-12-06T10:09:29.475464+0000
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: min_mon_release 18 (reef)
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: election_strategy: 1
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548790
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mgrmap e32: np0005548789.mzhmje(active, since 76s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005548788,np0005548789)
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.706 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.707 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12351MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.708 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.709 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.767 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.767 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:09:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:36.816 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2288131588' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:37.253 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:09:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:37.258 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:09:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:37.270 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:09:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:37.271 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:09:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:09:37.271 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2288131588' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:09:38 np0005548788.localdomain podman[303452]: 2025-12-06 10:09:38.254130346 +0000 UTC m=+0.080496625 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:09:38 np0005548788.localdomain podman[303452]: 2025-12-06 10:09:38.262674919 +0000 UTC m=+0.089041248 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:09:38 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:09:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 06 10:09:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:38 np0005548788.localdomain sudo[303473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:09:38 np0005548788.localdomain sudo[303473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548788.localdomain sudo[303473]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548788.localdomain sudo[303491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:09:38 np0005548788.localdomain sudo[303491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548788.localdomain sudo[303491]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548788.localdomain sudo[303509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:38 np0005548788.localdomain sudo[303509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548788.localdomain sudo[303509]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548788.localdomain ceph-mon[293643]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:38 np0005548788.localdomain sudo[303527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:38 np0005548788.localdomain sudo[303527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548788.localdomain sudo[303527]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:09:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:09:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:09:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:09:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:09:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:09:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:09:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:09:38 np0005548788.localdomain sudo[303545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:38 np0005548788.localdomain sudo[303545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548788.localdomain sudo[303545]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:39 np0005548788.localdomain sudo[303579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303579]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:39 np0005548788.localdomain sudo[303597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303597]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:09:39 np0005548788.localdomain sudo[303615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303615]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:39 np0005548788.localdomain sudo[303633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303633]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:39 np0005548788.localdomain sudo[303651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303651]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:39 np0005548788.localdomain sudo[303669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303669]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:39 np0005548788.localdomain sudo[303687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303687]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:39 np0005548788.localdomain sudo[303705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303705]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:39 np0005548788.localdomain sudo[303739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303739]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain sudo[303757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:39 np0005548788.localdomain sudo[303757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303757]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain sudo[303775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:39 np0005548788.localdomain sudo[303775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548788.localdomain sudo[303775]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2596033626' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2596033626' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:40 np0005548788.localdomain sudo[303793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:40 np0005548788.localdomain sudo[303793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:40 np0005548788.localdomain sudo[303793]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:40 np0005548788.localdomain sudo[303811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:40 np0005548788.localdomain sudo[303811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:40 np0005548788.localdomain sudo[303811]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:40 np0005548788.localdomain sudo[303829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/3895678344' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:09:40 np0005548788.localdomain sudo[303829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:09:40 np0005548788.localdomain podman[303847]: 2025-12-06 10:09:40.465373657 +0000 UTC m=+0.090982897 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:09:40 np0005548788.localdomain podman[303847]: 2025-12-06 10:09:40.50161002 +0000 UTC m=+0.127219300 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 06 10:09:40 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:40 np0005548788.localdomain podman[303882]: 
Dec 06 10:09:40 np0005548788.localdomain podman[303882]: 2025-12-06 10:09:40.822940946 +0000 UTC m=+0.071214199 container create 50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_lovelace, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, architecture=x86_64, vcs-type=git, release=1763362218, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Dec 06 10:09:40 np0005548788.localdomain systemd[1]: Started libpod-conmon-50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9.scope.
Dec 06 10:09:40 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:40 np0005548788.localdomain podman[303882]: 2025-12-06 10:09:40.890021488 +0000 UTC m=+0.138294771 container init 50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_lovelace, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-26T19:44:28Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph)
Dec 06 10:09:40 np0005548788.localdomain podman[303882]: 2025-12-06 10:09:40.796463753 +0000 UTC m=+0.044737026 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:40 np0005548788.localdomain podman[303882]: 2025-12-06 10:09:40.900461829 +0000 UTC m=+0.148735092 container start 50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_lovelace, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, release=1763362218, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:09:40 np0005548788.localdomain podman[303882]: 2025-12-06 10:09:40.900778819 +0000 UTC m=+0.149052082 container attach 50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_lovelace, name=rhceph, ceph=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, release=1763362218, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z)
Dec 06 10:09:40 np0005548788.localdomain gifted_lovelace[303897]: 167 167
Dec 06 10:09:40 np0005548788.localdomain systemd[1]: libpod-50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9.scope: Deactivated successfully.
Dec 06 10:09:40 np0005548788.localdomain podman[303882]: 2025-12-06 10:09:40.905172073 +0000 UTC m=+0.153445336 container died 50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_lovelace, ceph=True, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main)
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/3895678344' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:09:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:41 np0005548788.localdomain podman[303902]: 2025-12-06 10:09:41.043539426 +0000 UTC m=+0.128784649 container remove 50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_lovelace, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, release=1763362218, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public)
Dec 06 10:09:41 np0005548788.localdomain systemd[1]: libpod-conmon-50f5c2b790fe64364f6a118696328efe7b78b7e39d6f58b63e1be3ab5b0f9ce9.scope: Deactivated successfully.
Dec 06 10:09:41 np0005548788.localdomain sudo[303829]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:41 np0005548788.localdomain sudo[303919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:41 np0005548788.localdomain sudo[303919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:41 np0005548788.localdomain sudo[303919]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:41 np0005548788.localdomain sudo[303937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:41 np0005548788.localdomain sudo[303937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:41 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-453e3fd83473c5b611d362a8f3e716dd233265db2798587acf5f446ca6db9fd1-merged.mount: Deactivated successfully.
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:41 np0005548788.localdomain podman[303969]: 
Dec 06 10:09:41 np0005548788.localdomain podman[303969]: 2025-12-06 10:09:41.713541979 +0000 UTC m=+0.080428274 container create 95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_banzai, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=)
Dec 06 10:09:41 np0005548788.localdomain systemd[1]: Started libpod-conmon-95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7.scope.
Dec 06 10:09:41 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:41 np0005548788.localdomain podman[303969]: 2025-12-06 10:09:41.773545552 +0000 UTC m=+0.140431847 container init 95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_banzai, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, name=rhceph, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 10:09:41 np0005548788.localdomain podman[303969]: 2025-12-06 10:09:41.681265546 +0000 UTC m=+0.048151851 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:41 np0005548788.localdomain podman[303969]: 2025-12-06 10:09:41.782087765 +0000 UTC m=+0.148974080 container start 95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_banzai, io.buildah.version=1.41.4, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:09:41 np0005548788.localdomain podman[303969]: 2025-12-06 10:09:41.782363913 +0000 UTC m=+0.149250218 container attach 95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_banzai, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=)
Dec 06 10:09:41 np0005548788.localdomain zen_banzai[303984]: 167 167
Dec 06 10:09:41 np0005548788.localdomain systemd[1]: libpod-95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7.scope: Deactivated successfully.
Dec 06 10:09:41 np0005548788.localdomain podman[303969]: 2025-12-06 10:09:41.785270844 +0000 UTC m=+0.152157169 container died 95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_banzai, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, architecture=x86_64, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Dec 06 10:09:41 np0005548788.localdomain podman[303989]: 2025-12-06 10:09:41.876657762 +0000 UTC m=+0.083299112 container remove 95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_banzai, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:09:41 np0005548788.localdomain systemd[1]: libpod-conmon-95c9c7b6cc4d6fa83ca55586a22140af1409e72af88a534e9f06caac5fd15df7.scope: Deactivated successfully.
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:42 np0005548788.localdomain sudo[303937]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Dec 06 10:09:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:42 np0005548788.localdomain sudo[304014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:42 np0005548788.localdomain sudo[304014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:42 np0005548788.localdomain sudo[304014]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:42 np0005548788.localdomain sudo[304032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:42 np0005548788.localdomain sudo[304032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:42 np0005548788.localdomain sshd[304050]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:09:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-339b78b6558f50e994c28253d2c14bb330db9f7f3f97998e41f61728267b1806-merged.mount: Deactivated successfully.
Dec 06 10:09:42 np0005548788.localdomain podman[304069]: 2025-12-06 10:09:42.693440045 +0000 UTC m=+0.073661745 container create 669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_knuth, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, release=1763362218, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:09:42 np0005548788.localdomain systemd[1]: Started libpod-conmon-669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13.scope.
Dec 06 10:09:42 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:42 np0005548788.localdomain podman[304069]: 2025-12-06 10:09:42.756514793 +0000 UTC m=+0.136736523 container init 669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_knuth, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, version=7, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, RELEASE=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:09:42 np0005548788.localdomain podman[304069]: 2025-12-06 10:09:42.667366203 +0000 UTC m=+0.047587973 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:42 np0005548788.localdomain podman[304069]: 2025-12-06 10:09:42.776223209 +0000 UTC m=+0.156444929 container start 669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_knuth, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main)
Dec 06 10:09:42 np0005548788.localdomain podman[304069]: 2025-12-06 10:09:42.776559149 +0000 UTC m=+0.156780919 container attach 669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_knuth, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_CLEAN=True, release=1763362218, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, architecture=x86_64, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container)
Dec 06 10:09:42 np0005548788.localdomain eloquent_knuth[304085]: 167 167
Dec 06 10:09:42 np0005548788.localdomain systemd[1]: libpod-669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13.scope: Deactivated successfully.
Dec 06 10:09:42 np0005548788.localdomain podman[304069]: 2025-12-06 10:09:42.782871464 +0000 UTC m=+0.163093214 container died 669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_knuth, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:09:42 np0005548788.localdomain podman[304090]: 2025-12-06 10:09:42.876703217 +0000 UTC m=+0.081561047 container remove 669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_knuth, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, name=rhceph, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4)
Dec 06 10:09:42 np0005548788.localdomain systemd[1]: libpod-conmon-669c1aa5f04f8d1bdefb8c28be2169fe51f52ed5b97328c0755eedc66ec7bf13.scope: Deactivated successfully.
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:43 np0005548788.localdomain sudo[304032]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:43 np0005548788.localdomain sudo[304113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:43 np0005548788.localdomain sudo[304113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:43 np0005548788.localdomain sudo[304113]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:43 np0005548788.localdomain sudo[304131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:43 np0005548788.localdomain sudo[304131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:43 np0005548788.localdomain systemd[1]: tmp-crun.pW7vOo.mount: Deactivated successfully.
Dec 06 10:09:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cd6477522911e1cd74802f3031885369794a75baf7bba8b991ba38faf5c6aa5e-merged.mount: Deactivated successfully.
Dec 06 10:09:43 np0005548788.localdomain podman[304167]: 2025-12-06 10:09:43.675689953 +0000 UTC m=+0.080239116 container create 82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_hypatia, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1763362218, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, version=7, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Dec 06 10:09:43 np0005548788.localdomain systemd[1]: Started libpod-conmon-82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935.scope.
Dec 06 10:09:43 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:43 np0005548788.localdomain podman[304167]: 2025-12-06 10:09:43.737484893 +0000 UTC m=+0.142034036 container init 82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_hypatia, name=rhceph, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1763362218, architecture=x86_64, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7)
Dec 06 10:09:43 np0005548788.localdomain podman[304167]: 2025-12-06 10:09:43.640510963 +0000 UTC m=+0.045060156 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:43 np0005548788.localdomain podman[304167]: 2025-12-06 10:09:43.747610694 +0000 UTC m=+0.152159867 container start 82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_hypatia, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, name=rhceph, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7)
Dec 06 10:09:43 np0005548788.localdomain podman[304167]: 2025-12-06 10:09:43.747970575 +0000 UTC m=+0.152519718 container attach 82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_hypatia, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, name=rhceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=)
Dec 06 10:09:43 np0005548788.localdomain gracious_hypatia[304182]: 167 167
Dec 06 10:09:43 np0005548788.localdomain systemd[1]: libpod-82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935.scope: Deactivated successfully.
Dec 06 10:09:43 np0005548788.localdomain podman[304167]: 2025-12-06 10:09:43.750561485 +0000 UTC m=+0.155110648 container died 82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_hypatia, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-type=git, ceph=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_BRANCH=main, description=Red Hat Ceph Storage 7)
Dec 06 10:09:43 np0005548788.localdomain podman[304187]: 2025-12-06 10:09:43.832792742 +0000 UTC m=+0.074208942 container remove 82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_hypatia, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:09:43 np0005548788.localdomain systemd[1]: libpod-conmon-82795f970867f806187591c6955b0baac7633a70155867a24e411bf560ef9935.scope: Deactivated successfully.
Dec 06 10:09:43 np0005548788.localdomain sudo[304131]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:44 np0005548788.localdomain sudo[304204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:44 np0005548788.localdomain sudo[304204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:44 np0005548788.localdomain sudo[304204]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:44 np0005548788.localdomain sudo[304222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:44 np0005548788.localdomain sudo[304222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-decff965acd3b280a3f6253cf7557f029aa60603d1b41e7edb232a5520192e5d-merged.mount: Deactivated successfully.
Dec 06 10:09:44 np0005548788.localdomain podman[304257]: 
Dec 06 10:09:44 np0005548788.localdomain podman[304257]: 2025-12-06 10:09:44.521805318 +0000 UTC m=+0.085659963 container create 43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_bell, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:09:44 np0005548788.localdomain systemd[1]: Started libpod-conmon-43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da.scope.
Dec 06 10:09:44 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:44 np0005548788.localdomain podman[304257]: 2025-12-06 10:09:44.481270903 +0000 UTC m=+0.045125598 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:44 np0005548788.localdomain podman[304257]: 2025-12-06 10:09:44.590958904 +0000 UTC m=+0.154813549 container init 43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_bell, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:09:44 np0005548788.localdomain podman[304257]: 2025-12-06 10:09:44.600565909 +0000 UTC m=+0.164420564 container start 43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_bell, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-type=git, com.redhat.component=rhceph-container, release=1763362218, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:09:44 np0005548788.localdomain podman[304257]: 2025-12-06 10:09:44.600809107 +0000 UTC m=+0.164663762 container attach 43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_bell, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:09:44 np0005548788.localdomain nervous_bell[304272]: 167 167
Dec 06 10:09:44 np0005548788.localdomain systemd[1]: libpod-43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da.scope: Deactivated successfully.
Dec 06 10:09:44 np0005548788.localdomain podman[304257]: 2025-12-06 10:09:44.605924234 +0000 UTC m=+0.169778929 container died 43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_bell, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main, vcs-type=git)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:44 np0005548788.localdomain podman[304277]: 2025-12-06 10:09:44.714940274 +0000 UTC m=+0.100436908 container remove 43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_bell, release=1763362218, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True)
Dec 06 10:09:44 np0005548788.localdomain systemd[1]: libpod-conmon-43959b52702201d6cc59f5690c7ee34080cc5d42114c9c58d12421b991c533da.scope: Deactivated successfully.
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain sudo[304222]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='client.44541 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: Reconfig service osd.default_drive_group
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-8f9b0756d67c5a8c5123550d9631861dd70b5b3aa6133492d2141cbd5c000111-merged.mount: Deactivated successfully.
Dec 06 10:09:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:09:47.432 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:09:47.433 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:09:47.433 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.128724) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789128767, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2270, "num_deletes": 252, "total_data_size": 3532235, "memory_usage": 3592104, "flush_reason": "Manual Compaction"}
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789145966, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 2535380, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18028, "largest_seqno": 20293, "table_properties": {"data_size": 2525607, "index_size": 5830, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25518, "raw_average_key_size": 22, "raw_value_size": 2504070, "raw_average_value_size": 2204, "num_data_blocks": 257, "num_entries": 1136, "num_filter_entries": 1136, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015732, "oldest_key_time": 1765015732, "file_creation_time": 1765015789, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 17300 microseconds, and 6071 cpu microseconds.
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.146020) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 2535380 bytes OK
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.146045) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.147811) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.147835) EVENT_LOG_v1 {"time_micros": 1765015789147828, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.147859) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 3521433, prev total WAL file size 3537785, number of live WAL files 2.
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.149170) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(2475KB)], [30(16MB)]
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789149250, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 19425382, "oldest_snapshot_seqno": -1}
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11613 keys, 17625140 bytes, temperature: kUnknown
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789262479, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 17625140, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17557926, "index_size": 37097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29061, "raw_key_size": 311818, "raw_average_key_size": 26, "raw_value_size": 17359053, "raw_average_value_size": 1494, "num_data_blocks": 1411, "num_entries": 11613, "num_filter_entries": 11613, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015789, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.262798) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 17625140 bytes
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.265172) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.4 rd, 155.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 16.1 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(14.6) write-amplify(7.0) OK, records in: 12152, records dropped: 539 output_compression: NoCompression
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.265239) EVENT_LOG_v1 {"time_micros": 1765015789265224, "job": 16, "event": "compaction_finished", "compaction_time_micros": 113328, "compaction_time_cpu_micros": 42987, "output_level": 6, "num_output_files": 1, "total_output_size": 17625140, "num_input_records": 12152, "num_output_records": 11613, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789265738, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789268347, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.149053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.268486) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.268494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.268497) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.268500) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:09:49.268503) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548788.localdomain systemd[1]: tmp-crun.BvfSGu.mount: Deactivated successfully.
Dec 06 10:09:49 np0005548788.localdomain podman[304293]: 2025-12-06 10:09:49.291308387 +0000 UTC m=+0.110586641 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:09:49 np0005548788.localdomain podman[304293]: 2025-12-06 10:09:49.387835463 +0000 UTC m=+0.207113677 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/90840268' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e92 do_prune osdmap full prune enabled
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Activating manager daemon np0005548785.vhqlsq
Dec 06 10:09:49 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 e93: 6 total, 6 up, 6 in
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e93: 6 total, 6 up, 6 in
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/90840268' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e33: np0005548785.vhqlsq(active, starting, since 0.0485672s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548788"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548789"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).mds e16 all = 0
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).mds e16 all = 0
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).mds e16 all = 0
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds metadata"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).mds e16 all = 1
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Manager daemon np0005548785.vhqlsq is now available
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain sshd[299942]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:09:49 np0005548788.localdomain systemd[1]: session-70.scope: Deactivated successfully.
Dec 06 10:09:49 np0005548788.localdomain systemd[1]: session-70.scope: Consumed 25.154s CPU time.
Dec 06 10:09:49 np0005548788.localdomain systemd-logind[765]: Session 70 logged out. Waiting for processes to exit.
Dec 06 10:09:49 np0005548788.localdomain systemd-logind[765]: Removed session 70.
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"}]': finished
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"}]': finished
Dec 06 10:09:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:09:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548785.vhqlsq/mirror_snapshot_schedule"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548785.vhqlsq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:09:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548785.vhqlsq/trash_purge_schedule"} v 0)
Dec 06 10:09:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548785.vhqlsq/trash_purge_schedule"} : dispatch
Dec 06 10:09:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:09:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18220 "" "Go-http-client/1.1"
Dec 06 10:09:49 np0005548788.localdomain sshd[304320]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:09:49 np0005548788.localdomain sshd[304320]: Accepted publickey for ceph-admin from 192.168.122.103 port 58340 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:09:49 np0005548788.localdomain systemd-logind[765]: New session 71 of user ceph-admin.
Dec 06 10:09:49 np0005548788.localdomain systemd[1]: Started Session 71 of User ceph-admin.
Dec 06 10:09:49 np0005548788.localdomain sshd[304320]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:09:49 np0005548788.localdomain sudo[304324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:50 np0005548788.localdomain sudo[304324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:50 np0005548788.localdomain sudo[304324]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:50 np0005548788.localdomain sudo[304342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:09:50 np0005548788.localdomain sudo[304342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/90840268' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: Activating manager daemon np0005548785.vhqlsq
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: osdmap e93: 6 total, 6 up, 6 in
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/90840268' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: mgrmap e33: np0005548785.vhqlsq(active, starting, since 0.0485672s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: Manager daemon np0005548785.vhqlsq is now available
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: removing stray HostCache host record np0005548787.localdomain.devices.0
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"}]': finished
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"}]': finished
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548785.vhqlsq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548785.vhqlsq/trash_purge_schedule"} : dispatch
Dec 06 10:09:50 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e34: np0005548785.vhqlsq(active, since 1.08564s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:50 np0005548788.localdomain sshd[304050]: Received disconnect from 45.78.194.186 port 57738:11: Bye Bye [preauth]
Dec 06 10:09:50 np0005548788.localdomain sshd[304050]: Disconnected from authenticating user root 45.78.194.186 port 57738 [preauth]
Dec 06 10:09:50 np0005548788.localdomain podman[304432]: 2025-12-06 10:09:50.933354774 +0000 UTC m=+0.099760278 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.openshift.expose-services=, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, version=7, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container)
Dec 06 10:09:51 np0005548788.localdomain podman[304432]: 2025-12-06 10:09:51.035370509 +0000 UTC m=+0.201776013 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: mgrmap e34: np0005548785.vhqlsq(active, since 1.08564s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:09:50] ENGINE Bus STARTING
Dec 06 10:09:51 np0005548788.localdomain sudo[304342]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:51 np0005548788.localdomain sudo[304552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:51 np0005548788.localdomain sudo[304552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:51 np0005548788.localdomain sudo[304552]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:51 np0005548788.localdomain sudo[304570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:09:51 np0005548788.localdomain sudo[304570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:52 np0005548788.localdomain sudo[304570]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548788.localdomain sudo[304619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:52 np0005548788.localdomain sudo[304619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:52 np0005548788.localdomain sudo[304619]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:52 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e35: np0005548785.vhqlsq(active, since 3s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:52 np0005548788.localdomain sudo[304637]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:09:52 np0005548788.localdomain sudo[304637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548788.localdomain sudo[304637]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain sudo[304675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:09:53 np0005548788.localdomain sudo[304675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:09:53 np0005548788.localdomain sudo[304675]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:09:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:09:53 np0005548788.localdomain sudo[304696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:09:53 np0005548788.localdomain sudo[304696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548788.localdomain sudo[304696]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548788.localdomain podman[304695]: 2025-12-06 10:09:53.612422223 +0000 UTC m=+0.089736859 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:09:53 np0005548788.localdomain podman[304694]: 2025-12-06 10:09:53.666395102 +0000 UTC m=+0.145920445 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:09:53 np0005548788.localdomain sudo[304745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:53 np0005548788.localdomain sudo[304745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548788.localdomain sudo[304745]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548788.localdomain podman[304694]: 2025-12-06 10:09:53.678502494 +0000 UTC m=+0.158027847 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:09:53 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:09:53 np0005548788.localdomain podman[304695]: 2025-12-06 10:09:53.729549173 +0000 UTC m=+0.206863879 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Dec 06 10:09:53 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:09:53 np0005548788.localdomain sudo[304777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:53 np0005548788.localdomain sudo[304777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548788.localdomain sudo[304777]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: mgrmap e35: np0005548785.vhqlsq(active, since 3s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:53 np0005548788.localdomain podman[304693]: 2025-12-06 10:09:53.832875299 +0000 UTC m=+0.312556317 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:53 np0005548788.localdomain podman[304693]: 2025-12-06 10:09:53.844185476 +0000 UTC m=+0.323866474 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:09:53 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:09:53 np0005548788.localdomain sudo[304795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:53 np0005548788.localdomain sudo[304795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548788.localdomain sudo[304795]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548788.localdomain sudo[304839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:53 np0005548788.localdomain sudo[304839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304839]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[304857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:54 np0005548788.localdomain sudo[304857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304857]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[304875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:09:54 np0005548788.localdomain sudo[304875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304875]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[304893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:54 np0005548788.localdomain sudo[304893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304893]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[304911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:54 np0005548788.localdomain sudo[304911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304911]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[304929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:54 np0005548788.localdomain sudo[304929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304929]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[304947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:54 np0005548788.localdomain sudo[304947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304947]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[304965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:54 np0005548788.localdomain sudo[304965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304965]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : Standby manager daemon np0005548789.mzhmje started
Dec 06 10:09:54 np0005548788.localdomain sudo[304999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:54 np0005548788.localdomain sudo[304999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[304999]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[305017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:54 np0005548788.localdomain sudo[305017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[305017]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain sudo[305035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:54 np0005548788.localdomain sudo[305035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[305035]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e36: np0005548785.vhqlsq(active, since 5s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} v 0)
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:54 np0005548788.localdomain ceph-mon[293643]: Standby manager daemon np0005548789.mzhmje started
Dec 06 10:09:54 np0005548788.localdomain sudo[305053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:09:54 np0005548788.localdomain sudo[305053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548788.localdomain sudo[305053]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:09:55 np0005548788.localdomain sudo[305071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305071]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548788.localdomain sudo[305089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305089]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:55 np0005548788.localdomain sudo[305107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305107]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548788.localdomain sudo[305125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305125]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548788.localdomain sudo[305159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305159]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548788.localdomain sudo[305177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305177]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:09:55 np0005548788.localdomain sudo[305195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305195]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:55 np0005548788.localdomain sudo[305213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305213]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:55 np0005548788.localdomain sudo[305231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305231]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548788.localdomain sudo[305249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305249]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:55 np0005548788.localdomain sudo[305267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305267]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain sudo[305285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548788.localdomain sudo[305285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305285]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:55 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:55 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:55 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:09:55 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:09:55 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:09:55 np0005548788.localdomain ceph-mon[293643]: mgrmap e36: np0005548785.vhqlsq(active, since 5s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:09:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:09:55 np0005548788.localdomain sudo[305319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548788.localdomain sudo[305319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548788.localdomain sudo[305319]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:56 np0005548788.localdomain sudo[305337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:09:56 np0005548788.localdomain sudo[305337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:56 np0005548788.localdomain sudo[305337]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain sudo[305355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:09:56 np0005548788.localdomain sudo[305355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:56 np0005548788.localdomain sudo[305355]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:56 np0005548788.localdomain sudo[305373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:56 np0005548788.localdomain sudo[305373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:56 np0005548788.localdomain sudo[305373]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:56 np0005548788.localdomain sudo[305391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:56 np0005548788.localdomain sudo[305391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:56 np0005548788.localdomain sudo[305391]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:56 np0005548788.localdomain sudo[305409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:56 np0005548788.localdomain sudo[305409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:57 np0005548788.localdomain podman[305443]: 
Dec 06 10:09:57 np0005548788.localdomain podman[305443]: 2025-12-06 10:09:57.163867084 +0000 UTC m=+0.078311988 container create 4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_diffie, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, version=7, name=rhceph, distribution-scope=public, build-date=2025-11-26T19:44:28Z, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 06 10:09:57 np0005548788.localdomain systemd[1]: Started libpod-conmon-4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68.scope.
Dec 06 10:09:57 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:57 np0005548788.localdomain podman[305443]: 2025-12-06 10:09:57.131802599 +0000 UTC m=+0.046247543 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:57 np0005548788.localdomain podman[305443]: 2025-12-06 10:09:57.249003802 +0000 UTC m=+0.163448716 container init 4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_diffie, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=)
Dec 06 10:09:57 np0005548788.localdomain podman[305443]: 2025-12-06 10:09:57.260127452 +0000 UTC m=+0.174572366 container start 4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_diffie, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.41.4)
Dec 06 10:09:57 np0005548788.localdomain podman[305443]: 2025-12-06 10:09:57.261268208 +0000 UTC m=+0.175713152 container attach 4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_diffie, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_BRANCH=main, version=7, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64)
Dec 06 10:09:57 np0005548788.localdomain romantic_diffie[305458]: 167 167
Dec 06 10:09:57 np0005548788.localdomain systemd[1]: libpod-4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68.scope: Deactivated successfully.
Dec 06 10:09:57 np0005548788.localdomain podman[305443]: 2025-12-06 10:09:57.266015084 +0000 UTC m=+0.180459958 container died 4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_diffie, GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=1763362218, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 06 10:09:57 np0005548788.localdomain podman[305463]: 2025-12-06 10:09:57.37000526 +0000 UTC m=+0.089822781 container remove 4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_diffie, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:09:57 np0005548788.localdomain systemd[1]: libpod-conmon-4c948cc07542338d70a681eb542cf32fe82eefa425f53e07d6a199670ac5fd68.scope: Deactivated successfully.
Dec 06 10:09:57 np0005548788.localdomain sudo[305409]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:57 np0005548788.localdomain sudo[305487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:57 np0005548788.localdomain sudo[305487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:57 np0005548788.localdomain sudo[305487]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:57 np0005548788.localdomain sudo[305505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:57 np0005548788.localdomain sudo[305505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:09:56] ENGINE Error in 'start' listener <bound method Server.start of <cephadm.service_discovery.Root object at 0x7fef2d81f340>>
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 230, in publish
                                                               output.append(listener(*args, **kwargs))
                                                             File "/lib/python3.9/site-packages/cherrypy/_cpserver.py", line 180, in start
                                                               super(Server, self).start()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 184, in start
                                                               self.wait()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 260, in wait
                                                               portend.occupied(*self.bound_addr, timeout=Timeouts.occupied)
                                                             File "/lib/python3.9/site-packages/portend.py", line 162, in occupied
                                                               raise Timeout("Port {port} not bound on {host}.".format(**locals()))
                                                           portend.Timeout: Port 8765 not bound on 172.18.0.103.
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-baf96ac0b54009ab4f136d30d21d9b4d8c45dcaca9bbed4c12ed5c0e15dcfdfa-merged.mount: Deactivated successfully.
Dec 06 10:09:58 np0005548788.localdomain podman[305540]: 
Dec 06 10:09:58 np0005548788.localdomain podman[305540]: 2025-12-06 10:09:58.210937666 +0000 UTC m=+0.068496667 container create fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_feynman, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, RELEASE=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:58 np0005548788.localdomain systemd[1]: Started libpod-conmon-fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551.scope.
Dec 06 10:09:58 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:58 np0005548788.localdomain podman[305540]: 2025-12-06 10:09:58.276123859 +0000 UTC m=+0.133682860 container init fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_feynman, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=1763362218, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, version=7, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2025-11-26T19:44:28Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:09:58 np0005548788.localdomain podman[305540]: 2025-12-06 10:09:58.282262327 +0000 UTC m=+0.139821328 container start fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_feynman, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, RELEASE=main, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:09:58 np0005548788.localdomain podman[305540]: 2025-12-06 10:09:58.282555656 +0000 UTC m=+0.140114697 container attach fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_feynman, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:09:58 np0005548788.localdomain awesome_feynman[305555]: 167 167
Dec 06 10:09:58 np0005548788.localdomain systemd[1]: libpod-fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551.scope: Deactivated successfully.
Dec 06 10:09:58 np0005548788.localdomain podman[305540]: 2025-12-06 10:09:58.284530857 +0000 UTC m=+0.142089838 container died fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_feynman, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 06 10:09:58 np0005548788.localdomain podman[305540]: 2025-12-06 10:09:58.188942979 +0000 UTC m=+0.046501960 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:58 np0005548788.localdomain podman[305560]: 2025-12-06 10:09:58.37215214 +0000 UTC m=+0.070167178 container remove fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_feynman, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main)
Dec 06 10:09:58 np0005548788.localdomain systemd[1]: libpod-conmon-fb907e241664e8ebfeaafeb98a8dd6d7101fea6d7046a7a6be02874fb2012551.scope: Deactivated successfully.
Dec 06 10:09:58 np0005548788.localdomain sudo[305505]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:59 np0005548788.localdomain systemd[1]: tmp-crun.wkNulw.mount: Deactivated successfully.
Dec 06 10:09:59 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b790a9629c5a102f8117aae118d6d401d76dd536b48baaae883423567f4a4877-merged.mount: Deactivated successfully.
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:10:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 06 10:10:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:02 np0005548788.localdomain sudo[305585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:10:02 np0005548788.localdomain sudo[305585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:02 np0005548788.localdomain sudo[305585]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:01] ENGINE Error in 'start' listener <bound method Server.start of <cephadm.agent.HostData object at 0x7fefc03cbf40>>
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 230, in publish
                                                               output.append(listener(*args, **kwargs))
                                                             File "/lib/python3.9/site-packages/cherrypy/_cpserver.py", line 180, in start
                                                               super(Server, self).start()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 184, in start
                                                               self.wait()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 260, in wait
                                                               portend.occupied(*self.bound_addr, timeout=Timeouts.occupied)
                                                             File "/lib/python3.9/site-packages/portend.py", line 162, in occupied
                                                               raise Timeout("Port {port} not bound on {host}.".format(**locals()))
                                                           portend.Timeout: Port 7150 not bound on 172.18.0.103.
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:01] ENGINE Shutting down due to error in start listener:
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 268, in start
                                                               self.publish('start')
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 248, in publish
                                                               raise exc
                                                           cherrypy.process.wspbus.ChannelFailures: Timeout('Port 8765 not bound on 172.18.0.103.')
                                                           Timeout('Port 7150 not bound on 172.18.0.103.')
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:01] ENGINE Bus STOPPING
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:01] ENGINE HTTP Server cherrypy._cpwsgi_server.CPWSGIServer(('172.18.0.103', 8765)) already shut down
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:01] ENGINE HTTP Server cherrypy._cpwsgi_server.CPWSGIServer(('172.18.0.103', 7150)) already shut down
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:01] ENGINE Bus STOPPED
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:01] ENGINE Bus EXITING
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:01] ENGINE Bus EXITED
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: Failed to run cephadm http server: Timeout('Port 8765 not bound on 172.18.0.103.')
                                                           Timeout('Port 7150 not bound on 172.18.0.103.')
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:10:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:10:05 np0005548788.localdomain podman[305603]: 2025-12-06 10:10:05.25533601 +0000 UTC m=+0.084102966 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:10:05 np0005548788.localdomain podman[305603]: 2025-12-06 10:10:05.292642137 +0000 UTC m=+0.121409083 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 06 10:10:05 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:10:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:06 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e37: np0005548785.vhqlsq(active, since 17s), standbys: np0005548788.yvwbqq, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:07 np0005548788.localdomain ceph-mon[293643]: mgrmap e37: np0005548785.vhqlsq(active, since 17s), standbys: np0005548788.yvwbqq, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:10:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:10:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:10:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:10:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:10:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:10:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:10:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:10:09 np0005548788.localdomain systemd[298457]: Starting Mark boot as successful...
Dec 06 10:10:09 np0005548788.localdomain podman[305623]: 2025-12-06 10:10:09.258297028 +0000 UTC m=+0.085594411 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:10:09 np0005548788.localdomain systemd[298457]: Finished Mark boot as successful.
Dec 06 10:10:09 np0005548788.localdomain podman[305623]: 2025-12-06 10:10:09.271979848 +0000 UTC m=+0.099277241 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:10:09 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:10:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:10:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5621 writes, 24K keys, 5621 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5621 writes, 913 syncs, 6.16 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 203 writes, 505 keys, 203 commit groups, 1.0 writes per commit group, ingest: 0.50 MB, 0.00 MB/s
                                                          Interval WAL: 203 writes, 91 syncs, 2.23 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:10:10 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health check update: 2 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:10:10 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health check update: 2 stray host(s) with 2 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:10:10 np0005548788.localdomain ceph-mon[293643]: Health check update: 2 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:10:10 np0005548788.localdomain ceph-mon[293643]: Health check update: 2 stray host(s) with 2 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:10:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:10:11 np0005548788.localdomain podman[305647]: 2025-12-06 10:10:11.249264 +0000 UTC m=+0.075315636 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:10:11 np0005548788.localdomain podman[305647]: 2025-12-06 10:10:11.256770111 +0000 UTC m=+0.082821737 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:10:11 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:10:11 np0005548788.localdomain ceph-mon[293643]: pgmap v3: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:13 np0005548788.localdomain ceph-mon[293643]: pgmap v4: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:10:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.2 total, 600.0 interval
                                                          Cumulative writes: 5387 writes, 23K keys, 5387 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5387 writes, 687 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 43 writes, 135 keys, 43 commit groups, 1.0 writes per commit group, ingest: 0.21 MB, 0.00 MB/s
                                                          Interval WAL: 43 writes, 21 syncs, 2.05 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:10:15 np0005548788.localdomain ceph-mon[293643]: pgmap v5: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:17 np0005548788.localdomain ceph-mon[293643]: pgmap v6: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:10:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:10:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:10:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:10:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:10:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18222 "" "Go-http-client/1.1"
Dec 06 10:10:19 np0005548788.localdomain ceph-mon[293643]: pgmap v7: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:10:20 np0005548788.localdomain podman[305665]: 2025-12-06 10:10:20.268763477 +0000 UTC m=+0.090445912 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:10:20 np0005548788.localdomain podman[305665]: 2025-12-06 10:10:20.358597278 +0000 UTC m=+0.180279693 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 06 10:10:20 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:10:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:21 np0005548788.localdomain ceph-mon[293643]: pgmap v8: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:23 np0005548788.localdomain ceph-mon[293643]: pgmap v9: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:10:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:10:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:10:24 np0005548788.localdomain podman[305692]: 2025-12-06 10:10:24.269368963 +0000 UTC m=+0.090689489 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:10:24 np0005548788.localdomain podman[305692]: 2025-12-06 10:10:24.306949368 +0000 UTC m=+0.128269844 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:10:24 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:10:24 np0005548788.localdomain podman[305693]: 2025-12-06 10:10:24.308273099 +0000 UTC m=+0.125850209 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Dec 06 10:10:24 np0005548788.localdomain podman[305691]: 2025-12-06 10:10:24.368660425 +0000 UTC m=+0.189739353 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:10:24 np0005548788.localdomain podman[305691]: 2025-12-06 10:10:24.382558392 +0000 UTC m=+0.203637320 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:10:24 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:10:24 np0005548788.localdomain podman[305693]: 2025-12-06 10:10:24.395703096 +0000 UTC m=+0.213280256 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:10:24 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:10:25 np0005548788.localdomain ceph-mon[293643]: pgmap v10: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:27 np0005548788.localdomain ceph-mon[293643]: pgmap v11: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:28 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3108344624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:29 np0005548788.localdomain ceph-mon[293643]: pgmap v12: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3639889960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:31.273 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:31.273 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:10:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:31 np0005548788.localdomain ceph-mon[293643]: pgmap v13: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:33.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:33.002 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:33.026 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:33.026 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:33.026 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:33.027 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:33 np0005548788.localdomain ceph-mon[293643]: pgmap v14: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:34.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:34 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4114425361' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:36 np0005548788.localdomain ceph-mon[293643]: pgmap v15: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:36 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1769226532' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:10:36 np0005548788.localdomain podman[305751]: 2025-12-06 10:10:36.274189674 +0000 UTC m=+0.101471280 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:10:36 np0005548788.localdomain podman[305751]: 2025-12-06 10:10:36.315688599 +0000 UTC m=+0.142970245 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:10:36 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:10:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:37.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:37.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:10:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:37.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:10:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:37.022 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3346912753' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e93 do_prune osdmap full prune enabled
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Activating manager daemon np0005548788.yvwbqq
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 e94: 6 total, 6 up, 6 in
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: mgr handle_mgr_map Activating!
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: mgr handle_mgr_map I am now activating
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e94: 6 total, 6 up, 6 in
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3346912753' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e38: np0005548788.yvwbqq(active, starting, since 0.0607571s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548788"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548789"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).mds e16 all = 0
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).mds e16 all = 0
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).mds e16 all = 0
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds metadata"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).mds e16 all = 1
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon metadata"} v 0)
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: balancer
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Starting
Dec 06 10:10:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Manager daemon np0005548788.yvwbqq is now available
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Optimize plan auto_2025-12-06_10:10:37
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 06 10:10:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Dec 06 10:10:37 np0005548788.localdomain sshd[304320]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:10:37 np0005548788.localdomain systemd[1]: session-71.scope: Deactivated successfully.
Dec 06 10:10:37 np0005548788.localdomain systemd[1]: session-71.scope: Consumed 7.743s CPU time.
Dec 06 10:10:37 np0005548788.localdomain systemd-logind[765]: Session 71 logged out. Waiting for processes to exit.
Dec 06 10:10:38 np0005548788.localdomain systemd-logind[765]: Removed session 71.
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: cephadm
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: crash
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: devicehealth
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: iostat
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [devicehealth INFO root] Starting
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: nfs
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: orchestrator
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: pg_autoscaler
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: progress
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] _maybe_adjust
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.028 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.029 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.029 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.029 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.030 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: pgmap v16: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/3346912753' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: Activating manager daemon np0005548788.yvwbqq
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: osdmap e94: 6 total, 6 up, 6 in
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/3346912753' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: mgrmap e38: np0005548788.yvwbqq(active, starting, since 0.0607571s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: Manager daemon np0005548788.yvwbqq is now available
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Loading...
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7f6831571f10>, <progress.module.GhostEvent object at 0x7f6831571f40>, <progress.module.GhostEvent object at 0x7f6831571f70>, <progress.module.GhostEvent object at 0x7f6831571fa0>, <progress.module.GhostEvent object at 0x7f6831571fd0>, <progress.module.GhostEvent object at 0x7f6831583040>, <progress.module.GhostEvent object at 0x7f6831583070>, <progress.module.GhostEvent object at 0x7f68315830a0>, <progress.module.GhostEvent object at 0x7f68315830d0>, <progress.module.GhostEvent object at 0x7f6831583100>, <progress.module.GhostEvent object at 0x7f6831583130>, <progress.module.GhostEvent object at 0x7f6831583160>, <progress.module.GhostEvent object at 0x7f6831583190>, <progress.module.GhostEvent object at 0x7f68315831c0>, <progress.module.GhostEvent object at 0x7f68315831f0>, <progress.module.GhostEvent object at 0x7f6831583220>, <progress.module.GhostEvent object at 0x7f6831583250>, <progress.module.GhostEvent object at 0x7f6831583280>, <progress.module.GhostEvent object at 0x7f68315832b0>, <progress.module.GhostEvent object at 0x7f68315832e0>, <progress.module.GhostEvent object at 0x7f6831583310>, <progress.module.GhostEvent object at 0x7f6831583340>, <progress.module.GhostEvent object at 0x7f6831583370>, <progress.module.GhostEvent object at 0x7f68315833a0>, <progress.module.GhostEvent object at 0x7f68315833d0>, <progress.module.GhostEvent object at 0x7f6831583400>, <progress.module.GhostEvent object at 0x7f6831583430>, <progress.module.GhostEvent object at 0x7f6831583460>, <progress.module.GhostEvent object at 0x7f6831583490>, <progress.module.GhostEvent object at 0x7f68315834c0>, <progress.module.GhostEvent object at 0x7f68315834f0>, <progress.module.GhostEvent object at 0x7f6831583520>, <progress.module.GhostEvent object at 0x7f6831583550>, <progress.module.GhostEvent object at 0x7f6831583580>, <progress.module.GhostEvent object at 0x7f68315835b0>, <progress.module.GhostEvent object at 0x7f68315835e0>, <progress.module.GhostEvent object at 0x7f6831583610>, <progress.module.GhostEvent object at 0x7f6831583640>, <progress.module.GhostEvent object at 0x7f6831583670>, <progress.module.GhostEvent object at 0x7f68315836a0>, <progress.module.GhostEvent object at 0x7f68315836d0>, <progress.module.GhostEvent object at 0x7f6831583700>, <progress.module.GhostEvent object at 0x7f6831583730>, <progress.module.GhostEvent object at 0x7f6831583760>, <progress.module.GhostEvent object at 0x7f6831583790>, <progress.module.GhostEvent object at 0x7f68315837c0>, <progress.module.GhostEvent object at 0x7f68315837f0>, <progress.module.GhostEvent object at 0x7f6831583820>, <progress.module.GhostEvent object at 0x7f6831583850>, <progress.module.GhostEvent object at 0x7f6831583880>] historic events
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Loaded OSDMap, ready.
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] recovery thread starting
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] starting setup
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: rbd_support
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: restful
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [restful INFO root] server_addr: :: server_port: 8003
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: status
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: telemetry
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [restful WARNING root] server not running: no certificate configured
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/mirror_snapshot_schedule"} v 0)
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] PerfHandler: starting
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_task_task: vms, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_task_task: volumes, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: mgr load Constructed class from module: volumes
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.104+0000 7f6820cf7640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.104+0000 7f6820cf7640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.104+0000 7f6820cf7640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.104+0000 7f6820cf7640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.104+0000 7f6820cf7640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.110+0000 7f681ccaf640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.110+0000 7f681ccaf640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.110+0000 7f681ccaf640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.110+0000 7f681ccaf640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:10:38.110+0000 7f681ccaf640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_task_task: images, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_task_task: backups, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] TaskHandler: starting
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/trash_purge_schedule"} v 0)
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/trash_purge_schedule"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] setup complete
Dec 06 10:10:38 np0005548788.localdomain sshd[305929]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:10:38 np0005548788.localdomain sshd[305929]: Accepted publickey for ceph-admin from 192.168.122.106 port 51702 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:10:38 np0005548788.localdomain systemd-logind[765]: New session 72 of user ceph-admin.
Dec 06 10:10:38 np0005548788.localdomain systemd[1]: Started Session 72 of User ceph-admin.
Dec 06 10:10:38 np0005548788.localdomain sshd[305929]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3616421039' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.456 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:10:38 np0005548788.localdomain sudo[305933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:38 np0005548788.localdomain sudo[305933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:38 np0005548788.localdomain sudo[305933]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:38 np0005548788.localdomain sudo[305953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:10:38 np0005548788.localdomain sudo[305953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.619 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.620 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12326MB free_disk=0.0GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.620 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.620 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.687 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.687 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=0GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:10:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:38.708 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:10:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:10:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:10:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:10:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:10:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:10:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:10:38 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e39: np0005548788.yvwbqq(active, since 1.0804s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.44592 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:10:38 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/trash_purge_schedule"} : dispatch
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3616421039' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: mgrmap e39: np0005548788.yvwbqq(active, since 1.0804s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/261091109' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/261091109' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/212188230' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:39.202 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:10:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:39.209 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:10:39] ENGINE Bus STARTING
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:10:39] ENGINE Bus STARTING
Dec 06 10:10:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:39.326 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updated inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 06 10:10:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:39.326 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 06 10:10:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:39.327 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:10:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:39.355 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:10:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:10:39.356 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:10:39] ENGINE Serving on http://172.18.0.106:8765
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:10:39] ENGINE Serving on http://172.18.0.106:8765
Dec 06 10:10:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:10:39 np0005548788.localdomain podman[306074]: 2025-12-06 10:10:39.498092019 +0000 UTC m=+0.108659350 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=rhceph, build-date=2025-11-26T19:44:28Z, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=1763362218, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:10:39] ENGINE Serving on https://172.18.0.106:7150
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:10:39] ENGINE Serving on https://172.18.0.106:7150
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:10:39] ENGINE Bus STARTED
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:10:39] ENGINE Bus STARTED
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:10:39] ENGINE Client ('172.18.0.106', 48356) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:10:39] ENGINE Client ('172.18.0.106', 48356) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:10:39 np0005548788.localdomain systemd[1]: tmp-crun.a3UuJA.mount: Deactivated successfully.
Dec 06 10:10:39 np0005548788.localdomain podman[306105]: 2025-12-06 10:10:39.590309284 +0000 UTC m=+0.082850768 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.44646 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:10:39 np0005548788.localdomain podman[306074]: 2025-12-06 10:10:39.61459445 +0000 UTC m=+0.225161731 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:10:39 np0005548788.localdomain podman[306105]: 2025-12-06 10:10:39.670967923 +0000 UTC m=+0.163509367 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:10:39 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 2 stray daemon(s) not managed by cephadm)
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 2 stray host(s) with 2 daemon(s) not managed by cephadm)
Dec 06 10:10:39 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec 06 10:10:39 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/212188230' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:39] ENGINE Bus STARTING
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:39] ENGINE Serving on http://172.18.0.106:8765
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:39] ENGINE Serving on https://172.18.0.106:7150
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:39] ENGINE Bus STARTED
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:10:39] ENGINE Client ('172.18.0.106', 48356) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 2 stray daemon(s) not managed by cephadm)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: Health check cleared: CEPHADM_STRAY_HOST (was: 2 stray host(s) with 2 daemon(s) not managed by cephadm)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: Cluster is now healthy
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:40 np0005548788.localdomain ceph-mgr[286998]: [devicehealth INFO root] Check health
Dec 06 10:10:40 np0005548788.localdomain sudo[305953]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:40 np0005548788.localdomain sudo[306238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:40 np0005548788.localdomain sudo[306238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:40 np0005548788.localdomain sudo[306238]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:40 np0005548788.localdomain sudo[306256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:10:40 np0005548788.localdomain sudo[306256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:40 np0005548788.localdomain sudo[306256]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: from='client.44646 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain sudo[306305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:41 np0005548788.localdomain sudo[306305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:41 np0005548788.localdomain sudo[306305]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:41 np0005548788.localdomain sudo[306323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:10:41 np0005548788.localdomain sudo[306323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e40: np0005548788.yvwbqq(active, since 3s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:10:41 np0005548788.localdomain sudo[306323]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.44655 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO root] Saving service mon spec with placement label:mon
Dec 06 10:10:41 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 06 10:10:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548788.localdomain sudo[306359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:10:42 np0005548788.localdomain sudo[306359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:10:42 np0005548788.localdomain sudo[306359]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain sudo[306378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:10:42 np0005548788.localdomain sudo[306378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306378]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain podman[306377]: 2025-12-06 10:10:42.124713077 +0000 UTC m=+0.083456376 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:10:42 np0005548788.localdomain podman[306377]: 2025-12-06 10:10:42.159660341 +0000 UTC m=+0.118403690 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:10:42 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:10:42 np0005548788.localdomain sudo[306411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:10:42 np0005548788.localdomain sudo[306411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306411]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain sudo[306431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:42 np0005548788.localdomain sudo[306431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306431]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain sudo[306449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:10:42 np0005548788.localdomain sudo[306449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306449]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: mgrmap e40: np0005548788.yvwbqq(active, since 3s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548788.localdomain sudo[306483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:10:42 np0005548788.localdomain sudo[306483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306483]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain sudo[306501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:10:42 np0005548788.localdomain sudo[306501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306501]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:42 np0005548788.localdomain sudo[306519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:10:42 np0005548788.localdomain sudo[306519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306519]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:42 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:42 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:42 np0005548788.localdomain sudo[306537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:10:42 np0005548788.localdomain sudo[306537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306537]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain sudo[306555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:10:42 np0005548788.localdomain sudo[306555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306555]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain sudo[306573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:10:42 np0005548788.localdomain sudo[306573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306573]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548788.localdomain sudo[306591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:42 np0005548788.localdomain sudo[306591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548788.localdomain sudo[306591]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain sudo[306609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:10:43 np0005548788.localdomain sudo[306609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306609]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : Standby manager daemon np0005548785.vhqlsq started
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: mgr.server handle_open ignoring open from mgr.np0005548785.vhqlsq 172.18.0.200:0/2158312331; not ready for session (expect reconnect)
Dec 06 10:10:43 np0005548788.localdomain sudo[306643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:10:43 np0005548788.localdomain sudo[306643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306643]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548788.localdomain sudo[306661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:10:43 np0005548788.localdomain sudo[306661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306661]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain sudo[306679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:43 np0005548788.localdomain sudo[306679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306679]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548788.localdomain sudo[306697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:10:43 np0005548788.localdomain sudo[306697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306697]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: from='client.44655 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Saving service mon spec with placement label:mon
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: Standby manager daemon np0005548785.vhqlsq started
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.44658 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548790", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:10:43 np0005548788.localdomain sudo[306715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:10:43 np0005548788.localdomain sudo[306715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306715]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain sudo[306733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:10:43 np0005548788.localdomain sudo[306733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306733]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain sudo[306751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:43 np0005548788.localdomain sudo[306751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306751]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain sudo[306769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:10:43 np0005548788.localdomain sudo[306769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306769]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548788.localdomain sudo[306803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:10:43 np0005548788.localdomain sudo[306803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306803]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain sudo[306821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:10:43 np0005548788.localdomain sudo[306821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548788.localdomain sudo[306821]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e41: np0005548788.yvwbqq(active, since 6s), standbys: np0005548790.kvkfyr, np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} v 0)
Dec 06 10:10:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:10:44 np0005548788.localdomain sudo[306839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain sudo[306839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[306839]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain sudo[306857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:10:44 np0005548788.localdomain sudo[306857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[306857]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain sudo[306875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:10:44 np0005548788.localdomain sudo[306875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[306875]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain sudo[306893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:10:44 np0005548788.localdomain sudo[306893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[306893]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain sudo[306911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:44 np0005548788.localdomain sudo[306911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[306911]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain sudo[306929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:10:44 np0005548788.localdomain sudo[306929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[306929]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: from='client.44658 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548790", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mgrmap e41: np0005548788.yvwbqq(active, since 6s), standbys: np0005548790.kvkfyr, np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:10:44 np0005548788.localdomain sudo[306963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:10:44 np0005548788.localdomain sudo[306963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:44 np0005548788.localdomain sudo[306963]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain sudo[306981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:10:44 np0005548788.localdomain sudo[306981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[306981]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain sudo[306999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548788.localdomain sudo[306999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[306999]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:10:44 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 0 B/s wr, 19 op/s
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:44 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] update: starting ev 123b2785-d34c-4e78-9554-0c361c77a234 (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:10:44 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] complete: finished ev 123b2785-d34c-4e78-9554-0c361c77a234 (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:10:44 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Completed event 123b2785-d34c-4e78-9554-0c361c77a234 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:44 np0005548788.localdomain sudo[307017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:10:44 np0005548788.localdomain sudo[307017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548788.localdomain sudo[307017]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:10:44 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:10:45 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:10:45 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:10:45 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:10:45 np0005548788.localdomain sudo[307035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:45 np0005548788.localdomain sudo[307035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:45 np0005548788.localdomain sudo[307035]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:45 np0005548788.localdomain sudo[307053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:45 np0005548788.localdomain sudo[307053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:45 np0005548788.localdomain podman[307086]: 
Dec 06 10:10:45 np0005548788.localdomain podman[307086]: 2025-12-06 10:10:45.662062867 +0000 UTC m=+0.078496064 container create 77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_benz, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:10:45 np0005548788.localdomain systemd[1]: Started libpod-conmon-77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8.scope.
Dec 06 10:10:45 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:10:45 np0005548788.localdomain podman[307086]: 2025-12-06 10:10:45.63046972 +0000 UTC m=+0.046902967 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:10:45 np0005548788.localdomain podman[307086]: 2025-12-06 10:10:45.740484167 +0000 UTC m=+0.156917384 container init 77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_benz, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:10:45 np0005548788.localdomain podman[307086]: 2025-12-06 10:10:45.75100585 +0000 UTC m=+0.167439067 container start 77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_benz, version=7, ceph=True, RELEASE=main, GIT_BRANCH=main, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1763362218)
Dec 06 10:10:45 np0005548788.localdomain podman[307086]: 2025-12-06 10:10:45.751281928 +0000 UTC m=+0.167715205 container attach 77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_benz, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64)
Dec 06 10:10:45 np0005548788.localdomain nice_benz[307102]: 167 167
Dec 06 10:10:45 np0005548788.localdomain systemd[1]: libpod-77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8.scope: Deactivated successfully.
Dec 06 10:10:45 np0005548788.localdomain podman[307086]: 2025-12-06 10:10:45.757385055 +0000 UTC m=+0.173818312 container died 77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_benz, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, RELEASE=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, release=1763362218, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Dec 06 10:10:45 np0005548788.localdomain podman[307107]: 2025-12-06 10:10:45.856349504 +0000 UTC m=+0.089665855 container remove 77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_benz, version=7, ceph=True, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, release=1763362218, io.openshift.expose-services=, distribution-scope=public)
Dec 06 10:10:45 np0005548788.localdomain systemd[1]: libpod-conmon-77d35128b2f5994528b0779e1ec4672ace612ea877a8991884aca6d7694369d8.scope: Deactivated successfully.
Dec 06 10:10:45 np0005548788.localdomain sudo[307053]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:10:45 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:10:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:45 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:10:45 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 0 B/s wr, 19 op/s
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-4e7858da184f2b108743fc9efcbd5d397a5ecf5f490b9fe7510f8ae8a265a147-merged.mount: Deactivated successfully.
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:46 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:10:46 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:10:46 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:46 np0005548788.localdomain ceph-mgr[286998]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:10:46 np0005548788.localdomain ceph-mgr[286998]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:10:46 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:10:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:10:47.433 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:10:47.434 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:10:47.435 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:47 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] update: starting ev f8965e17-0a71-4750-8eab-4de39548ea75 (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:10:47 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] complete: finished ev f8965e17-0a71-4750-8eab-4de39548ea75 (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:10:47 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Completed event f8965e17-0a71-4750-8eab-4de39548ea75 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 06 10:10:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:47 np0005548788.localdomain sudo[307125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:10:47 np0005548788.localdomain sudo[307125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:47 np0005548788.localdomain sudo[307125]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:48 np0005548788.localdomain ceph-mgr[286998]: [progress INFO root] Writing back 50 completed events
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:48 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:10:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:10:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:10:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:10:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:10:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:10:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18233 "" "Go-http-client/1.1"
Dec 06 10:10:49 np0005548788.localdomain ceph-mon[293643]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:10:50 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:10:51 np0005548788.localdomain systemd[1]: tmp-crun.a7C42M.mount: Deactivated successfully.
Dec 06 10:10:51 np0005548788.localdomain podman[307143]: 2025-12-06 10:10:51.259906297 +0000 UTC m=+0.087026235 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:10:51 np0005548788.localdomain podman[307143]: 2025-12-06 10:10:51.302633975 +0000 UTC m=+0.129753903 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 06 10:10:51 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:10:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:51 np0005548788.localdomain ceph-mon[293643]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:52 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:53 np0005548788.localdomain ceph-mon[293643]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:54 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:10:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:10:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:10:55 np0005548788.localdomain podman[307169]: 2025-12-06 10:10:55.258359392 +0000 UTC m=+0.084535129 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:10:55 np0005548788.localdomain systemd[1]: tmp-crun.CiLfy7.mount: Deactivated successfully.
Dec 06 10:10:55 np0005548788.localdomain podman[307170]: 2025-12-06 10:10:55.301238904 +0000 UTC m=+0.123839751 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:10:55 np0005548788.localdomain podman[307169]: 2025-12-06 10:10:55.305644289 +0000 UTC m=+0.131820056 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 10:10:55 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:10:55 np0005548788.localdomain podman[307170]: 2025-12-06 10:10:55.363688995 +0000 UTC m=+0.186289872 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:10:55 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:10:55 np0005548788.localdomain podman[307172]: 2025-12-06 10:10:55.368653647 +0000 UTC m=+0.181525747 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, name=ubi9-minimal, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:10:55 np0005548788.localdomain podman[307172]: 2025-12-06 10:10:55.45366432 +0000 UTC m=+0.266536400 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, vcs-type=git)
Dec 06 10:10:55 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:10:55 np0005548788.localdomain ceph-mon[293643]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:56 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:57 np0005548788.localdomain ceph-mon[293643]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:58 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:59 np0005548788.localdomain ceph-mon[293643]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:00 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:01 np0005548788.localdomain ceph-mon[293643]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:02 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:03 np0005548788.localdomain ceph-mon[293643]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:04 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:05 np0005548788.localdomain ceph-mon[293643]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:06 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:11:07 np0005548788.localdomain systemd[1]: tmp-crun.QBUN40.mount: Deactivated successfully.
Dec 06 10:11:07 np0005548788.localdomain podman[307231]: 2025-12-06 10:11:07.264829197 +0000 UTC m=+0.091144942 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:11:07 np0005548788.localdomain podman[307231]: 2025-12-06 10:11:07.281715244 +0000 UTC m=+0.108030989 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:11:07 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:11:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548788.localdomain ceph-mon[293643]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:08 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:11:08 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:11:08 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:11:08 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:11:08 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:11:08 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:11:08 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:11:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:11:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:11:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:11:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:11:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:11:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:11:09 np0005548788.localdomain ceph-mon[293643]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:11:10 np0005548788.localdomain podman[307250]: 2025-12-06 10:11:10.26179569 +0000 UTC m=+0.084439017 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:11:10 np0005548788.localdomain podman[307250]: 2025-12-06 10:11:10.275485118 +0000 UTC m=+0.098128465 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:11:10 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:11:10 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:11 np0005548788.localdomain ceph-mon[293643]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:12 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:11:13 np0005548788.localdomain podman[307272]: 2025-12-06 10:11:13.252450628 +0000 UTC m=+0.080012880 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:11:13 np0005548788.localdomain podman[307272]: 2025-12-06 10:11:13.263494646 +0000 UTC m=+0.091056858 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 06 10:11:13 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:11:13 np0005548788.localdomain ceph-mon[293643]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:14 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:15 np0005548788.localdomain ceph-mon[293643]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:16 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:17 np0005548788.localdomain ceph-mon[293643]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:18 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:11:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:11:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:11:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:11:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:11:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18229 "" "Go-http-client/1.1"
Dec 06 10:11:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 06 10:11:19 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/483164750' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:11:19 np0005548788.localdomain ceph-mon[293643]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:19 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/483164750' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:11:20 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:21 np0005548788.localdomain ceph-mon[293643]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:11:22 np0005548788.localdomain podman[307289]: 2025-12-06 10:11:22.263180653 +0000 UTC m=+0.090580664 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 10:11:22 np0005548788.localdomain podman[307289]: 2025-12-06 10:11:22.305761286 +0000 UTC m=+0.133161337 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:11:22 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:11:22 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:23 np0005548788.localdomain ceph-mon[293643]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:24 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.44667 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:11:24 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:24 np0005548788.localdomain ceph-mon[293643]: from='client.44667 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:11:25 np0005548788.localdomain ceph-mon[293643]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:11:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:11:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:11:26 np0005548788.localdomain podman[307315]: 2025-12-06 10:11:26.252752487 +0000 UTC m=+0.076110511 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:11:26 np0005548788.localdomain podman[307315]: 2025-12-06 10:11:26.261146354 +0000 UTC m=+0.084504358 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:11:26 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:11:26 np0005548788.localdomain systemd[1]: tmp-crun.LNRoWa.mount: Deactivated successfully.
Dec 06 10:11:26 np0005548788.localdomain podman[307314]: 2025-12-06 10:11:26.311031101 +0000 UTC m=+0.137323335 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 06 10:11:26 np0005548788.localdomain podman[307316]: 2025-12-06 10:11:26.370587513 +0000 UTC m=+0.186965004 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:11:26 np0005548788.localdomain podman[307316]: 2025-12-06 10:11:26.387620795 +0000 UTC m=+0.203998326 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9-minimal)
Dec 06 10:11:26 np0005548788.localdomain podman[307314]: 2025-12-06 10:11:26.394572867 +0000 UTC m=+0.220865091 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:11:26 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:11:26 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:11:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:26 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:27 np0005548788.localdomain ceph-mon[293643]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:28 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1823885354' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:28 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Dec 06 10:11:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2405516431' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 06 10:11:29 np0005548788.localdomain ceph-mon[293643]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/2405516431' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 06 10:11:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1834841947' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:30 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:31.357 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:31.357 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:11:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:31 np0005548788.localdomain ceph-mon[293643]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:32 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:34.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:34.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:34.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:34 np0005548788.localdomain ceph-mon[293643]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(audit) log [DBG] : from='client.44673 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:11:34 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:35.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:35.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:35.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:35 np0005548788.localdomain ceph-mon[293643]: from='client.44673 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:11:36 np0005548788.localdomain ceph-mon[293643]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:36 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1742337711' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:36 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/246951837' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Optimize plan auto_2025-12-06_10:11:37
Dec 06 10:11:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 06 10:11:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] do_upmap
Dec 06 10:11:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] pools ['backups', 'volumes', 'manila_metadata', '.mgr', 'images', 'manila_data', 'vms']
Dec 06 10:11:37 np0005548788.localdomain ceph-mgr[286998]: [balancer INFO root] prepared 0/10 changes
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.019 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.020 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.035 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.036 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.036 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.037 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.037 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] _maybe_adjust
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0033250017448352874 of space, bias 1.0, pg target 0.6650003489670575 quantized to 32 (current 32)
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:11:38 np0005548788.localdomain ceph-mon[293643]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:11:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:11:38 np0005548788.localdomain podman[307382]: 2025-12-06 10:11:38.264777625 +0000 UTC m=+0.086954722 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 06 10:11:38 np0005548788.localdomain podman[307382]: 2025-12-06 10:11:38.279704862 +0000 UTC m=+0.101881989 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true)
Dec 06 10:11:38 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:11:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:11:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/291434841' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.517 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.711 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.713 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12302MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.713 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.714 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.787 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.788 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:11:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:38.813 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:11:38 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:11:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:11:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:11:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:11:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:11:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:11:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:11:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:11:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/291434841' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/416774902' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:11:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/416774902' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:11:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:11:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/259866061' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:11:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:11:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1381932184' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:39.265 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:11:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:39.273 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:11:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:39.347 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updated inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 06 10:11:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:39.347 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 06 10:11:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:39.347 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:11:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:39.372 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:11:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:11:39.373 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:40 np0005548788.localdomain ceph-mon[293643]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/259866061' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:11:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1381932184' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:40 np0005548788.localdomain ceph-mgr[286998]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:11:41 np0005548788.localdomain podman[307437]: 2025-12-06 10:11:41.255744293 +0000 UTC m=+0.083446666 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:11:41 np0005548788.localdomain podman[307437]: 2025-12-06 10:11:41.26382889 +0000 UTC m=+0.091531293 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:11:41 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3000437731' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e94 do_prune osdmap full prune enabled
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Activating manager daemon np0005548790.kvkfyr
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 e95: 6 total, 6 up, 6 in
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr handle_mgr_map I was active but no longer am
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 06 10:11:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:41.753+0000 7f68a8bee640 -1 mgr handle_mgr_map I was active but no longer am
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  1: '-n'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  2: 'mgr.np0005548788.yvwbqq'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  3: '-f'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  4: '--setuser'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  5: 'ceph'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  6: '--setgroup'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  7: 'ceph'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  8: '--default-log-to-file=false'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  9: '--default-log-to-journald=true'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e95: 6 total, 6 up, 6 in
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3000437731' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e42: np0005548790.kvkfyr(active, starting, since 0.0402134s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Manager daemon np0005548790.kvkfyr is now available
Dec 06 10:11:41 np0005548788.localdomain sshd[305929]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:11:41 np0005548788.localdomain systemd[1]: session-72.scope: Deactivated successfully.
Dec 06 10:11:41 np0005548788.localdomain systemd[1]: session-72.scope: Consumed 6.757s CPU time.
Dec 06 10:11:41 np0005548788.localdomain systemd-logind[765]: Session 72 logged out. Waiting for processes to exit.
Dec 06 10:11:41 np0005548788.localdomain systemd-logind[765]: Removed session 72.
Dec 06 10:11:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: ignoring --setuser ceph since I am not root
Dec 06 10:11:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: ignoring --setgroup ceph since I am not root
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: pidfile_write: ignore empty --pid-file
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'alerts'
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} v 0)
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} v 0)
Dec 06 10:11:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:11:41 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'balancer'
Dec 06 10:11:41 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:41.943+0000 7f88b441d140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:11:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:11:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'cephadm'
Dec 06 10:11:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:42.009+0000 7f88b441d140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:11:42 np0005548788.localdomain sshd[307484]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:11:42 np0005548788.localdomain sshd[307484]: Accepted publickey for ceph-admin from 192.168.122.108 port 60532 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:11:42 np0005548788.localdomain systemd-logind[765]: New session 73 of user ceph-admin.
Dec 06 10:11:42 np0005548788.localdomain systemd[1]: Started Session 73 of User ceph-admin.
Dec 06 10:11:42 np0005548788.localdomain sshd[307484]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/3000437731' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: Activating manager daemon np0005548790.kvkfyr
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: osdmap e95: 6 total, 6 up, 6 in
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.200:0/3000437731' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: mgrmap e42: np0005548790.kvkfyr(active, starting, since 0.0402134s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: Manager daemon np0005548790.kvkfyr is now available
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:11:42 np0005548788.localdomain sudo[307488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:11:42 np0005548788.localdomain sudo[307488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:42 np0005548788.localdomain sudo[307488]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:42 np0005548788.localdomain sudo[307506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:11:42 np0005548788.localdomain sudo[307506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'crash'
Dec 06 10:11:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:11:42 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'dashboard'
Dec 06 10:11:42 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:42.656+0000 7f88b441d140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:11:42 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e43: np0005548790.kvkfyr(active, since 1.10215s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'devicehealth'
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 10:11:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:43.182+0000 7f88b441d140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:11:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 10:11:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 10:11:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]:   from numpy import show_config as show_numpy_config
Dec 06 10:11:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:11:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:43.315+0000 7f88b441d140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'influx'
Dec 06 10:11:43 np0005548788.localdomain systemd[1]: tmp-crun.tORsSz.mount: Deactivated successfully.
Dec 06 10:11:43 np0005548788.localdomain podman[307601]: 2025-12-06 10:11:43.328348722 +0000 UTC m=+0.111482174 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, release=1763362218, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Dec 06 10:11:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:43.373+0000 7f88b441d140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'insights'
Dec 06 10:11:43 np0005548788.localdomain systemd[1]: tmp-crun.Evx6Lr.mount: Deactivated successfully.
Dec 06 10:11:43 np0005548788.localdomain podman[307619]: 2025-12-06 10:11:43.427439204 +0000 UTC m=+0.098882867 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'iostat'
Dec 06 10:11:43 np0005548788.localdomain podman[307601]: 2025-12-06 10:11:43.437693019 +0000 UTC m=+0.220826541 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, com.redhat.component=rhceph-container, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, release=1763362218, ceph=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, description=Red Hat Ceph Storage 7)
Dec 06 10:11:43 np0005548788.localdomain podman[307619]: 2025-12-06 10:11:43.461718174 +0000 UTC m=+0.133161887 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:11:43 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:11:43 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:43.485+0000 7f88b441d140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'k8sevents'
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'localpool'
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: mgrmap e43: np0005548790.kvkfyr(active, since 1.10215s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:11:43] ENGINE Bus STARTING
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:11:43] ENGINE Serving on https://172.18.0.108:7150
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:11:43] ENGINE Client ('172.18.0.108', 49786) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:11:43] ENGINE Serving on http://172.18.0.108:8765
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: [06/Dec/2025:10:11:43] ENGINE Bus STARTED
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:11:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:43 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'mirroring'
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'nfs'
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:11:44 np0005548788.localdomain sudo[307506]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain sudo[307736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:11:44 np0005548788.localdomain sudo[307736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:44 np0005548788.localdomain sudo[307736]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:44.201+0000 7f88b441d140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'orchestrator'
Dec 06 10:11:44 np0005548788.localdomain sudo[307754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:11:44 np0005548788.localdomain sudo[307754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:44.344+0000 7f88b441d140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 10:11:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:44.406+0000 7f88b441d140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'osd_support'
Dec 06 10:11:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:44.460+0000 7f88b441d140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 10:11:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:44.525+0000 7f88b441d140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'progress'
Dec 06 10:11:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:44.582+0000 7f88b441d140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'prometheus'
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: Cluster is now healthy
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548788.localdomain sudo[307754]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:44.879+0000 7f88b441d140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rbd_support'
Dec 06 10:11:44 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:44.958+0000 7f88b441d140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:11:44 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'restful'
Dec 06 10:11:45 np0005548788.localdomain sudo[307803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:11:45 np0005548788.localdomain sudo[307803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548788.localdomain sudo[307803]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548788.localdomain sudo[307821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:11:45 np0005548788.localdomain sudo[307821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rgw'
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e44: np0005548790.kvkfyr(active, since 3s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:45.273+0000 7f88b441d140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'rook'
Dec 06 10:11:45 np0005548788.localdomain sudo[307821]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:11:45 np0005548788.localdomain sudo[307857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:11:45 np0005548788.localdomain sudo[307857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548788.localdomain sudo[307857]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:45.685+0000 7f88b441d140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'selftest'
Dec 06 10:11:45 np0005548788.localdomain sudo[307875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:11:45 np0005548788.localdomain sudo[307875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548788.localdomain sudo[307875]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:45.743+0000 7f88b441d140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'snap_schedule'
Dec 06 10:11:45 np0005548788.localdomain sudo[307893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:11:45 np0005548788.localdomain sudo[307893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548788.localdomain sudo[307893]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'stats'
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'status'
Dec 06 10:11:45 np0005548788.localdomain sudo[307911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:11:45 np0005548788.localdomain sudo[307911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548788.localdomain sudo[307911]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:45.928+0000 7f88b441d140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'telegraf'
Dec 06 10:11:45 np0005548788.localdomain sudo[307929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:11:45 np0005548788.localdomain sudo[307929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548788.localdomain sudo[307929]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:45.984+0000 7f88b441d140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:11:45 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'telemetry'
Dec 06 10:11:46 np0005548788.localdomain sudo[307963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:11:46 np0005548788.localdomain sudo[307963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[307963]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:46.111+0000 7f88b441d140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 10:11:46 np0005548788.localdomain sudo[307981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:11:46 np0005548788.localdomain sudo[307981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[307981]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain sudo[307999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:11:46 np0005548788.localdomain sudo[307999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[307999]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:46.255+0000 7f88b441d140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'volumes'
Dec 06 10:11:46 np0005548788.localdomain sudo[308017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:11:46 np0005548788.localdomain sudo[308017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308017]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain sudo[308035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:11:46 np0005548788.localdomain sudo[308035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308035]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:46.437+0000 7f88b441d140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Loading python module 'zabbix'
Dec 06 10:11:46 np0005548788.localdomain sudo[308053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:11:46 np0005548788.localdomain sudo[308053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308053]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548788-yvwbqq[286994]: 2025-12-06T10:11:46.493+0000 7f88b441d140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : Standby manager daemon np0005548788.yvwbqq started
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: ms_deliver_dispatch: unhandled message 0x56252774b1e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Dec 06 10:11:46 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3354697053
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: mgrmap e44: np0005548790.kvkfyr(active, since 3s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:11:46 np0005548788.localdomain sudo[308071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:11:46 np0005548788.localdomain sudo[308071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308071]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e45: np0005548790.kvkfyr(active, since 4s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:11:46 np0005548788.localdomain sudo[308089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:11:46 np0005548788.localdomain sudo[308089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308089]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain sudo[308123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:11:46 np0005548788.localdomain sudo[308123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308123]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.802158) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906802255, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2463, "num_deletes": 257, "total_data_size": 7938750, "memory_usage": 8598016, "flush_reason": "Manual Compaction"}
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec 06 10:11:46 np0005548788.localdomain sudo[308141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:11:46 np0005548788.localdomain sudo[308141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308141]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906850887, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 7341396, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20294, "largest_seqno": 22756, "table_properties": {"data_size": 7330634, "index_size": 6691, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26504, "raw_average_key_size": 22, "raw_value_size": 7307651, "raw_average_value_size": 6130, "num_data_blocks": 287, "num_entries": 1192, "num_filter_entries": 1192, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015789, "oldest_key_time": 1765015789, "file_creation_time": 1765015906, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 48793 microseconds, and 15315 cpu microseconds.
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.850945) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 7341396 bytes OK
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.850972) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.852914) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.852935) EVENT_LOG_v1 {"time_micros": 1765015906852929, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.852958) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 7927600, prev total WAL file size 7927600, number of live WAL files 2.
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.855002) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(7169KB)], [33(16MB)]
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906855070, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 24966536, "oldest_snapshot_seqno": -1}
Dec 06 10:11:46 np0005548788.localdomain sudo[308159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:11:46 np0005548788.localdomain sudo[308159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308159]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain sudo[308177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:11:46 np0005548788.localdomain sudo[308177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548788.localdomain sudo[308177]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12264 keys, 20846624 bytes, temperature: kUnknown
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906981285, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 20846624, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20774531, "index_size": 40312, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30725, "raw_key_size": 327295, "raw_average_key_size": 26, "raw_value_size": 20563781, "raw_average_value_size": 1676, "num_data_blocks": 1548, "num_entries": 12264, "num_filter_entries": 12264, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015906, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.981540) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 20846624 bytes
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.983275) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 197.7 rd, 165.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(7.0, 16.8 +0.0 blob) out(19.9 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 12805, records dropped: 541 output_compression: NoCompression
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.983294) EVENT_LOG_v1 {"time_micros": 1765015906983285, "job": 18, "event": "compaction_finished", "compaction_time_micros": 126296, "compaction_time_cpu_micros": 53451, "output_level": 6, "num_output_files": 1, "total_output_size": 20846624, "num_input_records": 12805, "num_output_records": 12264, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906983969, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906985782, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.854886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.985881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.985886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.985888) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.985890) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:11:46.985892) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:47 np0005548788.localdomain sudo[308195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:11:47 np0005548788.localdomain sudo[308195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308195]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548788.localdomain sudo[308213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308213]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:11:47 np0005548788.localdomain sudo[308231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308231]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548788.localdomain sudo[308249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308249]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548788.localdomain sudo[308283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308283]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:11:47.434 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:11:47.435 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:11:47.435 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:47 np0005548788.localdomain sudo[308301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548788.localdomain sudo[308301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308301]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain ceph-mon[293643]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:47 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:11:47 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:11:47 np0005548788.localdomain ceph-mon[293643]: Standby manager daemon np0005548788.yvwbqq started
Dec 06 10:11:47 np0005548788.localdomain ceph-mon[293643]: mgrmap e45: np0005548790.kvkfyr(active, since 4s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:11:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:11:47 np0005548788.localdomain sudo[308319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:11:47 np0005548788.localdomain sudo[308319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308319]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:11:47 np0005548788.localdomain sudo[308337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308337]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:11:47 np0005548788.localdomain sudo[308355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308355]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548788.localdomain sudo[308373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308373]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:11:47 np0005548788.localdomain sudo[308391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308391]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548788.localdomain sudo[308409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548788.localdomain sudo[308409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548788.localdomain sudo[308409]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain sudo[308443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:11:48 np0005548788.localdomain sudo[308443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:48 np0005548788.localdomain sudo[308443]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:48 np0005548788.localdomain sudo[308461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:11:48 np0005548788.localdomain sudo[308461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:48 np0005548788.localdomain sudo[308461]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:48 np0005548788.localdomain sudo[308479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548788.localdomain sudo[308479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:48 np0005548788.localdomain sudo[308479]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:11:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548788.localdomain sudo[308497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:11:49 np0005548788.localdomain sudo[308497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:49 np0005548788.localdomain sudo[308497]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548788.localdomain sudo[308515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:11:49 np0005548788.localdomain sudo[308515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:49 np0005548788.localdomain sudo[308515]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:11:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:11:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:11:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:11:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:11:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:11:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18232 "" "Go-http-client/1.1"
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:11:49 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:11:50 np0005548788.localdomain ceph-mon[293643]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 0 B/s wr, 18 op/s
Dec 06 10:11:50 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:11:50 np0005548788.localdomain ceph-mon[293643]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:11:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:11:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:52 np0005548788.localdomain ceph-mon[293643]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:11:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:11:53 np0005548788.localdomain podman[308533]: 2025-12-06 10:11:53.252287528 +0000 UTC m=+0.080315990 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:11:53 np0005548788.localdomain podman[308533]: 2025-12-06 10:11:53.294793239 +0000 UTC m=+0.122821671 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Dec 06 10:11:53 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:11:54 np0005548788.localdomain ceph-mon[293643]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:11:55 np0005548788.localdomain ceph-mon[293643]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:11:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:11:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:11:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:11:57 np0005548788.localdomain podman[308560]: 2025-12-06 10:11:57.270298192 +0000 UTC m=+0.092663377 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 10:11:57 np0005548788.localdomain podman[308558]: 2025-12-06 10:11:57.322770168 +0000 UTC m=+0.146492785 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:11:57 np0005548788.localdomain podman[308558]: 2025-12-06 10:11:57.338693185 +0000 UTC m=+0.162415802 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:11:57 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:11:57 np0005548788.localdomain podman[308560]: 2025-12-06 10:11:57.390655336 +0000 UTC m=+0.213020501 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:11:57 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:11:57 np0005548788.localdomain podman[308559]: 2025-12-06 10:11:57.484438286 +0000 UTC m=+0.306111680 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:11:57 np0005548788.localdomain podman[308559]: 2025-12-06 10:11:57.499925651 +0000 UTC m=+0.321599035 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:11:57 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:11:57 np0005548788.localdomain ceph-mon[293643]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:12:00 np0005548788.localdomain ceph-mon[293643]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:12:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:02 np0005548788.localdomain ceph-mon[293643]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:04 np0005548788.localdomain ceph-mon[293643]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:04 np0005548788.localdomain sshd[308622]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:12:06 np0005548788.localdomain ceph-mon[293643]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:07 np0005548788.localdomain sshd[308622]: Received disconnect from 45.78.194.186 port 55238:11: Bye Bye [preauth]
Dec 06 10:12:07 np0005548788.localdomain sshd[308622]: Disconnected from authenticating user root 45.78.194.186 port 55238 [preauth]
Dec 06 10:12:07 np0005548788.localdomain ceph-mon[293643]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:12:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:12:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:12:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:12:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:12:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:12:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:12:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:12:09 np0005548788.localdomain podman[308624]: 2025-12-06 10:12:09.270302383 +0000 UTC m=+0.094602636 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS)
Dec 06 10:12:09 np0005548788.localdomain podman[308624]: 2025-12-06 10:12:09.282372783 +0000 UTC m=+0.106673056 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:12:09 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:12:10 np0005548788.localdomain ceph-mon[293643]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:12 np0005548788.localdomain ceph-mon[293643]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:12:12 np0005548788.localdomain podman[308643]: 2025-12-06 10:12:12.251655458 +0000 UTC m=+0.082702984 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:12:12 np0005548788.localdomain podman[308643]: 2025-12-06 10:12:12.260026574 +0000 UTC m=+0.091074090 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:12:12 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:12:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:12:13 np0005548788.localdomain podman[308667]: 2025-12-06 10:12:13.685559548 +0000 UTC m=+0.078078341 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:12:13 np0005548788.localdomain podman[308667]: 2025-12-06 10:12:13.71927948 +0000 UTC m=+0.111798293 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:12:13 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:12:14 np0005548788.localdomain ceph-mon[293643]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:16 np0005548788.localdomain ceph-mon[293643]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:17 np0005548788.localdomain ceph-mon[293643]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:12:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:12:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:12:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:12:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:12:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18237 "" "Go-http-client/1.1"
Dec 06 10:12:20 np0005548788.localdomain ceph-mon[293643]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:22 np0005548788.localdomain ceph-mon[293643]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:24 np0005548788.localdomain ceph-mon[293643]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:12:24 np0005548788.localdomain podman[308685]: 2025-12-06 10:12:24.259468727 +0000 UTC m=+0.083742634 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:12:24 np0005548788.localdomain podman[308685]: 2025-12-06 10:12:24.30265188 +0000 UTC m=+0.126925827 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:12:24 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:12:26 np0005548788.localdomain ceph-mon[293643]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:27 np0005548788.localdomain ceph-mon[293643]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:12:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:12:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:12:28 np0005548788.localdomain podman[308710]: 2025-12-06 10:12:28.261662779 +0000 UTC m=+0.086265272 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:12:28 np0005548788.localdomain podman[308710]: 2025-12-06 10:12:28.302802277 +0000 UTC m=+0.127404730 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm)
Dec 06 10:12:28 np0005548788.localdomain systemd[1]: tmp-crun.xRQREy.mount: Deactivated successfully.
Dec 06 10:12:28 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:12:28 np0005548788.localdomain podman[308711]: 2025-12-06 10:12:28.325698719 +0000 UTC m=+0.146061653 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:12:28 np0005548788.localdomain podman[308711]: 2025-12-06 10:12:28.335662793 +0000 UTC m=+0.156025767 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:12:28 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:12:28 np0005548788.localdomain podman[308712]: 2025-12-06 10:12:28.419275173 +0000 UTC m=+0.236539131 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 10:12:28 np0005548788.localdomain podman[308712]: 2025-12-06 10:12:28.434520779 +0000 UTC m=+0.251784737 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:12:28 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:12:30 np0005548788.localdomain ceph-mon[293643]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:30 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/825719281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:31 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2307179959' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:32 np0005548788.localdomain ceph-mon[293643]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:33.358 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:33.360 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:12:34 np0005548788.localdomain ceph-mon[293643]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:35.008 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:35.008 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:36.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:36.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:36 np0005548788.localdomain ceph-mon[293643]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:37.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:37.455 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:37.456 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:37 np0005548788.localdomain ceph-mon[293643]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1486122464' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.024 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.024 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.051 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.052 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.052 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.053 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.053 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:12:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:12:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4173888289' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.518 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.734 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.736 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12351MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.736 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:38.737 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:12:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:12:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:12:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:12:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:12:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:12:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:12:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:12:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4173888289' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2157659444' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1131690724' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:12:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1131690724' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:12:39 np0005548788.localdomain ceph-mon[293643]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:12:40 np0005548788.localdomain systemd[1]: tmp-crun.BXuJNd.mount: Deactivated successfully.
Dec 06 10:12:40 np0005548788.localdomain podman[308795]: 2025-12-06 10:12:40.256775889 +0000 UTC m=+0.085674904 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:12:40 np0005548788.localdomain podman[308795]: 2025-12-06 10:12:40.267188637 +0000 UTC m=+0.096087702 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:12:40 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:12:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:40.322 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:12:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:40.322 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:12:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:40.356 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:12:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:12:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/233671784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:40.790 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:12:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:40.796 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:12:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/233671784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:41.463 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:12:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:41.466 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:12:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:12:41.466 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:41 np0005548788.localdomain ceph-mon[293643]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:12:43 np0005548788.localdomain podman[308836]: 2025-12-06 10:12:43.235680718 +0000 UTC m=+0.063970649 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:12:43 np0005548788.localdomain podman[308836]: 2025-12-06 10:12:43.246672424 +0000 UTC m=+0.074962405 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:12:43 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:12:44 np0005548788.localdomain ceph-mon[293643]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:12:44 np0005548788.localdomain podman[308859]: 2025-12-06 10:12:44.241822944 +0000 UTC m=+0.072334515 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:12:44 np0005548788.localdomain podman[308859]: 2025-12-06 10:12:44.276538086 +0000 UTC m=+0.107049637 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:12:44 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:12:46 np0005548788.localdomain ceph-mon[293643]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:12:47.436 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:12:47.437 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:12:47.437 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:47 np0005548788.localdomain ceph-mon[293643]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:49 np0005548788.localdomain sudo[308878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:12:49 np0005548788.localdomain sudo[308878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:12:49 np0005548788.localdomain sudo[308878]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:12:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:12:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:12:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153096 "" "Go-http-client/1.1"
Dec 06 10:12:49 np0005548788.localdomain sudo[308896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:12:49 np0005548788.localdomain sudo[308896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:12:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:12:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18234 "" "Go-http-client/1.1"
Dec 06 10:12:50 np0005548788.localdomain ceph-mon[293643]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:50 np0005548788.localdomain sudo[308896]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:50 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:12:50 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:12:50 np0005548788.localdomain sudo[308945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:12:50 np0005548788.localdomain sudo[308945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:12:50 np0005548788.localdomain sudo[308945]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:12:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:12:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:12:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:12:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:12:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:12:52 np0005548788.localdomain ceph-mon[293643]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:12:54 np0005548788.localdomain ceph-mon[293643]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:12:55 np0005548788.localdomain podman[308963]: 2025-12-06 10:12:55.268986237 +0000 UTC m=+0.091808840 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Dec 06 10:12:55 np0005548788.localdomain podman[308963]: 2025-12-06 10:12:55.357796416 +0000 UTC m=+0.180619049 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 06 10:12:55 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:12:56 np0005548788.localdomain ceph-mon[293643]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:57 np0005548788.localdomain ceph-mon[293643]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:12:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:12:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:12:59 np0005548788.localdomain systemd[1]: tmp-crun.K3d4My.mount: Deactivated successfully.
Dec 06 10:12:59 np0005548788.localdomain podman[308989]: 2025-12-06 10:12:59.258565222 +0000 UTC m=+0.087479619 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Dec 06 10:12:59 np0005548788.localdomain podman[308989]: 2025-12-06 10:12:59.270487767 +0000 UTC m=+0.099402194 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:12:59 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:12:59 np0005548788.localdomain podman[308991]: 2025-12-06 10:12:59.359187261 +0000 UTC m=+0.181141944 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 10:12:59 np0005548788.localdomain podman[308990]: 2025-12-06 10:12:59.407121759 +0000 UTC m=+0.230728233 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:12:59 np0005548788.localdomain podman[308991]: 2025-12-06 10:12:59.426277175 +0000 UTC m=+0.248231808 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, version=9.6, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9)
Dec 06 10:12:59 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:12:59 np0005548788.localdomain podman[308990]: 2025-12-06 10:12:59.441616424 +0000 UTC m=+0.265222828 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:12:59 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:13:00 np0005548788.localdomain ceph-mon[293643]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:00 np0005548788.localdomain systemd[1]: tmp-crun.KwmSb0.mount: Deactivated successfully.
Dec 06 10:13:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:02 np0005548788.localdomain ceph-mon[293643]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:04 np0005548788.localdomain ceph-mon[293643]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:06 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:06.289 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:13:06 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:06.291 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:13:06 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:06.292 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:13:06 np0005548788.localdomain ceph-mon[293643]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.497 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.498 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.499 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:13:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548788.localdomain ceph-mon[293643]: pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:13:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:13:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:13:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:13:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:13:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:13:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:13:10 np0005548788.localdomain ceph-mon[293643]: pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:10 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:10.857 262572 INFO oslo.privsep.daemon [None req-719a6bf7-c3ad-4072-98e5-e0ec854af1ed - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmppx7_i8v2/privsep.sock']
Dec 06 10:13:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:13:11 np0005548788.localdomain systemd[298457]: Created slice User Background Tasks Slice.
Dec 06 10:13:11 np0005548788.localdomain systemd[298457]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 10:13:11 np0005548788.localdomain podman[309056]: 2025-12-06 10:13:11.250228412 +0000 UTC m=+0.076867663 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd)
Dec 06 10:13:11 np0005548788.localdomain systemd[298457]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 10:13:11 np0005548788.localdomain podman[309056]: 2025-12-06 10:13:11.265348916 +0000 UTC m=+0.091988177 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:13:11 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:13:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:11.513 262572 INFO oslo.privsep.daemon [None req-719a6bf7-c3ad-4072-98e5-e0ec854af1ed - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 10:13:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:11.391 309077 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 10:13:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:11.394 309077 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 10:13:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:11.396 309077 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 06 10:13:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:11.396 309077 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309077
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:11.870528) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015991870574, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1172, "num_deletes": 255, "total_data_size": 1092594, "memory_usage": 1123648, "flush_reason": "Manual Compaction"}
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015991879730, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1044636, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22757, "largest_seqno": 23928, "table_properties": {"data_size": 1039395, "index_size": 2712, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11725, "raw_average_key_size": 20, "raw_value_size": 1028560, "raw_average_value_size": 1764, "num_data_blocks": 115, "num_entries": 583, "num_filter_entries": 583, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015907, "oldest_key_time": 1765015907, "file_creation_time": 1765015991, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 9252 microseconds, and 3572 cpu microseconds.
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:11.879778) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1044636 bytes OK
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:11.879800) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:11.881317) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:11.881336) EVENT_LOG_v1 {"time_micros": 1765015991881330, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:11.881356) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 1087195, prev total WAL file size 1087519, number of live WAL files 2.
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:11.881861) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373637' seq:72057594037927935, type:22 .. '6C6F676D0034303138' seq:0, type:0; will stop at (end)
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1020KB)], [36(19MB)]
Dec 06 10:13:11 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015991881903, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 21891260, "oldest_snapshot_seqno": -1}
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12314 keys, 21753989 bytes, temperature: kUnknown
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015992004726, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 21753989, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21679648, "index_size": 42432, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30853, "raw_key_size": 329412, "raw_average_key_size": 26, "raw_value_size": 21466100, "raw_average_value_size": 1743, "num_data_blocks": 1635, "num_entries": 12314, "num_filter_entries": 12314, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765015991, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:12.005268) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 21753989 bytes
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:12.007504) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 177.8 rd, 176.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 19.9 +0.0 blob) out(20.7 +0.0 blob), read-write-amplify(41.8) write-amplify(20.8) OK, records in: 12847, records dropped: 533 output_compression: NoCompression
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:12.007537) EVENT_LOG_v1 {"time_micros": 1765015992007524, "job": 20, "event": "compaction_finished", "compaction_time_micros": 123099, "compaction_time_cpu_micros": 51492, "output_level": 6, "num_output_files": 1, "total_output_size": 21753989, "num_input_records": 12847, "num_output_records": 12314, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015992008293, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015992012463, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:11.881793) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:12.012636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:12.012642) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:12.012645) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:12.012648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:12.012651) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:12.033 262572 INFO oslo.privsep.daemon [None req-719a6bf7-c3ad-4072-98e5-e0ec854af1ed - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp7wef13nd/privsep.sock']
Dec 06 10:13:12 np0005548788.localdomain ceph-mon[293643]: pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:12.694 262572 INFO oslo.privsep.daemon [None req-719a6bf7-c3ad-4072-98e5-e0ec854af1ed - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 10:13:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:12.582 309087 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 10:13:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:12.587 309087 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 10:13:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:12.591 309087 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 06 10:13:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:12.591 309087 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309087
Dec 06 10:13:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e95 do_prune osdmap full prune enabled
Dec 06 10:13:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e96 e96: 6 total, 6 up, 6 in
Dec 06 10:13:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e96: 6 total, 6 up, 6 in
Dec 06 10:13:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e46: np0005548790.kvkfyr(active, since 91s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:13:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:13:13 np0005548788.localdomain podman[309092]: 2025-12-06 10:13:13.710041293 +0000 UTC m=+0.096389231 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:13:13 np0005548788.localdomain podman[309092]: 2025-12-06 10:13:13.719058069 +0000 UTC m=+0.105405987 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:13:13 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:13:13 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:13.737 262572 INFO oslo.privsep.daemon [None req-719a6bf7-c3ad-4072-98e5-e0ec854af1ed - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp4vedvtgy/privsep.sock']
Dec 06 10:13:14 np0005548788.localdomain ceph-mon[293643]: pgmap v49: 177 pgs: 177 active+clean; 125 MiB data, 637 MiB used, 41 GiB / 42 GiB avail; 8.2 KiB/s rd, 1.7 MiB/s wr, 12 op/s
Dec 06 10:13:14 np0005548788.localdomain ceph-mon[293643]: osdmap e96: 6 total, 6 up, 6 in
Dec 06 10:13:14 np0005548788.localdomain ceph-mon[293643]: mgrmap e46: np0005548790.kvkfyr(active, since 91s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:13:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:14.386 262572 INFO oslo.privsep.daemon [None req-719a6bf7-c3ad-4072-98e5-e0ec854af1ed - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 10:13:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:14.283 309123 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 10:13:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:14.288 309123 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 10:13:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:14.292 309123 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 06 10:13:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:14.292 309123 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309123
Dec 06 10:13:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e96 do_prune osdmap full prune enabled
Dec 06 10:13:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e97 e97: 6 total, 6 up, 6 in
Dec 06 10:13:15 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e97: 6 total, 6 up, 6 in
Dec 06 10:13:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:13:15 np0005548788.localdomain systemd[1]: tmp-crun.6g1urd.mount: Deactivated successfully.
Dec 06 10:13:15 np0005548788.localdomain podman[309128]: 2025-12-06 10:13:15.269519476 +0000 UTC m=+0.096089552 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:13:15 np0005548788.localdomain podman[309128]: 2025-12-06 10:13:15.298556555 +0000 UTC m=+0.125126641 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 06 10:13:15 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:13:15 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:15.700 262572 INFO neutron.agent.linux.ip_lib [None req-719a6bf7-c3ad-4072-98e5-e0ec854af1ed - - - - - -] Device tap0ef8d29e-76 cannot be used as it has no MAC address
Dec 06 10:13:15 np0005548788.localdomain kernel: device tap0ef8d29e-76 entered promiscuous mode
Dec 06 10:13:15 np0005548788.localdomain NetworkManager[5968]: <info>  [1765015995.8192] manager: (tap0ef8d29e-76): new Generic device (/org/freedesktop/NetworkManager/Devices/13)
Dec 06 10:13:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:13:15Z|00025|binding|INFO|Claiming lport 0ef8d29e-76b7-4e43-958d-4407531d055f for this chassis.
Dec 06 10:13:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:13:15Z|00026|binding|INFO|0ef8d29e-76b7-4e43-958d-4407531d055f: Claiming unknown
Dec 06 10:13:15 np0005548788.localdomain systemd-udevd[309156]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:13:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:15.833 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-5a505334-a6af-4da9-bd7e-26140873cbe6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a505334-a6af-4da9-bd7e-26140873cbe6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bffcf50ce6c4b07a51d7021f8a9cc25', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8c78030-63fb-47f2-8294-d83460de2aaf, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=0ef8d29e-76b7-4e43-958d-4407531d055f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:13:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:15.835 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 0ef8d29e-76b7-4e43-958d-4407531d055f in datapath 5a505334-a6af-4da9-bd7e-26140873cbe6 bound to our chassis
Dec 06 10:13:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:15.838 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 15afb103-ca35-45d2-865f-1fef51070063 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:13:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:15.838 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5a505334-a6af-4da9-bd7e-26140873cbe6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:13:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:15.840 159620 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpf_8ns53a/privsep.sock']
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: hostname: np0005548788.localdomain
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap0ef8d29e-76: No such device
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap0ef8d29e-76: No such device
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap0ef8d29e-76: No such device
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap0ef8d29e-76: No such device
Dec 06 10:13:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:13:15Z|00027|binding|INFO|Setting lport 0ef8d29e-76b7-4e43-958d-4407531d055f ovn-installed in OVS
Dec 06 10:13:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:13:15Z|00028|binding|INFO|Setting lport 0ef8d29e-76b7-4e43-958d-4407531d055f up in Southbound
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap0ef8d29e-76: No such device
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap0ef8d29e-76: No such device
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap0ef8d29e-76: No such device
Dec 06 10:13:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap0ef8d29e-76: No such device
Dec 06 10:13:16 np0005548788.localdomain ceph-mon[293643]: pgmap v51: 177 pgs: 177 active+clean; 125 MiB data, 637 MiB used, 41 GiB / 42 GiB avail; 9.9 KiB/s rd, 2.0 MiB/s wr, 14 op/s
Dec 06 10:13:16 np0005548788.localdomain ceph-mon[293643]: osdmap e97: 6 total, 6 up, 6 in
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.489 159620 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.490 159620 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpf_8ns53a/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.374 309209 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.381 309209 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.385 309209 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.386 309209 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309209
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.494 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[98f856ba-9792-449b-9229-bec219fcb028]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:13:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.965 309209 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.965 309209 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:16.965 309209 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:17.061 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[e757c20e-e1e8-475c-9c60-892b68dc67ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:13:17 np0005548788.localdomain podman[309239]: 
Dec 06 10:13:17 np0005548788.localdomain podman[309239]: 2025-12-06 10:13:17.78106635 +0000 UTC m=+0.089260072 container create f215f1fc6c1442de714850267008ec28c09c64768b468eb7725021739406f129 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a505334-a6af-4da9-bd7e-26140873cbe6, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:13:17 np0005548788.localdomain systemd[1]: Started libpod-conmon-f215f1fc6c1442de714850267008ec28c09c64768b468eb7725021739406f129.scope.
Dec 06 10:13:17 np0005548788.localdomain podman[309239]: 2025-12-06 10:13:17.738666202 +0000 UTC m=+0.046859974 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:13:17 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:13:17 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32f5d0e9a8dfac003f5170ea61f60680077657716dae34a05f62e49f2855aab5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:13:17 np0005548788.localdomain podman[309239]: 2025-12-06 10:13:17.855415056 +0000 UTC m=+0.163608778 container init f215f1fc6c1442de714850267008ec28c09c64768b468eb7725021739406f129 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a505334-a6af-4da9-bd7e-26140873cbe6, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:13:17 np0005548788.localdomain podman[309239]: 2025-12-06 10:13:17.864856265 +0000 UTC m=+0.173049997 container start f215f1fc6c1442de714850267008ec28c09c64768b468eb7725021739406f129 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a505334-a6af-4da9-bd7e-26140873cbe6, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:13:17 np0005548788.localdomain dnsmasq[309258]: started, version 2.85 cachesize 150
Dec 06 10:13:17 np0005548788.localdomain dnsmasq[309258]: DNS service limited to local subnets
Dec 06 10:13:17 np0005548788.localdomain dnsmasq[309258]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:13:17 np0005548788.localdomain dnsmasq[309258]: warning: no upstream servers configured
Dec 06 10:13:17 np0005548788.localdomain dnsmasq-dhcp[309258]: DHCP, static leases only on 192.168.199.0, lease time 1d
Dec 06 10:13:17 np0005548788.localdomain dnsmasq[309258]: read /var/lib/neutron/dhcp/5a505334-a6af-4da9-bd7e-26140873cbe6/addn_hosts - 0 addresses
Dec 06 10:13:17 np0005548788.localdomain dnsmasq-dhcp[309258]: read /var/lib/neutron/dhcp/5a505334-a6af-4da9-bd7e-26140873cbe6/host
Dec 06 10:13:17 np0005548788.localdomain dnsmasq-dhcp[309258]: read /var/lib/neutron/dhcp/5a505334-a6af-4da9-bd7e-26140873cbe6/opts
Dec 06 10:13:17 np0005548788.localdomain ceph-mon[293643]: pgmap v53: 177 pgs: 177 active+clean; 125 MiB data, 637 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 2.6 MiB/s wr, 18 op/s
Dec 06 10:13:18 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:13:18.323 262572 INFO neutron.agent.dhcp.agent [None req-d36d740c-ecb8-4ebc-9cac-1adb844ab5d6 - - - - - -] DHCP configuration for ports {'eab75789-919c-441b-bd61-4efa9e0925a2'} is completed
Dec 06 10:13:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:13:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:13:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:13:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:13:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:13:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18714 "" "Go-http-client/1.1"
Dec 06 10:13:20 np0005548788.localdomain ceph-mon[293643]: pgmap v54: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Dec 06 10:13:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:22 np0005548788.localdomain ceph-mon[293643]: pgmap v55: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.6 MiB/s wr, 29 op/s
Dec 06 10:13:24 np0005548788.localdomain ceph-mon[293643]: pgmap v56: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.1 MiB/s wr, 23 op/s
Dec 06 10:13:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:13:26 np0005548788.localdomain ceph-mon[293643]: pgmap v57: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 2.0 MiB/s wr, 23 op/s
Dec 06 10:13:26 np0005548788.localdomain podman[309260]: 2025-12-06 10:13:26.263834003 +0000 UTC m=+0.087857440 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller)
Dec 06 10:13:26 np0005548788.localdomain podman[309260]: 2025-12-06 10:13:26.334165596 +0000 UTC m=+0.158189013 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:13:26 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:13:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:27 np0005548788.localdomain ceph-mon[293643]: pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 1.7 MiB/s wr, 19 op/s
Dec 06 10:13:30 np0005548788.localdomain ceph-mon[293643]: pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 1.7 MiB/s wr, 19 op/s
Dec 06 10:13:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:13:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:13:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:13:30 np0005548788.localdomain systemd[1]: tmp-crun.oJuEDw.mount: Deactivated successfully.
Dec 06 10:13:30 np0005548788.localdomain podman[309286]: 2025-12-06 10:13:30.273355459 +0000 UTC m=+0.097661190 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:13:30 np0005548788.localdomain podman[309287]: 2025-12-06 10:13:30.285581953 +0000 UTC m=+0.109322197 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:13:30 np0005548788.localdomain podman[309287]: 2025-12-06 10:13:30.292775774 +0000 UTC m=+0.116516058 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:13:30 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:13:30 np0005548788.localdomain podman[309286]: 2025-12-06 10:13:30.316735826 +0000 UTC m=+0.141041557 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 06 10:13:30 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:13:30 np0005548788.localdomain podman[309288]: 2025-12-06 10:13:30.376123014 +0000 UTC m=+0.192744250 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Dec 06 10:13:30 np0005548788.localdomain podman[309288]: 2025-12-06 10:13:30.388749351 +0000 UTC m=+0.205370597 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Dec 06 10:13:30 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:13:31 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2010010665' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:32 np0005548788.localdomain ceph-mon[293643]: pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:33 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/113296038' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:34.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:34.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:13:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:34.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:34.007 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:13:34 np0005548788.localdomain ceph-mon[293643]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:36.067 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:36.067 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:36.068 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:36.068 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548788.localdomain ceph-mon[293643]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:37.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:37.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:13:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:37.021 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:13:37 np0005548788.localdomain ceph-mon[293643]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:38.020 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:13:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:13:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:13:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:13:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:13:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:13:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2709377790' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.024 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.024 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.025 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.025 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.025 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2976675022' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.501 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.709 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.710 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=12013MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.711 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:39.711 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e97 do_prune osdmap full prune enabled
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2247424638' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2247424638' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2976675022' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e98 e98: 6 total, 6 up, 6 in
Dec 06 10:13:39 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e98: 6 total, 6 up, 6 in
Dec 06 10:13:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:40.051 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:13:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:40.052 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:13:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:40.398 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:13:40 np0005548788.localdomain ceph-mon[293643]: osdmap e98: 6 total, 6 up, 6 in
Dec 06 10:13:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/92571660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.048 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.049 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.077 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.116 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE4A,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.338 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3225179418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.797 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.805 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.836 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.838 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:13:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:41.839 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:41.897743) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016021897813, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 583, "num_deletes": 250, "total_data_size": 733823, "memory_usage": 743680, "flush_reason": "Manual Compaction"}
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016021905757, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 653774, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23929, "largest_seqno": 24511, "table_properties": {"data_size": 651051, "index_size": 706, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7627, "raw_average_key_size": 20, "raw_value_size": 645236, "raw_average_value_size": 1743, "num_data_blocks": 31, "num_entries": 370, "num_filter_entries": 370, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015991, "oldest_key_time": 1765015991, "file_creation_time": 1765016021, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 8075 microseconds, and 4104 cpu microseconds.
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:41.905814) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 653774 bytes OK
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:41.905851) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:41.907803) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:41.907829) EVENT_LOG_v1 {"time_micros": 1765016021907821, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:41.907858) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 730608, prev total WAL file size 730932, number of live WAL files 2.
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:41.908607) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373534' seq:72057594037927935, type:22 .. '6D6772737461740034303035' seq:0, type:0; will stop at (end)
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(638KB)], [39(20MB)]
Dec 06 10:13:41 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016021908670, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 22407763, "oldest_snapshot_seqno": -1}
Dec 06 10:13:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:42.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:42.004 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:13:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:42.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12172 keys, 20264729 bytes, temperature: kUnknown
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016022020958, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 20264729, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20195833, "index_size": 37371, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 326684, "raw_average_key_size": 26, "raw_value_size": 19989208, "raw_average_value_size": 1642, "num_data_blocks": 1423, "num_entries": 12172, "num_filter_entries": 12172, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016021, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:42.021330) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 20264729 bytes
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023663) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 199.4 rd, 180.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 20.7 +0.0 blob) out(19.3 +0.0 blob), read-write-amplify(65.3) write-amplify(31.0) OK, records in: 12684, records dropped: 512 output_compression: NoCompression
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023727) EVENT_LOG_v1 {"time_micros": 1765016022023695, "job": 22, "event": "compaction_finished", "compaction_time_micros": 112388, "compaction_time_cpu_micros": 58361, "output_level": 6, "num_output_files": 1, "total_output_size": 20264729, "num_input_records": 12684, "num_output_records": 12172, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:41.908498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023877) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023892) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016022024856, "job": 0, "event": "table_file_deletion", "file_number": 41}
Dec 06 10:13:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:42.023 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:13:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:13:42.024 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016022027775, "job": 0, "event": "table_file_deletion", "file_number": 39}
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3225179418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:13:42 np0005548788.localdomain podman[309393]: 2025-12-06 10:13:42.262554767 +0000 UTC m=+0.087714626 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd)
Dec 06 10:13:42 np0005548788.localdomain podman[309393]: 2025-12-06 10:13:42.304624285 +0000 UTC m=+0.129784174 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:13:42 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:13:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e98 do_prune osdmap full prune enabled
Dec 06 10:13:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e99 e99: 6 total, 6 up, 6 in
Dec 06 10:13:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e99: 6 total, 6 up, 6 in
Dec 06 10:13:44 np0005548788.localdomain ceph-mon[293643]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.3 KiB/s wr, 23 op/s
Dec 06 10:13:44 np0005548788.localdomain ceph-mon[293643]: osdmap e99: 6 total, 6 up, 6 in
Dec 06 10:13:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:13:44 np0005548788.localdomain systemd[1]: tmp-crun.TzRcAz.mount: Deactivated successfully.
Dec 06 10:13:44 np0005548788.localdomain podman[309413]: 2025-12-06 10:13:44.258874821 +0000 UTC m=+0.084168827 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:13:44 np0005548788.localdomain podman[309413]: 2025-12-06 10:13:44.265888256 +0000 UTC m=+0.091182282 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:13:44 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:13:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:44.608 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:13:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:44.609 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:13:45 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:13:45Z|00029|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 06 10:13:46 np0005548788.localdomain ceph-mon[293643]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.9 KiB/s wr, 29 op/s
Dec 06 10:13:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:13:46 np0005548788.localdomain podman[309436]: 2025-12-06 10:13:46.251596204 +0000 UTC m=+0.079013589 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:13:46 np0005548788.localdomain podman[309436]: 2025-12-06 10:13:46.282556871 +0000 UTC m=+0.109974206 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:13:46 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:13:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:47.437 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:47.438 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:47.438 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:13:47.611 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:13:47 np0005548788.localdomain ceph-mon[293643]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.9 KiB/s wr, 29 op/s
Dec 06 10:13:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:13:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:13:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:13:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:13:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:13:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18715 "" "Go-http-client/1.1"
Dec 06 10:13:50 np0005548788.localdomain ceph-mon[293643]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 3.4 KiB/s wr, 44 op/s
Dec 06 10:13:50 np0005548788.localdomain sudo[309453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:13:50 np0005548788.localdomain sudo[309453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:50 np0005548788.localdomain sudo[309453]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:50 np0005548788.localdomain sudo[309471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 10:13:50 np0005548788.localdomain sudo[309471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e99 do_prune osdmap full prune enabled
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e100 e100: 6 total, 6 up, 6 in
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e100: 6 total, 6 up, 6 in
Dec 06 10:13:51 np0005548788.localdomain sudo[309471]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:51 np0005548788.localdomain sudo[309510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:13:51 np0005548788.localdomain sudo[309510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:51 np0005548788.localdomain sudo[309510]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:51 np0005548788.localdomain sudo[309528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:13:51 np0005548788.localdomain sudo[309528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e100 do_prune osdmap full prune enabled
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e101 e101: 6 total, 6 up, 6 in
Dec 06 10:13:51 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e101: 6 total, 6 up, 6 in
Dec 06 10:13:52 np0005548788.localdomain sudo[309528]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.1 KiB/s wr, 39 op/s
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: osdmap e100: 6 total, 6 up, 6 in
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: osdmap e101: 6 total, 6 up, 6 in
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:13:52 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548788.localdomain sudo[309578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:13:52 np0005548788.localdomain sudo[309578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:52 np0005548788.localdomain sudo[309578]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:13:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:13:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:13:54 np0005548788.localdomain ceph-mon[293643]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 3.0 KiB/s wr, 38 op/s
Dec 06 10:13:56 np0005548788.localdomain ceph-mon[293643]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 3.0 KiB/s wr, 38 op/s
Dec 06 10:13:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:13:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:13:57 np0005548788.localdomain systemd[1]: tmp-crun.JWUbVi.mount: Deactivated successfully.
Dec 06 10:13:57 np0005548788.localdomain podman[309596]: 2025-12-06 10:13:57.273908327 +0000 UTC m=+0.097307900 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 10:13:57 np0005548788.localdomain podman[309596]: 2025-12-06 10:13:57.36872702 +0000 UTC m=+0.192126582 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec 06 10:13:57 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:13:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e101 do_prune osdmap full prune enabled
Dec 06 10:13:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:57 np0005548788.localdomain ceph-mon[293643]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Dec 06 10:13:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e102 e102: 6 total, 6 up, 6 in
Dec 06 10:13:57 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e102: 6 total, 6 up, 6 in
Dec 06 10:13:58 np0005548788.localdomain ceph-mon[293643]: osdmap e102: 6 total, 6 up, 6 in
Dec 06 10:13:59 np0005548788.localdomain ceph-mon[293643]: pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.8 KiB/s wr, 30 op/s
Dec 06 10:14:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:14:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:14:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:14:01 np0005548788.localdomain podman[309623]: 2025-12-06 10:14:01.269748832 +0000 UTC m=+0.091864942 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute)
Dec 06 10:14:01 np0005548788.localdomain podman[309623]: 2025-12-06 10:14:01.309065165 +0000 UTC m=+0.131181335 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:14:01 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:14:01 np0005548788.localdomain podman[309624]: 2025-12-06 10:14:01.310994975 +0000 UTC m=+0.128683790 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:14:01 np0005548788.localdomain podman[309625]: 2025-12-06 10:14:01.365661568 +0000 UTC m=+0.177813404 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:14:01 np0005548788.localdomain podman[309624]: 2025-12-06 10:14:01.392546311 +0000 UTC m=+0.210235106 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:14:01 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:14:01 np0005548788.localdomain podman[309625]: 2025-12-06 10:14:01.403777374 +0000 UTC m=+0.215929140 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Dec 06 10:14:01 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:14:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:02 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:02.019 262572 INFO neutron.agent.linux.ip_lib [None req-9c1b121a-802b-495a-bc46-4608057ec1fe - - - - - -] Device tap14d978a5-89 cannot be used as it has no MAC address
Dec 06 10:14:02 np0005548788.localdomain kernel: device tap14d978a5-89 entered promiscuous mode
Dec 06 10:14:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:02Z|00030|binding|INFO|Claiming lport 14d978a5-89e5-4bec-87c4-0261c015709d for this chassis.
Dec 06 10:14:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:02Z|00031|binding|INFO|14d978a5-89e5-4bec-87c4-0261c015709d: Claiming unknown
Dec 06 10:14:02 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016042.0538] manager: (tap14d978a5-89): new Generic device (/org/freedesktop/NetworkManager/Devices/14)
Dec 06 10:14:02 np0005548788.localdomain systemd-udevd[309698]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:02.066 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-47d636a7-c520-4320-aa94-bfb41f418584', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47d636a7-c520-4320-aa94-bfb41f418584', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7897d6398eb64eb29c66df8db792e581', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6898c302-0153-460c-9cb1-4c62ebc9ff31, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=14d978a5-89e5-4bec-87c4-0261c015709d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:02.068 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 14d978a5-89e5-4bec-87c4-0261c015709d in datapath 47d636a7-c520-4320-aa94-bfb41f418584 bound to our chassis
Dec 06 10:14:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:02.071 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1c560fcc-0582-4264-a4bd-7b3f92fe7c8a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:14:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:02.072 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47d636a7-c520-4320-aa94-bfb41f418584, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:14:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:02.073 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[c0106a8b-6de9-4579-868b-feea527a2351]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14d978a5-89: No such device
Dec 06 10:14:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:02Z|00032|binding|INFO|Setting lport 14d978a5-89e5-4bec-87c4-0261c015709d ovn-installed in OVS
Dec 06 10:14:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:02Z|00033|binding|INFO|Setting lport 14d978a5-89e5-4bec-87c4-0261c015709d up in Southbound
Dec 06 10:14:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14d978a5-89: No such device
Dec 06 10:14:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14d978a5-89: No such device
Dec 06 10:14:02 np0005548788.localdomain ceph-mon[293643]: pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 2.4 KiB/s wr, 26 op/s
Dec 06 10:14:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14d978a5-89: No such device
Dec 06 10:14:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14d978a5-89: No such device
Dec 06 10:14:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14d978a5-89: No such device
Dec 06 10:14:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14d978a5-89: No such device
Dec 06 10:14:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14d978a5-89: No such device
Dec 06 10:14:03 np0005548788.localdomain podman[309770]: 
Dec 06 10:14:03 np0005548788.localdomain podman[309770]: 2025-12-06 10:14:03.041662108 +0000 UTC m=+0.093621178 container create 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:14:03 np0005548788.localdomain systemd[1]: Started libpod-conmon-5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858.scope.
Dec 06 10:14:03 np0005548788.localdomain podman[309770]: 2025-12-06 10:14:02.994164454 +0000 UTC m=+0.046123544 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:03 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:03 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2febf276f78f79b0ded7a394bcb796cf54cbd4f8cda2019ce79a07034472eab4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:03 np0005548788.localdomain podman[309770]: 2025-12-06 10:14:03.120661395 +0000 UTC m=+0.172620465 container init 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:03 np0005548788.localdomain podman[309770]: 2025-12-06 10:14:03.13060478 +0000 UTC m=+0.182563880 container start 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:14:03 np0005548788.localdomain dnsmasq[309788]: started, version 2.85 cachesize 150
Dec 06 10:14:03 np0005548788.localdomain dnsmasq[309788]: DNS service limited to local subnets
Dec 06 10:14:03 np0005548788.localdomain dnsmasq[309788]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:03 np0005548788.localdomain dnsmasq[309788]: warning: no upstream servers configured
Dec 06 10:14:03 np0005548788.localdomain dnsmasq-dhcp[309788]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:03 np0005548788.localdomain dnsmasq[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/addn_hosts - 0 addresses
Dec 06 10:14:03 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/host
Dec 06 10:14:03 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/opts
Dec 06 10:14:03 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:03.282 262572 INFO neutron.agent.dhcp.agent [None req-aed0968a-94b8-41c8-b4d1-4cb548aee033 - - - - - -] DHCP configuration for ports {'8839eeed-ff6b-46d9-b40d-610788617728'} is completed
Dec 06 10:14:03 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:03.534 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:03Z, description=, device_id=0ab66a60-f76b-4775-891d-30b21387ddeb, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68a41c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68a4910>], id=eb59753e-18fa-4d22-9f19-4cf02ef3bfc3, ip_allocation=immediate, mac_address=fa:16:3e:19:9a:8a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:13:59Z, description=, dns_domain=, id=47d636a7-c520-4320-aa94-bfb41f418584, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1313845827-network, port_security_enabled=True, project_id=7897d6398eb64eb29c66df8db792e581, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16795, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=240, status=ACTIVE, subnets=['1f85bb5d-01b8-4e29-bdbf-5aebcf31d657'], tags=[], tenant_id=7897d6398eb64eb29c66df8db792e581, updated_at=2025-12-06T10:14:00Z, vlan_transparent=None, network_id=47d636a7-c520-4320-aa94-bfb41f418584, port_security_enabled=False, project_id=7897d6398eb64eb29c66df8db792e581, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=302, status=DOWN, tags=[], tenant_id=7897d6398eb64eb29c66df8db792e581, updated_at=2025-12-06T10:14:03Z on network 47d636a7-c520-4320-aa94-bfb41f418584
Dec 06 10:14:03 np0005548788.localdomain dnsmasq[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/addn_hosts - 1 addresses
Dec 06 10:14:03 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/host
Dec 06 10:14:03 np0005548788.localdomain podman[309806]: 2025-12-06 10:14:03.756868668 +0000 UTC m=+0.057894572 container kill 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:03 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/opts
Dec 06 10:14:04 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:04.090 262572 INFO neutron.agent.dhcp.agent [None req-170fa4d9-74fa-446d-9453-63a8360dcccd - - - - - -] DHCP configuration for ports {'eb59753e-18fa-4d22-9f19-4cf02ef3bfc3'} is completed
Dec 06 10:14:04 np0005548788.localdomain ceph-mon[293643]: pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Dec 06 10:14:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:05.007 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:03Z, description=, device_id=0ab66a60-f76b-4775-891d-30b21387ddeb, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c688e820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c688eaf0>], id=eb59753e-18fa-4d22-9f19-4cf02ef3bfc3, ip_allocation=immediate, mac_address=fa:16:3e:19:9a:8a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:13:59Z, description=, dns_domain=, id=47d636a7-c520-4320-aa94-bfb41f418584, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1313845827-network, port_security_enabled=True, project_id=7897d6398eb64eb29c66df8db792e581, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16795, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=240, status=ACTIVE, subnets=['1f85bb5d-01b8-4e29-bdbf-5aebcf31d657'], tags=[], tenant_id=7897d6398eb64eb29c66df8db792e581, updated_at=2025-12-06T10:14:00Z, vlan_transparent=None, network_id=47d636a7-c520-4320-aa94-bfb41f418584, port_security_enabled=False, project_id=7897d6398eb64eb29c66df8db792e581, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=302, status=DOWN, tags=[], tenant_id=7897d6398eb64eb29c66df8db792e581, updated_at=2025-12-06T10:14:03Z on network 47d636a7-c520-4320-aa94-bfb41f418584
Dec 06 10:14:05 np0005548788.localdomain dnsmasq[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/addn_hosts - 1 addresses
Dec 06 10:14:05 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/host
Dec 06 10:14:05 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/opts
Dec 06 10:14:05 np0005548788.localdomain podman[309843]: 2025-12-06 10:14:05.300683921 +0000 UTC m=+0.065178135 container kill 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:14:05 np0005548788.localdomain systemd[1]: tmp-crun.rmQlHH.mount: Deactivated successfully.
Dec 06 10:14:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:05.620 262572 INFO neutron.agent.dhcp.agent [None req-0410efc9-d1c6-4c4f-a03f-182595344967 - - - - - -] DHCP configuration for ports {'eb59753e-18fa-4d22-9f19-4cf02ef3bfc3'} is completed
Dec 06 10:14:06 np0005548788.localdomain ceph-mon[293643]: pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Dec 06 10:14:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e102 do_prune osdmap full prune enabled
Dec 06 10:14:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e103 e103: 6 total, 6 up, 6 in
Dec 06 10:14:06 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e103: 6 total, 6 up, 6 in
Dec 06 10:14:07 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:07Z|00034|ovn_bfd|INFO|Enabled BFD on interface ovn-ca3c1f-0
Dec 06 10:14:07 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:07Z|00035|ovn_bfd|INFO|Enabled BFD on interface ovn-afa07b-0
Dec 06 10:14:07 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:07Z|00036|ovn_bfd|INFO|Enabled BFD on interface ovn-bd2a75-0
Dec 06 10:14:07 np0005548788.localdomain ceph-mon[293643]: osdmap e103: 6 total, 6 up, 6 in
Dec 06 10:14:07 np0005548788.localdomain ceph-mon[293643]: pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 904 B/s wr, 17 op/s
Dec 06 10:14:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:14:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:14:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:14:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:14:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:14:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:14:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:14:10 np0005548788.localdomain ceph-mon[293643]: pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 818 B/s wr, 16 op/s
Dec 06 10:14:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:12 np0005548788.localdomain ceph-mon[293643]: pgmap v86: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 818 B/s wr, 16 op/s
Dec 06 10:14:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:14:13 np0005548788.localdomain podman[309865]: 2025-12-06 10:14:13.277385434 +0000 UTC m=+0.097327820 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:14:13 np0005548788.localdomain podman[309865]: 2025-12-06 10:14:13.295604562 +0000 UTC m=+0.115546948 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 06 10:14:13 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:14:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:13.578 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "3d34a856-7613-4158-b859-fb3089fe3bc7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:13.578 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "3d34a856-7613-4158-b859-fb3089fe3bc7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:13.605 281009 DEBUG nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 06 10:14:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:13.725 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:13.726 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:13.733 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 06 10:14:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:13.733 281009 INFO nova.compute.claims [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Claim successful on node np0005548788.localdomain
Dec 06 10:14:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:13.890 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:14 np0005548788.localdomain ceph-mon[293643]: pgmap v87: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:14:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2845146463' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.385 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.394 281009 DEBUG nova.compute.provider_tree [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.412 281009 DEBUG nova.scheduler.client.report [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.444 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.445 281009 DEBUG nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.493 281009 DEBUG nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.513 281009 INFO nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.533 281009 DEBUG nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.630 281009 DEBUG nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.632 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.633 281009 INFO nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Creating image(s)
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.682 281009 DEBUG nova.storage.rbd_utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.728 281009 DEBUG nova.storage.rbd_utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.770 281009 DEBUG nova.storage.rbd_utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.776 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "cb68b180567fda17719a7393615b2f958ad3226e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:14.778 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "cb68b180567fda17719a7393615b2f958ad3226e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:15 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2845146463' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:14:15 np0005548788.localdomain systemd[1]: tmp-crun.QBCSkS.mount: Deactivated successfully.
Dec 06 10:14:15 np0005548788.localdomain podman[309961]: 2025-12-06 10:14:15.270602933 +0000 UTC m=+0.092247245 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:14:15 np0005548788.localdomain podman[309961]: 2025-12-06 10:14:15.283774727 +0000 UTC m=+0.105419069 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:14:15 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:14:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:15.376 281009 DEBUG nova.virt.libvirt.imagebackend [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Image locations are: [{'url': 'rbd://1939e851-b10c-5c3b-9bb7-8e7f380233e8/images/6a944ab6-8965-4055-b7fc-af6e395005ea/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://1939e851-b10c-5c3b-9bb7-8e7f380233e8/images/6a944ab6-8965-4055-b7fc-af6e395005ea/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Dec 06 10:14:16 np0005548788.localdomain ceph-mon[293643]: pgmap v88: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.237 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.316 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e.part --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.318 281009 DEBUG nova.virt.images [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] 6a944ab6-8965-4055-b7fc-af6e395005ea was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.319 281009 DEBUG nova.privsep.utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.320 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e.part /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.505 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e.part /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e.converted" returned: 0 in 0.185s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.511 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.588 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e.converted --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.590 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "cb68b180567fda17719a7393615b2f958ad3226e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.630 281009 DEBUG nova.storage.rbd_utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:16.636 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e 3d34a856-7613-4158-b859-fb3089fe3bc7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.228 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e 3d34a856-7613-4158-b859-fb3089fe3bc7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.591s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:17 np0005548788.localdomain podman[310034]: 2025-12-06 10:14:17.247696119 +0000 UTC m=+0.075827212 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:14:17 np0005548788.localdomain podman[310034]: 2025-12-06 10:14:17.282967569 +0000 UTC m=+0.111098632 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:14:17 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.328 281009 DEBUG nova.storage.rbd_utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] resizing rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.477 281009 DEBUG nova.objects.instance [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lazy-loading 'migration_context' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.501 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.501 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Ensure instance console log exists: /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.504 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.505 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.507 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.511 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:13:11Z,direct_url=<?>,disk_format='qcow2',id=6a944ab6-8965-4055-b7fc-af6e395005ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3d603431c0bb4967bafc7a0aa6108bfe',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:13:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'size': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.518 281009 WARNING nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.522 281009 DEBUG nova.virt.libvirt.host [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Searching host: 'np0005548788.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.523 281009 DEBUG nova.virt.libvirt.host [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.525 281009 DEBUG nova.virt.libvirt.host [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Searching host: 'np0005548788.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.525 281009 DEBUG nova.virt.libvirt.host [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.526 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.527 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:13:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a0a7498e-22eb-495c-a2e3-89ba9e483bf6',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:13:11Z,direct_url=<?>,disk_format='qcow2',id=6a944ab6-8965-4055-b7fc-af6e395005ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3d603431c0bb4967bafc7a0aa6108bfe',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:13:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.527 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.528 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.528 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.529 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.529 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.530 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.530 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.530 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.531 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.531 281009 DEBUG nova.virt.hardware [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.537 281009 DEBUG nova.privsep.utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.538 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:14:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/989352054' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.938 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:17 np0005548788.localdomain ceph-mon[293643]: pgmap v89: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:14:17 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/989352054' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.980 281009 DEBUG nova.storage.rbd_utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:17.986 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:14:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3413727768' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:18.385 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:18.387 281009 DEBUG nova.objects.instance [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lazy-loading 'pci_devices' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:18.404 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] End _get_guest_xml xml=<domain type="kvm">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <uuid>3d34a856-7613-4158-b859-fb3089fe3bc7</uuid>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <name>instance-00000006</name>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <memory>131072</memory>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <vcpu>1</vcpu>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <metadata>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-65395191</nova:name>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <nova:creationTime>2025-12-06 10:14:17</nova:creationTime>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <nova:flavor name="m1.nano">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <nova:memory>128</nova:memory>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <nova:disk>1</nova:disk>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <nova:swap>0</nova:swap>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <nova:ephemeral>0</nova:ephemeral>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <nova:vcpus>1</nova:vcpus>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       </nova:flavor>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <nova:owner>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <nova:user uuid="496ca8bf29dc4e81ba0b08a592dc45d3">tempest-UnshelveToHostMultiNodesTest-912460009-project-member</nova:user>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <nova:project uuid="c6d84801a8b44d9da497e9761a0cd10c">tempest-UnshelveToHostMultiNodesTest-912460009</nova:project>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       </nova:owner>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <nova:root type="image" uuid="6a944ab6-8965-4055-b7fc-af6e395005ea"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <nova:ports/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     </nova:instance>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   </metadata>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <sysinfo type="smbios">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <system>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <entry name="manufacturer">RDO</entry>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <entry name="product">OpenStack Compute</entry>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <entry name="serial">3d34a856-7613-4158-b859-fb3089fe3bc7</entry>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <entry name="uuid">3d34a856-7613-4158-b859-fb3089fe3bc7</entry>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <entry name="family">Virtual Machine</entry>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     </system>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   </sysinfo>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <os>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <boot dev="hd"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <smbios mode="sysinfo"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   </os>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <features>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <acpi/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <apic/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <vmcoreinfo/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   </features>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <clock offset="utc">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <timer name="pit" tickpolicy="delay"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <timer name="hpet" present="no"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   </clock>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <cpu mode="host-model" match="exact">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <topology sockets="1" cores="1" threads="1"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   </cpu>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   <devices>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <disk type="network" device="disk">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <driver type="raw" cache="none"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <source protocol="rbd" name="vms/3d34a856-7613-4158-b859-fb3089fe3bc7_disk">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.103" port="6789"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.104" port="6789"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.105" port="6789"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       </source>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <auth username="openstack">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <secret type="ceph" uuid="1939e851-b10c-5c3b-9bb7-8e7f380233e8"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       </auth>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <target dev="vda" bus="virtio"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <disk type="network" device="cdrom">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <driver type="raw" cache="none"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <source protocol="rbd" name="vms/3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.103" port="6789"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.104" port="6789"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.105" port="6789"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       </source>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <auth username="openstack">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:         <secret type="ceph" uuid="1939e851-b10c-5c3b-9bb7-8e7f380233e8"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       </auth>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <target dev="sda" bus="sata"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <serial type="pty">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <log file="/var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/console.log" append="off"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     </serial>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <video>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <model type="virtio"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     </video>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <input type="tablet" bus="usb"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <rng model="virtio">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <backend model="random">/dev/urandom</backend>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     </rng>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <controller type="usb" index="0"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     <memballoon model="virtio">
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:       <stats period="10"/>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:     </memballoon>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:   </devices>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]: </domain>
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:18.451 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:18.451 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:18.451 281009 INFO nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Using config drive
Dec 06 10:14:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:18.484 281009 DEBUG nova.storage.rbd_utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3413727768' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:19.085 281009 INFO nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Creating config drive at /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config
Dec 06 10:14:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:19.091 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvm01wc_l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:19.215 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvm01wc_l" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:19.261 281009 DEBUG nova.storage.rbd_utils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:19.267 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:19.525 281009 DEBUG oslo_concurrency.processutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.259s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:19.527 281009 INFO nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Deleting local config drive /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config because it was imported into RBD.
Dec 06 10:14:19 np0005548788.localdomain systemd[1]: Started libvirt secret daemon.
Dec 06 10:14:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:14:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:14:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:14:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:14:19 np0005548788.localdomain systemd-machined[202859]: New machine qemu-1-instance-00000006.
Dec 06 10:14:19 np0005548788.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000006.
Dec 06 10:14:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:14:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19190 "" "Go-http-client/1.1"
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.005 281009 DEBUG nova.virt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Emitting event <LifecycleEvent: 1765016060.0036552, 3d34a856-7613-4158-b859-fb3089fe3bc7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.006 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] VM Resumed (Lifecycle Event)
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.009 281009 DEBUG nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.010 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.013 281009 INFO nova.virt.libvirt.driver [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance spawned successfully.
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.014 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.037 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.044 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.045 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.047 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.047 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.048 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.049 281009 DEBUG nova.virt.libvirt.driver [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:14:20 np0005548788.localdomain ceph-mon[293643]: pgmap v90: 177 pgs: 177 active+clean; 192 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.058 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.096 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.097 281009 DEBUG nova.virt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Emitting event <LifecycleEvent: 1765016060.0061584, 3d34a856-7613-4158-b859-fb3089fe3bc7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.097 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] VM Started (Lifecycle Event)
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.141 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.145 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.157 281009 INFO nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Took 5.53 seconds to spawn the instance on the hypervisor.
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.159 281009 DEBUG nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:20 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:20.163 262572 INFO neutron.agent.linux.ip_lib [None req-96026d9f-246f-410f-9eec-c34d3ae8630d - - - - - -] Device tap8c3bec90-d4 cannot be used as it has no MAC address
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.192 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 06 10:14:20 np0005548788.localdomain kernel: device tap8c3bec90-d4 entered promiscuous mode
Dec 06 10:14:20 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016060.1999] manager: (tap8c3bec90-d4): new Generic device (/org/freedesktop/NetworkManager/Devices/15)
Dec 06 10:14:20 np0005548788.localdomain systemd-udevd[310318]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:20Z|00037|binding|INFO|Claiming lport 8c3bec90-d45b-4579-9bd1-e89eb803ef3d for this chassis.
Dec 06 10:14:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:20Z|00038|binding|INFO|8c3bec90-d45b-4579-9bd1-e89eb803ef3d: Claiming unknown
Dec 06 10:14:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:20.215 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-47c8dfd1-b65f-41a2-985d-f7571459a4b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47c8dfd1-b65f-41a2-985d-f7571459a4b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '206fcaa64fd14d0ea5fb23a017c43692', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1418a735-1c46-4b65-b015-90af5f7ddbf4, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=8c3bec90-d45b-4579-9bd1-e89eb803ef3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:20.220 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 8c3bec90-d45b-4579-9bd1-e89eb803ef3d in datapath 47c8dfd1-b65f-41a2-985d-f7571459a4b1 bound to our chassis
Dec 06 10:14:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:20.222 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 47c8dfd1-b65f-41a2-985d-f7571459a4b1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:14:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:20.223 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[55535b10-6942-4279-8710-2a22754caa91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c3bec90-d4: No such device
Dec 06 10:14:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c3bec90-d4: No such device
Dec 06 10:14:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:20Z|00039|binding|INFO|Setting lport 8c3bec90-d45b-4579-9bd1-e89eb803ef3d ovn-installed in OVS
Dec 06 10:14:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:20Z|00040|binding|INFO|Setting lport 8c3bec90-d45b-4579-9bd1-e89eb803ef3d up in Southbound
Dec 06 10:14:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c3bec90-d4: No such device
Dec 06 10:14:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c3bec90-d4: No such device
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.254 281009 INFO nova.compute.manager [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Took 6.56 seconds to build instance.
Dec 06 10:14:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c3bec90-d4: No such device
Dec 06 10:14:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c3bec90-d4: No such device
Dec 06 10:14:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c3bec90-d4: No such device
Dec 06 10:14:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c3bec90-d4: No such device
Dec 06 10:14:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:20.271 281009 DEBUG oslo_concurrency.lockutils [None req-eb9cc7f1-2ca1-47be-8922-a9e09a97423c 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "3d34a856-7613-4158-b859-fb3089fe3bc7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:21 np0005548788.localdomain podman[310402]: 
Dec 06 10:14:21 np0005548788.localdomain podman[310402]: 2025-12-06 10:14:21.308959427 +0000 UTC m=+0.100843857 container create 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:14:21 np0005548788.localdomain podman[310402]: 2025-12-06 10:14:21.255191101 +0000 UTC m=+0.047075581 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:21 np0005548788.localdomain systemd[1]: Started libpod-conmon-937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e.scope.
Dec 06 10:14:21 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47de62f4e4c01796363f2ef606e6915823850733203b2883a60d8ea290261e05/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:21 np0005548788.localdomain podman[310402]: 2025-12-06 10:14:21.434057316 +0000 UTC m=+0.225941766 container init 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:14:21 np0005548788.localdomain podman[310402]: 2025-12-06 10:14:21.447433766 +0000 UTC m=+0.239318216 container start 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:21 np0005548788.localdomain dnsmasq[310420]: started, version 2.85 cachesize 150
Dec 06 10:14:21 np0005548788.localdomain dnsmasq[310420]: DNS service limited to local subnets
Dec 06 10:14:21 np0005548788.localdomain dnsmasq[310420]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:21 np0005548788.localdomain dnsmasq[310420]: warning: no upstream servers configured
Dec 06 10:14:21 np0005548788.localdomain dnsmasq-dhcp[310420]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:21 np0005548788.localdomain dnsmasq[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/addn_hosts - 0 addresses
Dec 06 10:14:21 np0005548788.localdomain dnsmasq-dhcp[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/host
Dec 06 10:14:21 np0005548788.localdomain dnsmasq-dhcp[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/opts
Dec 06 10:14:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:21.655 262572 INFO neutron.agent.dhcp.agent [None req-8595db4a-3ea2-470d-a2e7-22786ab4948a - - - - - -] DHCP configuration for ports {'64d5a3e3-ee42-4c6a-8641-60ef110746a5'} is completed
Dec 06 10:14:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:22 np0005548788.localdomain ceph-mon[293643]: pgmap v91: 177 pgs: 177 active+clean; 192 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Dec 06 10:14:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:22.293 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "3d34a856-7613-4158-b859-fb3089fe3bc7" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:22.294 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "3d34a856-7613-4158-b859-fb3089fe3bc7" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:22.295 281009 INFO nova.compute.manager [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Shelving
Dec 06 10:14:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:22.323 281009 DEBUG nova.virt.libvirt.driver [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 06 10:14:23 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:23.379 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:23Z, description=, device_id=0e54fb37-e53e-4ada-9f5a-9b02f9c2b583, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c691fc40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c691ff10>], id=55091d27-ca63-4aae-a8b0-1fbec5b1996b, ip_allocation=immediate, mac_address=fa:16:3e:82:86:5d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:18Z, description=, dns_domain=, id=47c8dfd1-b65f-41a2-985d-f7571459a4b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-295162459-network, port_security_enabled=True, project_id=206fcaa64fd14d0ea5fb23a017c43692, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13980, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=385, status=ACTIVE, subnets=['3420c26d-db14-4a4b-9fd2-6e48760d8d2d'], tags=[], tenant_id=206fcaa64fd14d0ea5fb23a017c43692, updated_at=2025-12-06T10:14:19Z, vlan_transparent=None, network_id=47c8dfd1-b65f-41a2-985d-f7571459a4b1, port_security_enabled=False, project_id=206fcaa64fd14d0ea5fb23a017c43692, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=427, status=DOWN, tags=[], tenant_id=206fcaa64fd14d0ea5fb23a017c43692, updated_at=2025-12-06T10:14:23Z on network 47c8dfd1-b65f-41a2-985d-f7571459a4b1
Dec 06 10:14:23 np0005548788.localdomain dnsmasq[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/addn_hosts - 1 addresses
Dec 06 10:14:23 np0005548788.localdomain dnsmasq-dhcp[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/host
Dec 06 10:14:23 np0005548788.localdomain dnsmasq-dhcp[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/opts
Dec 06 10:14:23 np0005548788.localdomain podman[310438]: 2025-12-06 10:14:23.610329068 +0000 UTC m=+0.068980202 container kill 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:14:23 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:23.805 262572 INFO neutron.agent.dhcp.agent [None req-bbede222-486b-4d78-906e-89fdada8dc1f - - - - - -] DHCP configuration for ports {'55091d27-ca63-4aae-a8b0-1fbec5b1996b'} is completed
Dec 06 10:14:24 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:14:24.057 2 INFO neutron.agent.securitygroups_rpc [None req-713c535f-db70-452f-a97f-68d844244da8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Security group member updated ['bfad329a-0ea3-4b02-8e91-9d15749f8c9b']
Dec 06 10:14:24 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:24.092 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:23Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6823d30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6823ac0>], id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62, ip_allocation=immediate, mac_address=fa:16:3e:0e:f5:37, name=tempest-parent-876689022, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:13:59Z, description=, dns_domain=, id=47d636a7-c520-4320-aa94-bfb41f418584, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1313845827-network, port_security_enabled=True, project_id=7897d6398eb64eb29c66df8db792e581, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16795, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=240, status=ACTIVE, subnets=['1f85bb5d-01b8-4e29-bdbf-5aebcf31d657'], tags=[], tenant_id=7897d6398eb64eb29c66df8db792e581, updated_at=2025-12-06T10:14:00Z, vlan_transparent=None, network_id=47d636a7-c520-4320-aa94-bfb41f418584, port_security_enabled=True, project_id=7897d6398eb64eb29c66df8db792e581, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['bfad329a-0ea3-4b02-8e91-9d15749f8c9b'], standard_attr_id=435, status=DOWN, tags=[], tenant_id=7897d6398eb64eb29c66df8db792e581, updated_at=2025-12-06T10:14:23Z on network 47d636a7-c520-4320-aa94-bfb41f418584
Dec 06 10:14:24 np0005548788.localdomain ceph-mon[293643]: pgmap v92: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 1.8 MiB/s wr, 99 op/s
Dec 06 10:14:24 np0005548788.localdomain dnsmasq[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/addn_hosts - 2 addresses
Dec 06 10:14:24 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/host
Dec 06 10:14:24 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/opts
Dec 06 10:14:24 np0005548788.localdomain podman[310476]: 2025-12-06 10:14:24.376007685 +0000 UTC m=+0.078922398 container kill 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:24 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:24.695 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:23Z, description=, device_id=0e54fb37-e53e-4ada-9f5a-9b02f9c2b583, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c686b400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c686bbb0>], id=55091d27-ca63-4aae-a8b0-1fbec5b1996b, ip_allocation=immediate, mac_address=fa:16:3e:82:86:5d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:18Z, description=, dns_domain=, id=47c8dfd1-b65f-41a2-985d-f7571459a4b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-295162459-network, port_security_enabled=True, project_id=206fcaa64fd14d0ea5fb23a017c43692, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13980, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=385, status=ACTIVE, subnets=['3420c26d-db14-4a4b-9fd2-6e48760d8d2d'], tags=[], tenant_id=206fcaa64fd14d0ea5fb23a017c43692, updated_at=2025-12-06T10:14:19Z, vlan_transparent=None, network_id=47c8dfd1-b65f-41a2-985d-f7571459a4b1, port_security_enabled=False, project_id=206fcaa64fd14d0ea5fb23a017c43692, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=427, status=DOWN, tags=[], tenant_id=206fcaa64fd14d0ea5fb23a017c43692, updated_at=2025-12-06T10:14:23Z on network 47c8dfd1-b65f-41a2-985d-f7571459a4b1
Dec 06 10:14:24 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:24.702 262572 INFO neutron.agent.dhcp.agent [None req-4ce1c120-628b-4727-b9ea-dd7cbd650815 - - - - - -] DHCP configuration for ports {'e87832d3-ffc3-44e0-9f77-cd2eb6073d62'} is completed
Dec 06 10:14:24 np0005548788.localdomain dnsmasq[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/addn_hosts - 1 addresses
Dec 06 10:14:24 np0005548788.localdomain dnsmasq-dhcp[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/host
Dec 06 10:14:24 np0005548788.localdomain dnsmasq-dhcp[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/opts
Dec 06 10:14:24 np0005548788.localdomain podman[310512]: 2025-12-06 10:14:24.923334797 +0000 UTC m=+0.064399752 container kill 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:25 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:25.187 262572 INFO neutron.agent.dhcp.agent [None req-d8a99692-4cb6-4720-aa6e-17f8dadc2a38 - - - - - -] DHCP configuration for ports {'55091d27-ca63-4aae-a8b0-1fbec5b1996b'} is completed
Dec 06 10:14:26 np0005548788.localdomain ceph-mon[293643]: pgmap v93: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 1.8 MiB/s wr, 99 op/s
Dec 06 10:14:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:14:27 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:27.895 262572 INFO neutron.agent.linux.ip_lib [None req-90e04818-009d-479b-827e-31207db27c01 - - - - - -] Device tap791cf6d4-c9 cannot be used as it has no MAC address
Dec 06 10:14:27 np0005548788.localdomain systemd[1]: tmp-crun.KGxFJ2.mount: Deactivated successfully.
Dec 06 10:14:27 np0005548788.localdomain podman[310536]: 2025-12-06 10:14:27.966249656 +0000 UTC m=+0.150809707 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Dec 06 10:14:27 np0005548788.localdomain kernel: device tap791cf6d4-c9 entered promiscuous mode
Dec 06 10:14:27 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016067.9762] manager: (tap791cf6d4-c9): new Generic device (/org/freedesktop/NetworkManager/Devices/16)
Dec 06 10:14:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:27Z|00041|binding|INFO|Claiming lport 791cf6d4-c971-4e4f-960d-1a47244446a2 for this chassis.
Dec 06 10:14:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:27Z|00042|binding|INFO|791cf6d4-c971-4e4f-960d-1a47244446a2: Claiming unknown
Dec 06 10:14:27 np0005548788.localdomain ceph-mon[293643]: pgmap v94: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 1.8 MiB/s wr, 99 op/s
Dec 06 10:14:27 np0005548788.localdomain systemd-udevd[310562]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:27.988 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-45604602-bc87-4608-9881-9568cbf90870', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45604602-bc87-4608-9881-9568cbf90870', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9167331b2c424ef6961b096b551f8434', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d40d335f-7e85-43c3-894d-993c12735497, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=791cf6d4-c971-4e4f-960d-1a47244446a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:27.990 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 791cf6d4-c971-4e4f-960d-1a47244446a2 in datapath 45604602-bc87-4608-9881-9568cbf90870 bound to our chassis
Dec 06 10:14:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:27.991 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 45604602-bc87-4608-9881-9568cbf90870 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:14:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:27.993 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[8b8fd4b7-dbc7-48f1-873a-2e656d4db24f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap791cf6d4-c9: No such device
Dec 06 10:14:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap791cf6d4-c9: No such device
Dec 06 10:14:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:28Z|00043|binding|INFO|Setting lport 791cf6d4-c971-4e4f-960d-1a47244446a2 ovn-installed in OVS
Dec 06 10:14:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:28Z|00044|binding|INFO|Setting lport 791cf6d4-c971-4e4f-960d-1a47244446a2 up in Southbound
Dec 06 10:14:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap791cf6d4-c9: No such device
Dec 06 10:14:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap791cf6d4-c9: No such device
Dec 06 10:14:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap791cf6d4-c9: No such device
Dec 06 10:14:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap791cf6d4-c9: No such device
Dec 06 10:14:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap791cf6d4-c9: No such device
Dec 06 10:14:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap791cf6d4-c9: No such device
Dec 06 10:14:28 np0005548788.localdomain podman[310536]: 2025-12-06 10:14:28.047601536 +0000 UTC m=+0.232161617 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:14:28 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:14:28 np0005548788.localdomain podman[310633]: 2025-12-06 10:14:28.691709301 +0000 UTC m=+0.071657905 container kill 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:14:28 np0005548788.localdomain dnsmasq[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/addn_hosts - 0 addresses
Dec 06 10:14:28 np0005548788.localdomain dnsmasq-dhcp[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/host
Dec 06 10:14:28 np0005548788.localdomain dnsmasq-dhcp[310420]: read /var/lib/neutron/dhcp/47c8dfd1-b65f-41a2-985d-f7571459a4b1/opts
Dec 06 10:14:28 np0005548788.localdomain podman[310677]: 
Dec 06 10:14:28 np0005548788.localdomain podman[310677]: 2025-12-06 10:14:28.933758159 +0000 UTC m=+0.056263383 container create f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:14:28 np0005548788.localdomain systemd[1]: Started libpod-conmon-f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd.scope.
Dec 06 10:14:28 np0005548788.localdomain kernel: device tap8c3bec90-d4 left promiscuous mode
Dec 06 10:14:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:28Z|00045|binding|INFO|Releasing lport 8c3bec90-d45b-4579-9bd1-e89eb803ef3d from this chassis (sb_readonly=0)
Dec 06 10:14:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:28Z|00046|binding|INFO|Setting lport 8c3bec90-d45b-4579-9bd1-e89eb803ef3d down in Southbound
Dec 06 10:14:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:28.993 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-47c8dfd1-b65f-41a2-985d-f7571459a4b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47c8dfd1-b65f-41a2-985d-f7571459a4b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '206fcaa64fd14d0ea5fb23a017c43692', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1418a735-1c46-4b65-b015-90af5f7ddbf4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=8c3bec90-d45b-4579-9bd1-e89eb803ef3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:28.995 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 8c3bec90-d45b-4579-9bd1-e89eb803ef3d in datapath 47c8dfd1-b65f-41a2-985d-f7571459a4b1 unbound from our chassis
Dec 06 10:14:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:28.998 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47c8dfd1-b65f-41a2-985d-f7571459a4b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:14:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:28.999 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[6fda0366-383c-4e0b-9b3e-50468808e803]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:29 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:29 np0005548788.localdomain podman[310677]: 2025-12-06 10:14:28.907883797 +0000 UTC m=+0.030389031 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:29 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44745714ded2a5c6c27641d7523732339da4255f252ccd80706818d6e03efb3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:29 np0005548788.localdomain podman[310677]: 2025-12-06 10:14:29.023355642 +0000 UTC m=+0.145860886 container init f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:29 np0005548788.localdomain podman[310677]: 2025-12-06 10:14:29.033021438 +0000 UTC m=+0.155526682 container start f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:14:29 np0005548788.localdomain dnsmasq[310698]: started, version 2.85 cachesize 150
Dec 06 10:14:29 np0005548788.localdomain dnsmasq[310698]: DNS service limited to local subnets
Dec 06 10:14:29 np0005548788.localdomain dnsmasq[310698]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:29 np0005548788.localdomain dnsmasq[310698]: warning: no upstream servers configured
Dec 06 10:14:29 np0005548788.localdomain dnsmasq-dhcp[310698]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:29 np0005548788.localdomain dnsmasq[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/addn_hosts - 0 addresses
Dec 06 10:14:29 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/host
Dec 06 10:14:29 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/opts
Dec 06 10:14:29 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:29.163 262572 INFO neutron.agent.dhcp.agent [None req-90c87304-618a-46ee-a41a-0c941f6a10e4 - - - - - -] DHCP configuration for ports {'d57132cf-ea52-419a-82d6-37dcdb5dd89a'} is completed
Dec 06 10:14:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:14:29.200 2 INFO neutron.agent.securitygroups_rpc [None req-42741e53-1189-4d3e-a617-18fc0438f9c5 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Security group member updated ['bfad329a-0ea3-4b02-8e91-9d15749f8c9b']
Dec 06 10:14:30 np0005548788.localdomain ceph-mon[293643]: pgmap v95: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Dec 06 10:14:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:32 np0005548788.localdomain ceph-mon[293643]: pgmap v96: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:14:32 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3766673990' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:14:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:14:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:14:32 np0005548788.localdomain podman[310701]: 2025-12-06 10:14:32.24462832 +0000 UTC m=+0.074778019 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:32 np0005548788.localdomain systemd[1]: tmp-crun.iOmEMT.mount: Deactivated successfully.
Dec 06 10:14:32 np0005548788.localdomain podman[310703]: 2025-12-06 10:14:32.263303682 +0000 UTC m=+0.085068415 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Dec 06 10:14:32 np0005548788.localdomain podman[310703]: 2025-12-06 10:14:32.30573298 +0000 UTC m=+0.127497743 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Dec 06 10:14:32 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:14:32 np0005548788.localdomain podman[310702]: 2025-12-06 10:14:32.32138834 +0000 UTC m=+0.146089023 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:14:32 np0005548788.localdomain podman[310701]: 2025-12-06 10:14:32.342507316 +0000 UTC m=+0.172657015 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute)
Dec 06 10:14:32 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:14:32 np0005548788.localdomain podman[310702]: 2025-12-06 10:14:32.358720962 +0000 UTC m=+0.183421575 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:14:32 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:14:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:32.381 281009 DEBUG nova.virt.libvirt.driver [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 06 10:14:33 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/4228215768' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:33 np0005548788.localdomain dnsmasq[310420]: exiting on receipt of SIGTERM
Dec 06 10:14:33 np0005548788.localdomain podman[310778]: 2025-12-06 10:14:33.442299649 +0000 UTC m=+0.042383288 container kill 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:14:33 np0005548788.localdomain systemd[1]: tmp-crun.vbrKnh.mount: Deactivated successfully.
Dec 06 10:14:33 np0005548788.localdomain systemd[1]: libpod-937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e.scope: Deactivated successfully.
Dec 06 10:14:33 np0005548788.localdomain podman[310793]: 2025-12-06 10:14:33.497574651 +0000 UTC m=+0.040584974 container died 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:14:33 np0005548788.localdomain systemd[1]: tmp-crun.1HfmPz.mount: Deactivated successfully.
Dec 06 10:14:33 np0005548788.localdomain podman[310793]: 2025-12-06 10:14:33.555853624 +0000 UTC m=+0.098863907 container remove 937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47c8dfd1-b65f-41a2-985d-f7571459a4b1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:14:33 np0005548788.localdomain systemd[1]: libpod-conmon-937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e.scope: Deactivated successfully.
Dec 06 10:14:33 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:14:33 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:33.846 262572 INFO neutron.agent.dhcp.agent [None req-ec9c5994-ea7b-4e74-b382-eac4610d3a88 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:14:33 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:33.852 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:33Z, description=, device_id=fcb0956b-3e0b-42ed-82cc-dda3a3b5cf85, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68c6820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68c66d0>], id=3de26a41-f1d0-42b2-a318-ae5c9e237ba9, ip_allocation=immediate, mac_address=fa:16:3e:0d:84:8e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:25Z, description=, dns_domain=, id=45604602-bc87-4608-9881-9568cbf90870, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-802114316-network, port_security_enabled=True, project_id=9167331b2c424ef6961b096b551f8434, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=60021, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=449, status=ACTIVE, subnets=['8d94ac4b-ae25-428b-9b88-b56a3fd37a8e'], tags=[], tenant_id=9167331b2c424ef6961b096b551f8434, updated_at=2025-12-06T10:14:26Z, vlan_transparent=None, network_id=45604602-bc87-4608-9881-9568cbf90870, port_security_enabled=False, project_id=9167331b2c424ef6961b096b551f8434, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=492, status=DOWN, tags=[], tenant_id=9167331b2c424ef6961b096b551f8434, updated_at=2025-12-06T10:14:33Z on network 45604602-bc87-4608-9881-9568cbf90870
Dec 06 10:14:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:34.028 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:34.028 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:14:34 np0005548788.localdomain dnsmasq[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/addn_hosts - 1 addresses
Dec 06 10:14:34 np0005548788.localdomain podman[310837]: 2025-12-06 10:14:34.087140656 +0000 UTC m=+0.048637081 container kill f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:14:34 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/host
Dec 06 10:14:34 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/opts
Dec 06 10:14:34 np0005548788.localdomain ceph-mon[293643]: pgmap v97: 177 pgs: 177 active+clean; 213 MiB data, 849 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 116 op/s
Dec 06 10:14:34 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:34.345 262572 INFO neutron.agent.dhcp.agent [None req-90be5b03-f191-476a-8a97-197ed8ece30c - - - - - -] DHCP configuration for ports {'3de26a41-f1d0-42b2-a318-ae5c9e237ba9'} is completed
Dec 06 10:14:34 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:34.388 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:14:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-47de62f4e4c01796363f2ef606e6915823850733203b2883a60d8ea290261e05-merged.mount: Deactivated successfully.
Dec 06 10:14:34 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-937fd50156820bc38334b861dfcc4102f3637a9f14516913fa7d64fb51606a2e-userdata-shm.mount: Deactivated successfully.
Dec 06 10:14:34 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d47c8dfd1\x2db65f\x2d41a2\x2d985d\x2df7571459a4b1.mount: Deactivated successfully.
Dec 06 10:14:35 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:35.041 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:14:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/838118298' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:36.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:36.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:36.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:36 np0005548788.localdomain ceph-mon[293643]: pgmap v98: 177 pgs: 177 active+clean; 213 MiB data, 849 MiB used, 41 GiB / 42 GiB avail; 475 KiB/s rd, 2.1 MiB/s wr, 50 op/s
Dec 06 10:14:36 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:36.450 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:33Z, description=, device_id=fcb0956b-3e0b-42ed-82cc-dda3a3b5cf85, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6838940>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6838460>], id=3de26a41-f1d0-42b2-a318-ae5c9e237ba9, ip_allocation=immediate, mac_address=fa:16:3e:0d:84:8e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:25Z, description=, dns_domain=, id=45604602-bc87-4608-9881-9568cbf90870, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-802114316-network, port_security_enabled=True, project_id=9167331b2c424ef6961b096b551f8434, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=60021, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=449, status=ACTIVE, subnets=['8d94ac4b-ae25-428b-9b88-b56a3fd37a8e'], tags=[], tenant_id=9167331b2c424ef6961b096b551f8434, updated_at=2025-12-06T10:14:26Z, vlan_transparent=None, network_id=45604602-bc87-4608-9881-9568cbf90870, port_security_enabled=False, project_id=9167331b2c424ef6961b096b551f8434, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=492, status=DOWN, tags=[], tenant_id=9167331b2c424ef6961b096b551f8434, updated_at=2025-12-06T10:14:33Z on network 45604602-bc87-4608-9881-9568cbf90870
Dec 06 10:14:36 np0005548788.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 06 10:14:36 np0005548788.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000006.scope: Consumed 13.990s CPU time.
Dec 06 10:14:36 np0005548788.localdomain systemd-machined[202859]: Machine qemu-1-instance-00000006 terminated.
Dec 06 10:14:36 np0005548788.localdomain podman[310875]: 2025-12-06 10:14:36.674277514 +0000 UTC m=+0.058861963 container kill f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:36 np0005548788.localdomain dnsmasq[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/addn_hosts - 1 addresses
Dec 06 10:14:36 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/host
Dec 06 10:14:36 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/opts
Dec 06 10:14:36 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:36.894 262572 INFO neutron.agent.dhcp.agent [None req-99ef9b71-01a0-4a23-878b-7475f05038dc - - - - - -] DHCP configuration for ports {'3de26a41-f1d0-42b2-a318-ae5c9e237ba9'} is completed
Dec 06 10:14:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:37 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:37.252 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005548790.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:23Z, description=, device_id=87dc2ce3-2b16-4764-9803-711c2d12c20f, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6960d60>], dns_domain=, dns_name=tempest-liveautoblockmigrationv225test-server-1999616987, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68fd130>], id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62, ip_allocation=immediate, mac_address=fa:16:3e:0e:f5:37, name=tempest-parent-876689022, network_id=47d636a7-c520-4320-aa94-bfb41f418584, port_security_enabled=True, project_id=7897d6398eb64eb29c66df8db792e581, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['bfad329a-0ea3-4b02-8e91-9d15749f8c9b'], standard_attr_id=435, status=DOWN, tags=[], tenant_id=7897d6398eb64eb29c66df8db792e581, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6960400>], trunk_id=e769d97e-a772-49f9-b3b5-017ea160d521, updated_at=2025-12-06T10:14:36Z on network 47d636a7-c520-4320-aa94-bfb41f418584
Dec 06 10:14:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:37.406 281009 INFO nova.virt.libvirt.driver [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance shutdown successfully after 15 seconds.
Dec 06 10:14:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:37.414 281009 INFO nova.virt.libvirt.driver [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance destroyed successfully.
Dec 06 10:14:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:37.415 281009 DEBUG nova.objects.instance [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lazy-loading 'numa_topology' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:37 np0005548788.localdomain dnsmasq[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/addn_hosts - 2 addresses
Dec 06 10:14:37 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/host
Dec 06 10:14:37 np0005548788.localdomain podman[310914]: 2025-12-06 10:14:37.491298931 +0000 UTC m=+0.064030661 container kill 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:14:37 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/opts
Dec 06 10:14:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:37.499 281009 INFO nova.virt.libvirt.driver [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Beginning cold snapshot process
Dec 06 10:14:37 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:37.726 262572 INFO neutron.agent.dhcp.agent [None req-f20dbbff-7efa-4e49-81df-027efde472ac - - - - - -] DHCP configuration for ports {'e87832d3-ffc3-44e0-9f77-cd2eb6073d62'} is completed
Dec 06 10:14:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:37.873 281009 DEBUG nova.virt.libvirt.imagebackend [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] No parent info for 6a944ab6-8965-4055-b7fc-af6e395005ea; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Dec 06 10:14:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:37.908 281009 DEBUG nova.storage.rbd_utils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] creating snapshot(6dcffb74a2e24e4fb6ad1d2116dd5c05) on rbd image(3d34a856-7613-4158-b859-fb3089fe3bc7_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec 06 10:14:37 np0005548788.localdomain ceph-mon[293643]: pgmap v99: 177 pgs: 177 active+clean; 213 MiB data, 849 MiB used, 41 GiB / 42 GiB avail; 475 KiB/s rd, 2.1 MiB/s wr, 50 op/s
Dec 06 10:14:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:38.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:14:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:14:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:14:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:14:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:14:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e103 do_prune osdmap full prune enabled
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e104 e104: 6 total, 6 up, 6 in
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e104: 6 total, 6 up, 6 in
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:38.988814) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016078988851, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 964, "num_deletes": 252, "total_data_size": 876550, "memory_usage": 893736, "flush_reason": "Manual Compaction"}
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4100360430' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016078997088, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 849236, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24512, "largest_seqno": 25475, "table_properties": {"data_size": 844735, "index_size": 2164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10723, "raw_average_key_size": 20, "raw_value_size": 835388, "raw_average_value_size": 1634, "num_data_blocks": 90, "num_entries": 511, "num_filter_entries": 511, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016021, "oldest_key_time": 1765016021, "file_creation_time": 1765016078, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 8351 microseconds, and 3380 cpu microseconds.
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:14:38 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:38.997151) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 849236 bytes OK
Dec 06 10:14:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:39.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:38.997189) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:38.999172) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:38.999256) EVENT_LOG_v1 {"time_micros": 1765016078999242, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:38.999287) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 871914, prev total WAL file size 871914, number of live WAL files 2.
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.000061) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(829KB)], [42(19MB)]
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079000111, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 21113965, "oldest_snapshot_seqno": -1}
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12153 keys, 19283302 bytes, temperature: kUnknown
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079100117, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 19283302, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19215727, "index_size": 36114, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30405, "raw_key_size": 326825, "raw_average_key_size": 26, "raw_value_size": 19010391, "raw_average_value_size": 1564, "num_data_blocks": 1367, "num_entries": 12153, "num_filter_entries": 12153, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016078, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.100713) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 19283302 bytes
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.103471) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.7 rd, 192.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 19.3 +0.0 blob) out(18.4 +0.0 blob), read-write-amplify(47.6) write-amplify(22.7) OK, records in: 12683, records dropped: 530 output_compression: NoCompression
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.103501) EVENT_LOG_v1 {"time_micros": 1765016079103488, "job": 24, "event": "compaction_finished", "compaction_time_micros": 100186, "compaction_time_cpu_micros": 52097, "output_level": 6, "num_output_files": 1, "total_output_size": 19283302, "num_input_records": 12683, "num_output_records": 12153, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079103817, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079106818, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:38.999891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.106943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.106951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.106954) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.106957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:14:39.106960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:39.169 281009 DEBUG nova.storage.rbd_utils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] cloning vms/3d34a856-7613-4158-b859-fb3089fe3bc7_disk@6dcffb74a2e24e4fb6ad1d2116dd5c05 to images/7141663c-a695-4147-a03d-20e8d4f67069 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec 06 10:14:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:39.379 281009 DEBUG nova.storage.rbd_utils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] flattening images/7141663c-a695-4147-a03d-20e8d4f67069 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:14:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1657712167' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:14:40 np0005548788.localdomain ceph-mon[293643]: osdmap e104: 6 total, 6 up, 6 in
Dec 06 10:14:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1657712167' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:14:40 np0005548788.localdomain ceph-mon[293643]: pgmap v101: 177 pgs: 177 active+clean; 257 MiB data, 870 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 3.8 MiB/s wr, 108 op/s
Dec 06 10:14:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/227781456' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/492198068' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.024 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.024 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquired lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.024 281009 DEBUG nova.network.neutron [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.025 281009 DEBUG nova.objects.instance [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.105 281009 DEBUG nova.network.neutron [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.301 281009 DEBUG nova.storage.rbd_utils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] removing snapshot(6dcffb74a2e24e4fb6ad1d2116dd5c05) on rbd image(3d34a856-7613-4158-b859-fb3089fe3bc7_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.341 281009 DEBUG nova.network.neutron [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.358 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Releasing lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.358 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.359 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:40.359 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.031 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.032 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.032 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.033 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.033 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e104 do_prune osdmap full prune enabled
Dec 06 10:14:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3664938098' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e105 e105: 6 total, 6 up, 6 in
Dec 06 10:14:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e105: 6 total, 6 up, 6 in
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.147 281009 DEBUG nova.storage.rbd_utils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] creating snapshot(snap) on rbd image(7141663c-a695-4147-a03d-20e8d4f67069) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec 06 10:14:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4088784122' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.509 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.585 281009 DEBUG nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.586 281009 DEBUG nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.751 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.752 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11860MB free_disk=41.64639663696289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.753 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.753 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.843 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.844 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.844 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=640MB phys_disk=41GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:14:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:41.876 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e105 do_prune osdmap full prune enabled
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: pgmap v102: 177 pgs: 177 active+clean; 257 MiB data, 870 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 3.8 MiB/s wr, 108 op/s
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: osdmap e105: 6 total, 6 up, 6 in
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/95423717' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4088784122' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e106 e106: 6 total, 6 up, 6 in
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e106: 6 total, 6 up, 6 in
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1318618644' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.342 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.350 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.398 281009 ERROR nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] [req-99163887-10ad-4012-a3b2-dd5d91ff0382] Failed to update inventory to [{'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 7413251f-98bc-4150-b1b6-b77ff1bcb5f1.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-99163887-10ad-4012-a3b2-dd5d91ff0382"}]}
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.421 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.448 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.448 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.473 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.512 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE4A,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.557 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.917 281009 INFO nova.virt.libvirt.driver [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Snapshot image upload complete
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.918 281009 DEBUG nova.compute.manager [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.975 281009 INFO nova.compute.manager [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Shelve offloading
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.985 281009 INFO nova.virt.libvirt.driver [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance destroyed successfully.
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.986 281009 DEBUG nova.compute.manager [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3370650055' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.990 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.990 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquired lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:14:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:42.990 281009 DEBUG nova.network.neutron [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.004 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.016 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.050 281009 DEBUG nova.network.neutron [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.058 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updated inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with generation 6 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.059 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 generation from 6 to 7 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.059 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.081 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.081 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:43 np0005548788.localdomain ceph-mon[293643]: osdmap e106: 6 total, 6 up, 6 in
Dec 06 10:14:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1318618644' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3370650055' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.500 281009 DEBUG nova.network.neutron [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.519 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Releasing lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.530 281009 INFO nova.virt.libvirt.driver [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance destroyed successfully.
Dec 06 10:14:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:43.530 281009 DEBUG nova.objects.instance [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lazy-loading 'resources' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:14:43 np0005548788.localdomain podman[311161]: 2025-12-06 10:14:43.691384213 +0000 UTC m=+0.081083573 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:14:43 np0005548788.localdomain podman[311161]: 2025-12-06 10:14:43.727748926 +0000 UTC m=+0.117448276 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:14:43 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:14:44 np0005548788.localdomain ceph-mon[293643]: pgmap v105: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 11 MiB/s wr, 304 op/s
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.145 281009 INFO nova.virt.libvirt.driver [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Deleting instance files /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7_del
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.146 281009 INFO nova.virt.libvirt.driver [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Deletion of /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7_del complete
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.233 281009 DEBUG nova.virt.libvirt.host [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.233 281009 INFO nova.virt.libvirt.host [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] UEFI support detected
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.280 281009 INFO nova.scheduler.client.report [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Deleted allocations for instance 3d34a856-7613-4158-b859-fb3089fe3bc7
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.340 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.340 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.381 281009 DEBUG oslo_concurrency.processutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:44.823 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2238572858' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:44.826 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.851 281009 DEBUG oslo_concurrency.processutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.860 281009 DEBUG nova.compute.provider_tree [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.893 281009 DEBUG nova.scheduler.client.report [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.933 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:44.979 281009 DEBUG oslo_concurrency.lockutils [None req-a65736b6-8388-4cf5-a7a5-10b9e1bba322 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "3d34a856-7613-4158-b859-fb3089fe3bc7" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 22.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:45 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2238572858' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:46 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:14:46.016 2 INFO neutron.agent.securitygroups_rpc [req-842dc70c-4c90-4d04-97b8-ca0a150f47f3 req-694d7e2d-322f-485d-ac12-0a632bb0d8f8 3a50fae64027482ba5b10005ed97189e 024b6fbc052c4ed7a93c855bd2ae77da - - default default] Security group rule updated ['e6cef3ed-f2f1-4e9f-8bb7-b8303074aa1b']
Dec 06 10:14:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:14:46 np0005548788.localdomain ceph-mon[293643]: pgmap v106: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 9.2 MiB/s wr, 208 op/s
Dec 06 10:14:46 np0005548788.localdomain podman[311203]: 2025-12-06 10:14:46.262236063 +0000 UTC m=+0.089508121 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:14:46 np0005548788.localdomain podman[311203]: 2025-12-06 10:14:46.297575505 +0000 UTC m=+0.124847543 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:14:46 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:14:46 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:14:46.817 2 INFO neutron.agent.securitygroups_rpc [req-3aac6f95-4738-40dc-9407-49685a717c88 req-a8e1311a-c6c3-4f2f-8fef-2b7b3e5084e1 3a50fae64027482ba5b10005ed97189e 024b6fbc052c4ed7a93c855bd2ae77da - - default default] Security group rule updated ['e6cef3ed-f2f1-4e9f-8bb7-b8303074aa1b']
Dec 06 10:14:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e106 do_prune osdmap full prune enabled
Dec 06 10:14:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e107 e107: 6 total, 6 up, 6 in
Dec 06 10:14:46 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e107: 6 total, 6 up, 6 in
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.287 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Acquiring lock "3d34a856-7613-4158-b859-fb3089fe3bc7" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.288 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lock "3d34a856-7613-4158-b859-fb3089fe3bc7" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.288 281009 INFO nova.compute.manager [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Unshelving
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.381 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.382 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.384 281009 DEBUG nova.objects.instance [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lazy-loading 'pci_requests' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.401 281009 DEBUG nova.objects.instance [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lazy-loading 'numa_topology' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.416 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.416 281009 INFO nova.compute.claims [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Claim successful on node np0005548788.localdomain
Dec 06 10:14:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:47.438 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:47.438 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:47.438 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:47.529 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:14:47 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:47.747 262572 INFO neutron.agent.linux.ip_lib [None req-bef9de67-d34b-49fd-a29c-3fe98e272ded - - - - - -] Device tap42458b72-69 cannot be used as it has no MAC address
Dec 06 10:14:47 np0005548788.localdomain podman[311229]: 2025-12-06 10:14:47.77743046 +0000 UTC m=+0.106282405 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 06 10:14:47 np0005548788.localdomain kernel: device tap42458b72-69 entered promiscuous mode
Dec 06 10:14:47 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016087.8357] manager: (tap42458b72-69): new Generic device (/org/freedesktop/NetworkManager/Devices/17)
Dec 06 10:14:47 np0005548788.localdomain systemd-udevd[311272]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:47 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:47Z|00047|binding|INFO|Claiming lport 42458b72-69f7-437c-a516-669c4984e830 for this chassis.
Dec 06 10:14:47 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:47Z|00048|binding|INFO|42458b72-69f7-437c-a516-669c4984e830: Claiming unknown
Dec 06 10:14:47 np0005548788.localdomain podman[311229]: 2025-12-06 10:14:47.859041427 +0000 UTC m=+0.187893392 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:14:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:47.857 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-8c344e67-0482-40ce-b72c-ef9b65d68fbc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c344e67-0482-40ce-b72c-ef9b65d68fbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '088ea8df069043ee8ed156bf735134b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e980c4c-0409-4cbd-bd81-e2db3d5bcca5, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=42458b72-69f7-437c-a516-669c4984e830) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:47.859 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 42458b72-69f7-437c-a516-669c4984e830 in datapath 8c344e67-0482-40ce-b72c-ef9b65d68fbc bound to our chassis
Dec 06 10:14:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:47.862 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8c344e67-0482-40ce-b72c-ef9b65d68fbc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:14:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:47.863 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a5624601-9fb6-4668-8e4f-ea922b26bfbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:47 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:14:47 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:47Z|00049|binding|INFO|Setting lport 42458b72-69f7-437c-a516-669c4984e830 ovn-installed in OVS
Dec 06 10:14:47 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:14:47Z|00050|binding|INFO|Setting lport 42458b72-69f7-437c-a516-669c4984e830 up in Southbound
Dec 06 10:14:47 np0005548788.localdomain ceph-mon[293643]: osdmap e107: 6 total, 6 up, 6 in
Dec 06 10:14:47 np0005548788.localdomain ceph-mon[293643]: pgmap v108: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 9.3 MiB/s wr, 210 op/s
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.030 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.039 281009 DEBUG nova.compute.provider_tree [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.060 281009 DEBUG nova.scheduler.client.report [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.094 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.172 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Acquiring lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.173 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Acquired lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.173 281009 DEBUG nova.network.neutron [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.244 281009 DEBUG nova.network.neutron [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.459 281009 DEBUG nova.network.neutron [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.483 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Releasing lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.485 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.485 281009 INFO nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Creating image(s)
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.522 281009 DEBUG nova.storage.rbd_utils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.526 281009 DEBUG nova.objects.instance [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.581 281009 DEBUG nova.storage.rbd_utils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:48 np0005548788.localdomain sshd[311342]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.625 281009 DEBUG nova.storage.rbd_utils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.630 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Acquiring lock "9a4c9a53eddbf462557c06c6d0251712915b11ba" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.631 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lock "9a4c9a53eddbf462557c06c6d0251712915b11ba" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.679 281009 DEBUG nova.virt.libvirt.imagebackend [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Image locations are: [{'url': 'rbd://1939e851-b10c-5c3b-9bb7-8e7f380233e8/images/7141663c-a695-4147-a03d-20e8d4f67069/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://1939e851-b10c-5c3b-9bb7-8e7f380233e8/images/7141663c-a695-4147-a03d-20e8d4f67069/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.766 281009 DEBUG nova.virt.libvirt.imagebackend [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Selected location: {'url': 'rbd://1939e851-b10c-5c3b-9bb7-8e7f380233e8/images/7141663c-a695-4147-a03d-20e8d4f67069/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.767 281009 DEBUG nova.storage.rbd_utils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] cloning images/7141663c-a695-4147-a03d-20e8d4f67069@snap to None/3d34a856-7613-4158-b859-fb3089fe3bc7_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec 06 10:14:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:14:48.829 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:14:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:48.971 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lock "9a4c9a53eddbf462557c06c6d0251712915b11ba" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1632793843' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:48 np0005548788.localdomain podman[311452]: 
Dec 06 10:14:49 np0005548788.localdomain podman[311452]: 2025-12-06 10:14:49.006907462 +0000 UTC m=+0.089129019 container create 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:49 np0005548788.localdomain systemd[1]: Started libpod-conmon-5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330.scope.
Dec 06 10:14:49 np0005548788.localdomain podman[311452]: 2025-12-06 10:14:48.971018713 +0000 UTC m=+0.053240320 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:49 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:49 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/439e3fd40f447976b905f7f7b88b6b6c3a05b830bbf38a0c7e415d883268a6b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:49 np0005548788.localdomain podman[311452]: 2025-12-06 10:14:49.095475332 +0000 UTC m=+0.177696879 container init 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:14:49 np0005548788.localdomain podman[311452]: 2025-12-06 10:14:49.105893881 +0000 UTC m=+0.188115458 container start 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:49 np0005548788.localdomain dnsmasq[311507]: started, version 2.85 cachesize 150
Dec 06 10:14:49 np0005548788.localdomain dnsmasq[311507]: DNS service limited to local subnets
Dec 06 10:14:49 np0005548788.localdomain dnsmasq[311507]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:49 np0005548788.localdomain dnsmasq[311507]: warning: no upstream servers configured
Dec 06 10:14:49 np0005548788.localdomain dnsmasq-dhcp[311507]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:49 np0005548788.localdomain dnsmasq[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/addn_hosts - 0 addresses
Dec 06 10:14:49 np0005548788.localdomain dnsmasq-dhcp[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/host
Dec 06 10:14:49 np0005548788.localdomain dnsmasq-dhcp[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/opts
Dec 06 10:14:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:49.229 281009 DEBUG nova.objects.instance [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lazy-loading 'migration_context' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:49 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:49.254 262572 INFO neutron.agent.dhcp.agent [None req-74367a54-cb18-452f-bec2-14e17a0e9362 - - - - - -] DHCP configuration for ports {'69e32f6a-50c9-4381-8616-34895e08e962'} is completed
Dec 06 10:14:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:49.322 281009 DEBUG nova.storage.rbd_utils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] flattening vms/3d34a856-7613-4158-b859-fb3089fe3bc7_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Dec 06 10:14:49 np0005548788.localdomain sshd[311342]: Received disconnect from 193.46.255.244 port 17576:11:  [preauth]
Dec 06 10:14:49 np0005548788.localdomain sshd[311342]: Disconnected from authenticating user root 193.46.255.244 port 17576 [preauth]
Dec 06 10:14:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:14:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:14:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:14:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160394 "" "Go-http-client/1.1"
Dec 06 10:14:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:14:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20138 "" "Go-http-client/1.1"
Dec 06 10:14:50 np0005548788.localdomain ceph-mon[293643]: pgmap v109: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.0 MiB/s wr, 263 op/s
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.207 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Image rbd:vms/3d34a856-7613-4158-b859-fb3089fe3bc7_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.208 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.209 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Ensure instance console log exists: /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.209 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.210 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.210 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.212 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-12-06T10:14:22Z,direct_url=<?>,disk_format='raw',id=7141663c-a695-4147-a03d-20e8d4f67069,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-65395191-shelved',owner='c6d84801a8b44d9da497e9761a0cd10c',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-12-06T10:14:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'size': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.218 281009 WARNING nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.220 281009 DEBUG nova.virt.libvirt.host [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Searching host: 'np0005548788.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.221 281009 DEBUG nova.virt.libvirt.host [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.222 281009 DEBUG nova.virt.libvirt.host [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Searching host: 'np0005548788.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.223 281009 DEBUG nova.virt.libvirt.host [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.224 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.224 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:13:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a0a7498e-22eb-495c-a2e3-89ba9e483bf6',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-12-06T10:14:22Z,direct_url=<?>,disk_format='raw',id=7141663c-a695-4147-a03d-20e8d4f67069,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-65395191-shelved',owner='c6d84801a8b44d9da497e9761a0cd10c',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-12-06T10:14:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.225 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.225 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.225 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.226 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.226 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.226 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.227 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.227 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.227 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.227 281009 DEBUG nova.virt.hardware [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.228 281009 DEBUG nova.objects.instance [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.248 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:50 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:14:50 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2395424858' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.714 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.757 281009 DEBUG nova.storage.rbd_utils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:50.763 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:51 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2395424858' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:14:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/840869338' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.330 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.334 281009 DEBUG nova.objects.instance [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.358 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] End _get_guest_xml xml=<domain type="kvm">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <uuid>3d34a856-7613-4158-b859-fb3089fe3bc7</uuid>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <name>instance-00000006</name>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <memory>131072</memory>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <vcpu>1</vcpu>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <metadata>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-65395191</nova:name>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <nova:creationTime>2025-12-06 10:14:50</nova:creationTime>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <nova:flavor name="m1.nano">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <nova:memory>128</nova:memory>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <nova:disk>1</nova:disk>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <nova:swap>0</nova:swap>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <nova:ephemeral>0</nova:ephemeral>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <nova:vcpus>1</nova:vcpus>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       </nova:flavor>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <nova:owner>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <nova:user uuid="496ca8bf29dc4e81ba0b08a592dc45d3">tempest-UnshelveToHostMultiNodesTest-912460009-project-member</nova:user>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <nova:project uuid="c6d84801a8b44d9da497e9761a0cd10c">tempest-UnshelveToHostMultiNodesTest-912460009</nova:project>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       </nova:owner>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <nova:root type="image" uuid="7141663c-a695-4147-a03d-20e8d4f67069"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <nova:ports/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     </nova:instance>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   </metadata>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <sysinfo type="smbios">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <system>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <entry name="manufacturer">RDO</entry>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <entry name="product">OpenStack Compute</entry>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <entry name="serial">3d34a856-7613-4158-b859-fb3089fe3bc7</entry>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <entry name="uuid">3d34a856-7613-4158-b859-fb3089fe3bc7</entry>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <entry name="family">Virtual Machine</entry>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     </system>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   </sysinfo>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <os>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <boot dev="hd"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <smbios mode="sysinfo"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   </os>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <features>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <acpi/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <apic/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <vmcoreinfo/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   </features>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <clock offset="utc">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <timer name="pit" tickpolicy="delay"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <timer name="hpet" present="no"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   </clock>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <cpu mode="host-model" match="exact">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <topology sockets="1" cores="1" threads="1"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   </cpu>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   <devices>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <disk type="network" device="disk">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <driver type="raw" cache="none"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <source protocol="rbd" name="vms/3d34a856-7613-4158-b859-fb3089fe3bc7_disk">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.103" port="6789"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.104" port="6789"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.105" port="6789"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       </source>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <auth username="openstack">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <secret type="ceph" uuid="1939e851-b10c-5c3b-9bb7-8e7f380233e8"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       </auth>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <target dev="vda" bus="virtio"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <disk type="network" device="cdrom">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <driver type="raw" cache="none"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <source protocol="rbd" name="vms/3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.103" port="6789"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.104" port="6789"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.105" port="6789"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       </source>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <auth username="openstack">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:         <secret type="ceph" uuid="1939e851-b10c-5c3b-9bb7-8e7f380233e8"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       </auth>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <target dev="sda" bus="sata"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <serial type="pty">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <log file="/var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/console.log" append="off"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     </serial>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <video>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <model type="virtio"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     </video>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <input type="tablet" bus="usb"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <input type="keyboard" bus="usb"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <rng model="virtio">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <backend model="random">/dev/urandom</backend>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     </rng>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <controller type="usb" index="0"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     <memballoon model="virtio">
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:       <stats period="10"/>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:     </memballoon>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:   </devices>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: </domain>
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.410 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.411 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.412 281009 INFO nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Using config drive
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.450 281009 DEBUG nova.storage.rbd_utils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.491 281009 DEBUG nova.objects.instance [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.529 281009 DEBUG nova.objects.instance [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lazy-loading 'keypairs' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.635 281009 INFO nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Creating config drive at /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.646 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnopmjdfp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.781 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnopmjdfp" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.823 281009 DEBUG nova.storage.rbd_utils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] rbd image 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.829 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.851 281009 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765016076.797151, 3d34a856-7613-4158-b859-fb3089fe3bc7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.851 281009 INFO nova.compute.manager [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] VM Stopped (Lifecycle Event)
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.874 281009 DEBUG nova.compute.manager [None req-7a626966-280b-44f4-8f8f-8ca09e0daf5b - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.879 281009 DEBUG nova.compute.manager [None req-7a626966-280b-44f4-8f8f-8ca09e0daf5b - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:14:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:51.896 281009 INFO nova.compute.manager [None req-7a626966-280b-44f4-8f8f-8ca09e0daf5b - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 06 10:14:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.037 281009 DEBUG oslo_concurrency.processutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config 3d34a856-7613-4158-b859-fb3089fe3bc7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.038 281009 INFO nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Deleting local config drive /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7/disk.config because it was imported into RBD.
Dec 06 10:14:52 np0005548788.localdomain systemd-machined[202859]: New machine qemu-2-instance-00000006.
Dec 06 10:14:52 np0005548788.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000006.
Dec 06 10:14:52 np0005548788.localdomain ceph-mon[293643]: pgmap v110: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 6.2 MiB/s wr, 234 op/s
Dec 06 10:14:52 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/840869338' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.427 281009 DEBUG nova.virt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Emitting event <LifecycleEvent: 1765016092.4275544, 3d34a856-7613-4158-b859-fb3089fe3bc7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.428 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] VM Resumed (Lifecycle Event)
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.433 281009 DEBUG nova.compute.manager [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.434 281009 DEBUG nova.virt.libvirt.driver [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.438 281009 INFO nova.virt.libvirt.driver [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance spawned successfully.
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.458 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.462 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.487 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.487 281009 DEBUG nova.virt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Emitting event <LifecycleEvent: 1765016092.4340918, 3d34a856-7613-4158-b859-fb3089fe3bc7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.488 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] VM Started (Lifecycle Event)
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.508 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.512 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:14:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:52.530 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 06 10:14:52 np0005548788.localdomain sudo[311742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:14:52 np0005548788.localdomain sudo[311742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:14:52 np0005548788.localdomain sudo[311742]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:52 np0005548788.localdomain sudo[311760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:14:52 np0005548788.localdomain sudo[311760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:14:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e107 do_prune osdmap full prune enabled
Dec 06 10:14:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e108 e108: 6 total, 6 up, 6 in
Dec 06 10:14:53 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e108: 6 total, 6 up, 6 in
Dec 06 10:14:53 np0005548788.localdomain sudo[311760]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:14:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:14:53 np0005548788.localdomain sudo[311810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:14:53 np0005548788.localdomain sudo[311810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:14:53 np0005548788.localdomain sudo[311810]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:53 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:14:53.703 2 INFO neutron.agent.securitygroups_rpc [None req-2bc0f0e9-228c-4272-bb0d-cc31a9019510 b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Security group member updated ['4c82b56e-0fc5-4c7f-8922-ceb8236815fd']
Dec 06 10:14:53 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:53.859 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:52Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68563a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68569a0>], id=feb6a13d-305a-4541-a50e-4988833ecf82, ip_allocation=immediate, mac_address=fa:16:3e:e5:ea:4a, name=tempest-parent-1146072664, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:25Z, description=, dns_domain=, id=45604602-bc87-4608-9881-9568cbf90870, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-802114316-network, port_security_enabled=True, project_id=9167331b2c424ef6961b096b551f8434, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=60021, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=449, status=ACTIVE, subnets=['8d94ac4b-ae25-428b-9b88-b56a3fd37a8e'], tags=[], tenant_id=9167331b2c424ef6961b096b551f8434, updated_at=2025-12-06T10:14:26Z, vlan_transparent=None, network_id=45604602-bc87-4608-9881-9568cbf90870, port_security_enabled=True, project_id=9167331b2c424ef6961b096b551f8434, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['4c82b56e-0fc5-4c7f-8922-ceb8236815fd'], standard_attr_id=627, status=DOWN, tags=[], tenant_id=9167331b2c424ef6961b096b551f8434, updated_at=2025-12-06T10:14:52Z on network 45604602-bc87-4608-9881-9568cbf90870
Dec 06 10:14:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:53.859 281009 DEBUG nova.compute.manager [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:53.968 281009 DEBUG oslo_concurrency.lockutils [None req-bbe899fe-6369-4b74-ad77-151b5357613d 83cd451c5dc442c78643ba7608b7134a 48c16b78edd4478f9497cfefc6873978 - - default default] Lock "3d34a856-7613-4158-b859-fb3089fe3bc7" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 6.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:54 np0005548788.localdomain ceph-mon[293643]: pgmap v111: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 221 op/s
Dec 06 10:14:54 np0005548788.localdomain ceph-mon[293643]: osdmap e108: 6 total, 6 up, 6 in
Dec 06 10:14:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:14:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:14:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:14:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:14:54 np0005548788.localdomain dnsmasq[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/addn_hosts - 2 addresses
Dec 06 10:14:54 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/host
Dec 06 10:14:54 np0005548788.localdomain podman[311845]: 2025-12-06 10:14:54.199902461 +0000 UTC m=+0.066063204 container kill f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:14:54 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/opts
Dec 06 10:14:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:54.526 262572 INFO neutron.agent.dhcp.agent [None req-c39effe4-0b16-4ca1-932e-7fed99986302 - - - - - -] DHCP configuration for ports {'feb6a13d-305a-4541-a50e-4988833ecf82'} is completed
Dec 06 10:14:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:55.424 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "3d34a856-7613-4158-b859-fb3089fe3bc7" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:55.426 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "3d34a856-7613-4158-b859-fb3089fe3bc7" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:55.426 281009 INFO nova.compute.manager [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Shelving
Dec 06 10:14:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:14:55.472 281009 DEBUG nova.virt.libvirt.driver [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 06 10:14:56 np0005548788.localdomain ceph-mon[293643]: pgmap v113: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 8.7 MiB/s rd, 5.8 MiB/s wr, 273 op/s
Dec 06 10:14:56 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:56.493 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:55Z, description=, device_id=110bfe4d-8dd3-4386-b8da-4c950d9b90e9, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6864ca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6864760>], id=36bbb516-3488-4eaa-b7bc-88d7750877d1, ip_allocation=immediate, mac_address=fa:16:3e:34:db:16, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:45Z, description=, dns_domain=, id=8c344e67-0482-40ce-b72c-ef9b65d68fbc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-1089920667-network, port_security_enabled=True, project_id=088ea8df069043ee8ed156bf735134b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55856, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=572, status=ACTIVE, subnets=['293a7165-4e70-4501-89e7-fc543cafb88d'], tags=[], tenant_id=088ea8df069043ee8ed156bf735134b7, updated_at=2025-12-06T10:14:46Z, vlan_transparent=None, network_id=8c344e67-0482-40ce-b72c-ef9b65d68fbc, port_security_enabled=False, project_id=088ea8df069043ee8ed156bf735134b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=639, status=DOWN, tags=[], tenant_id=088ea8df069043ee8ed156bf735134b7, updated_at=2025-12-06T10:14:55Z on network 8c344e67-0482-40ce-b72c-ef9b65d68fbc
Dec 06 10:14:56 np0005548788.localdomain systemd[1]: tmp-crun.VHsjXF.mount: Deactivated successfully.
Dec 06 10:14:56 np0005548788.localdomain podman[311882]: 2025-12-06 10:14:56.760900839 +0000 UTC m=+0.076939077 container kill 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:14:56 np0005548788.localdomain dnsmasq[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/addn_hosts - 1 addresses
Dec 06 10:14:56 np0005548788.localdomain dnsmasq-dhcp[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/host
Dec 06 10:14:56 np0005548788.localdomain dnsmasq-dhcp[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/opts
Dec 06 10:14:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:14:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:14:57 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:14:57.130 262572 INFO neutron.agent.dhcp.agent [None req-823322e5-129a-4b38-b070-8104ceba97dc - - - - - -] DHCP configuration for ports {'36bbb516-3488-4eaa-b7bc-88d7750877d1'} is completed
Dec 06 10:14:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:14:57 np0005548788.localdomain ceph-mon[293643]: pgmap v114: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 221 op/s
Dec 06 10:14:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:14:58 np0005548788.localdomain systemd[1]: tmp-crun.m8rSXs.mount: Deactivated successfully.
Dec 06 10:14:58 np0005548788.localdomain podman[311904]: 2025-12-06 10:14:58.286750861 +0000 UTC m=+0.105836810 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 06 10:14:58 np0005548788.localdomain podman[311904]: 2025-12-06 10:14:58.338510516 +0000 UTC m=+0.157596454 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:14:58 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:15:00 np0005548788.localdomain ceph-mon[293643]: pgmap v115: 177 pgs: 177 active+clean; 283 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 6.3 MiB/s wr, 256 op/s
Dec 06 10:15:01 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:01.805 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:55Z, description=, device_id=110bfe4d-8dd3-4386-b8da-4c950d9b90e9, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c688bf40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c688b430>], id=36bbb516-3488-4eaa-b7bc-88d7750877d1, ip_allocation=immediate, mac_address=fa:16:3e:34:db:16, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:45Z, description=, dns_domain=, id=8c344e67-0482-40ce-b72c-ef9b65d68fbc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-1089920667-network, port_security_enabled=True, project_id=088ea8df069043ee8ed156bf735134b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55856, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=572, status=ACTIVE, subnets=['293a7165-4e70-4501-89e7-fc543cafb88d'], tags=[], tenant_id=088ea8df069043ee8ed156bf735134b7, updated_at=2025-12-06T10:14:46Z, vlan_transparent=None, network_id=8c344e67-0482-40ce-b72c-ef9b65d68fbc, port_security_enabled=False, project_id=088ea8df069043ee8ed156bf735134b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=639, status=DOWN, tags=[], tenant_id=088ea8df069043ee8ed156bf735134b7, updated_at=2025-12-06T10:14:55Z on network 8c344e67-0482-40ce-b72c-ef9b65d68fbc
Dec 06 10:15:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e108 do_prune osdmap full prune enabled
Dec 06 10:15:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e109 e109: 6 total, 6 up, 6 in
Dec 06 10:15:01 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e109: 6 total, 6 up, 6 in
Dec 06 10:15:02 np0005548788.localdomain dnsmasq[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/addn_hosts - 1 addresses
Dec 06 10:15:02 np0005548788.localdomain dnsmasq-dhcp[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/host
Dec 06 10:15:02 np0005548788.localdomain podman[311944]: 2025-12-06 10:15:02.035288509 +0000 UTC m=+0.057170352 container kill 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:15:02 np0005548788.localdomain dnsmasq-dhcp[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/opts
Dec 06 10:15:02 np0005548788.localdomain ceph-mon[293643]: pgmap v116: 177 pgs: 177 active+clean; 283 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 6.3 MiB/s wr, 256 op/s
Dec 06 10:15:02 np0005548788.localdomain ceph-mon[293643]: osdmap e109: 6 total, 6 up, 6 in
Dec 06 10:15:02 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:02.309 262572 INFO neutron.agent.dhcp.agent [None req-6874599b-c284-415f-af48-00138c662b2c - - - - - -] DHCP configuration for ports {'36bbb516-3488-4eaa-b7bc-88d7750877d1'} is completed
Dec 06 10:15:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:15:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:15:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:15:03 np0005548788.localdomain podman[311967]: 2025-12-06 10:15:03.257925963 +0000 UTC m=+0.082902969 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:15:03 np0005548788.localdomain systemd[1]: tmp-crun.0YOHBP.mount: Deactivated successfully.
Dec 06 10:15:03 np0005548788.localdomain podman[311968]: 2025-12-06 10:15:03.333868666 +0000 UTC m=+0.151258839 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:15:03 np0005548788.localdomain podman[311967]: 2025-12-06 10:15:03.345565566 +0000 UTC m=+0.170542562 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:15:03 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:15:03 np0005548788.localdomain podman[311968]: 2025-12-06 10:15:03.398516913 +0000 UTC m=+0.215907086 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:15:03 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:15:03 np0005548788.localdomain podman[311966]: 2025-12-06 10:15:03.419009442 +0000 UTC m=+0.244528455 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:15:03 np0005548788.localdomain podman[311966]: 2025-12-06 10:15:03.436616004 +0000 UTC m=+0.262135017 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 06 10:15:03 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:15:04 np0005548788.localdomain ceph-mon[293643]: pgmap v118: 177 pgs: 177 active+clean; 304 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 2.6 MiB/s wr, 179 op/s
Dec 06 10:15:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:05.528 281009 DEBUG nova.virt.libvirt.driver [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Dec 06 10:15:06 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:06.020 2 INFO neutron.agent.securitygroups_rpc [None req-32f5fe5b-2f75-4cad-9292-d5acba05dc94 b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Security group member updated ['4c82b56e-0fc5-4c7f-8922-ceb8236815fd']
Dec 06 10:15:06 np0005548788.localdomain ceph-mon[293643]: pgmap v119: 177 pgs: 177 active+clean; 304 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 2.6 MiB/s wr, 178 op/s
Dec 06 10:15:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:07 np0005548788.localdomain systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 06 10:15:07 np0005548788.localdomain systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 13.158s CPU time.
Dec 06 10:15:07 np0005548788.localdomain systemd-machined[202859]: Machine qemu-2-instance-00000006 terminated.
Dec 06 10:15:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:07.909 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}de99ad19251b7510330745fc50be6cf57837aaa68a73b83dcceeadc2dda44da7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 06 10:15:07 np0005548788.localdomain ceph-mon[293643]: pgmap v120: 177 pgs: 177 active+clean; 304 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 2.6 MiB/s wr, 178 op/s
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.060 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 954 Content-Type: application/json Date: Sat, 06 Dec 2025 10:15:07 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-c77282b1-f6a1-4db7-ba2d-d9ff9cbeb7d9 x-openstack-request-id: req-c77282b1-f6a1-4db7-ba2d-d9ff9cbeb7d9 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.061 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "3b9dcd46-fa1b-4714-ba2b-665da2f67af6", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}]}, {"id": "72bdd1eb-059b-401d-8f8a-ec7c66937f24", "name": "m1.micro", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/72bdd1eb-059b-401d-8f8a-ec7c66937f24"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/72bdd1eb-059b-401d-8f8a-ec7c66937f24"}]}, {"id": "a0a7498e-22eb-495c-a2e3-89ba9e483bf6", "name": "m1.nano", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.061 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-c77282b1-f6a1-4db7-ba2d-d9ff9cbeb7d9 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.064 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}de99ad19251b7510330745fc50be6cf57837aaa68a73b83dcceeadc2dda44da7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.124 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 493 Content-Type: application/json Date: Sat, 06 Dec 2025 10:15:08 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d990df3c-6f66-4729-bb3c-c6acd0f5df22 x-openstack-request-id: req-d990df3c-6f66-4729-bb3c-c6acd0f5df22 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.124 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "a0a7498e-22eb-495c-a2e3-89ba9e483bf6", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.124 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6 used request id req-d990df3c-6f66-4729-bb3c-c6acd0f5df22 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.126 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3d34a856-7613-4158-b859-fb3089fe3bc7', 'name': 'tempest-UnshelveToHostMultiNodesTest-server-65395191', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '7141663c-a695-4147-a03d-20e8d4f67069'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'np0005548788.localdomain', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'c6d84801a8b44d9da497e9761a0cd10c', 'user_id': '496ca8bf29dc4e81ba0b08a592dc45d3', 'hostId': '43eba9c81c53b192889603146da935e657c7dfdbdbc03e888c9521e6', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.128 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.130 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.131 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.133 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.134 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.135 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.135 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-65395191>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-65395191>]
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.137 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.137 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.137 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-65395191>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-65395191>]
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.138 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.140 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.140 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.141 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.142 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.143 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.143 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-65395191>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-65395191>]
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.144 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.146 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.147 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.148 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.149 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.150 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.150 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-65395191>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-65395191>]
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.151 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.152 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.152 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.153 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.155 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.156 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:15:08 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:15:08.157 12 DEBUG ceilometer.compute.pollsters [-] Instance 3d34a856-7613-4158-b859-fb3089fe3bc7 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000006, id=3d34a856-7613-4158-b859-fb3089fe3bc7>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 10:15:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:08.544 281009 INFO nova.virt.libvirt.driver [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance shutdown successfully after 13 seconds.
Dec 06 10:15:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:08.551 281009 INFO nova.virt.libvirt.driver [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance destroyed successfully.
Dec 06 10:15:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:08.552 281009 DEBUG nova.objects.instance [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lazy-loading 'numa_topology' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:08.617 281009 INFO nova.virt.libvirt.driver [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Beginning cold snapshot process
Dec 06 10:15:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:08.817 281009 DEBUG nova.virt.libvirt.imagebackend [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] No parent info for 6a944ab6-8965-4055-b7fc-af6e395005ea; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Dec 06 10:15:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:08.845 281009 DEBUG nova.storage.rbd_utils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] creating snapshot(8edf00fad1b14e5a9e8076a1e0e93dcf) on rbd image(3d34a856-7613-4158-b859-fb3089fe3bc7_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec 06 10:15:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:15:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:15:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:15:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:15:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:15:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:15:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:15:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e109 do_prune osdmap full prune enabled
Dec 06 10:15:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e110 e110: 6 total, 6 up, 6 in
Dec 06 10:15:08 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e110: 6 total, 6 up, 6 in
Dec 06 10:15:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:09.088 281009 DEBUG nova.storage.rbd_utils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] cloning vms/3d34a856-7613-4158-b859-fb3089fe3bc7_disk@8edf00fad1b14e5a9e8076a1e0e93dcf to images/af540be2-bf52-4bff-b4bd-6dea5cca6542 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec 06 10:15:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:09.281 281009 DEBUG nova.storage.rbd_utils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] flattening images/af540be2-bf52-4bff-b4bd-6dea5cca6542 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Dec 06 10:15:10 np0005548788.localdomain ceph-mon[293643]: osdmap e110: 6 total, 6 up, 6 in
Dec 06 10:15:10 np0005548788.localdomain ceph-mon[293643]: pgmap v122: 177 pgs: 177 active+clean; 306 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 144 op/s
Dec 06 10:15:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:10.215 281009 DEBUG nova.storage.rbd_utils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] removing snapshot(8edf00fad1b14e5a9e8076a1e0e93dcf) on rbd image(3d34a856-7613-4158-b859-fb3089fe3bc7_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Dec 06 10:15:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e110 do_prune osdmap full prune enabled
Dec 06 10:15:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e111 e111: 6 total, 6 up, 6 in
Dec 06 10:15:11 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e111: 6 total, 6 up, 6 in
Dec 06 10:15:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/263018422' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2133323897' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:11.102 281009 DEBUG nova.storage.rbd_utils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] creating snapshot(snap) on rbd image(af540be2-bf52-4bff-b4bd-6dea5cca6542) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec 06 10:15:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e111 do_prune osdmap full prune enabled
Dec 06 10:15:12 np0005548788.localdomain ceph-mon[293643]: pgmap v124: 177 pgs: 177 active+clean; 306 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 793 KiB/s rd, 64 KiB/s wr, 70 op/s
Dec 06 10:15:12 np0005548788.localdomain ceph-mon[293643]: osdmap e111: 6 total, 6 up, 6 in
Dec 06 10:15:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e112 e112: 6 total, 6 up, 6 in
Dec 06 10:15:12 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e112: 6 total, 6 up, 6 in
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.066 281009 INFO nova.virt.libvirt.driver [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Snapshot image upload complete
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.067 281009 DEBUG nova.compute.manager [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:13 np0005548788.localdomain ceph-mon[293643]: osdmap e112: 6 total, 6 up, 6 in
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.120 281009 INFO nova.compute.manager [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Shelve offloading
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.129 281009 INFO nova.virt.libvirt.driver [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance destroyed successfully.
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.130 281009 DEBUG nova.compute.manager [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.133 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.133 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquired lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.133 281009 DEBUG nova.network.neutron [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.189 281009 DEBUG nova.network.neutron [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.403 281009 DEBUG nova.network.neutron [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.419 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Releasing lock "refresh_cache-3d34a856-7613-4158-b859-fb3089fe3bc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.427 281009 INFO nova.virt.libvirt.driver [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Instance destroyed successfully.
Dec 06 10:15:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:13.428 281009 DEBUG nova.objects.instance [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lazy-loading 'resources' on Instance uuid 3d34a856-7613-4158-b859-fb3089fe3bc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.046 281009 INFO nova.virt.libvirt.driver [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Deleting instance files /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7_del
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.047 281009 INFO nova.virt.libvirt.driver [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Deletion of /var/lib/nova/instances/3d34a856-7613-4158-b859-fb3089fe3bc7_del complete
Dec 06 10:15:14 np0005548788.localdomain ceph-mon[293643]: pgmap v126: 177 pgs: 177 active+clean; 387 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.9 MiB/s rd, 7.8 MiB/s wr, 266 op/s
Dec 06 10:15:14 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2629540065' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.161 281009 INFO nova.scheduler.client.report [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Deleted allocations for instance 3d34a856-7613-4158-b859-fb3089fe3bc7
Dec 06 10:15:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.230 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.230 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.263 281009 DEBUG oslo_concurrency.processutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:14 np0005548788.localdomain podman[312188]: 2025-12-06 10:15:14.275803601 +0000 UTC m=+0.098989273 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:15:14 np0005548788.localdomain podman[312188]: 2025-12-06 10:15:14.316671767 +0000 UTC m=+0.139857439 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:15:14 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:15:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/975209313' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.818 281009 DEBUG oslo_concurrency.processutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.555s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.826 281009 DEBUG nova.compute.provider_tree [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.872 281009 DEBUG nova.scheduler.client.report [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.897 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:14.948 281009 DEBUG oslo_concurrency.lockutils [None req-85ee99f9-e59f-458b-8752-fbff28611d91 496ca8bf29dc4e81ba0b08a592dc45d3 c6d84801a8b44d9da497e9761a0cd10c - - default default] Lock "3d34a856-7613-4158-b859-fb3089fe3bc7" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:15 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/975209313' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:16.077 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005548790.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:52Z, description=, device_id=ed40901b-0bfc-426a-bf70-48d87ce95aa6, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c686b1f0>], dns_domain=, dns_name=tempest-livemigrationtest-server-571789410, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c686b700>], id=feb6a13d-305a-4541-a50e-4988833ecf82, ip_allocation=immediate, mac_address=fa:16:3e:e5:ea:4a, name=tempest-parent-1146072664, network_id=45604602-bc87-4608-9881-9568cbf90870, port_security_enabled=True, project_id=9167331b2c424ef6961b096b551f8434, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['4c82b56e-0fc5-4c7f-8922-ceb8236815fd'], standard_attr_id=627, status=DOWN, tags=[], tenant_id=9167331b2c424ef6961b096b551f8434, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c686b220>], trunk_id=113740e8-6296-4106-ae10-22f16d519315, updated_at=2025-12-06T10:15:15Z on network 45604602-bc87-4608-9881-9568cbf90870
Dec 06 10:15:16 np0005548788.localdomain ceph-mon[293643]: pgmap v127: 177 pgs: 177 active+clean; 387 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.7 MiB/s wr, 171 op/s
Dec 06 10:15:16 np0005548788.localdomain dnsmasq[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/addn_hosts - 2 addresses
Dec 06 10:15:16 np0005548788.localdomain podman[312246]: 2025-12-06 10:15:16.29036863 +0000 UTC m=+0.057606940 container kill f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:15:16 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/host
Dec 06 10:15:16 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/opts
Dec 06 10:15:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:16.480 262572 INFO neutron.agent.dhcp.agent [None req-7dae34bd-6ba3-44f9-8e82-c4ae3be887bf - - - - - -] DHCP configuration for ports {'feb6a13d-305a-4541-a50e-4988833ecf82'} is completed
Dec 06 10:15:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e112 do_prune osdmap full prune enabled
Dec 06 10:15:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e113 e113: 6 total, 6 up, 6 in
Dec 06 10:15:17 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e113: 6 total, 6 up, 6 in
Dec 06 10:15:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:15:17 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2224318560' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:17 np0005548788.localdomain ceph-mon[293643]: osdmap e113: 6 total, 6 up, 6 in
Dec 06 10:15:17 np0005548788.localdomain podman[312267]: 2025-12-06 10:15:17.243270554 +0000 UTC m=+0.068345491 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:15:17 np0005548788.localdomain podman[312267]: 2025-12-06 10:15:17.281168999 +0000 UTC m=+0.106243976 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:15:17 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:15:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:17.548 2 INFO neutron.agent.securitygroups_rpc [None req-5e443fd1-82aa-48be-b4ff-976554ebf448 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Security group rule updated ['581a4637-eff2-45f4-92f3-d575b736a840']
Dec 06 10:15:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:17.774 2 INFO neutron.agent.securitygroups_rpc [None req-54187745-6fe9-48d8-bbb3-7e399880134e da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Security group rule updated ['581a4637-eff2-45f4-92f3-d575b736a840']
Dec 06 10:15:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:15:18 np0005548788.localdomain ceph-mon[293643]: pgmap v129: 177 pgs: 177 active+clean; 387 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.8 MiB/s wr, 173 op/s
Dec 06 10:15:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4212878909' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/212981679' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:18 np0005548788.localdomain podman[312290]: 2025-12-06 10:15:18.26100347 +0000 UTC m=+0.086295433 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 06 10:15:18 np0005548788.localdomain podman[312290]: 2025-12-06 10:15:18.297745869 +0000 UTC m=+0.123037822 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:15:18 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:15:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:15:19 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/412049606' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:19 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/412049606' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:15:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:15:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:15:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160394 "" "Go-http-client/1.1"
Dec 06 10:15:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:15:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20150 "" "Go-http-client/1.1"
Dec 06 10:15:19 np0005548788.localdomain dnsmasq[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/addn_hosts - 0 addresses
Dec 06 10:15:19 np0005548788.localdomain dnsmasq-dhcp[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/host
Dec 06 10:15:19 np0005548788.localdomain dnsmasq-dhcp[311507]: read /var/lib/neutron/dhcp/8c344e67-0482-40ce-b72c-ef9b65d68fbc/opts
Dec 06 10:15:19 np0005548788.localdomain podman[312324]: 2025-12-06 10:15:19.945478624 +0000 UTC m=+0.055036282 container kill 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:20 np0005548788.localdomain kernel: device tap42458b72-69 left promiscuous mode
Dec 06 10:15:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:20Z|00051|binding|INFO|Releasing lport 42458b72-69f7-437c-a516-669c4984e830 from this chassis (sb_readonly=0)
Dec 06 10:15:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:20Z|00052|binding|INFO|Setting lport 42458b72-69f7-437c-a516-669c4984e830 down in Southbound
Dec 06 10:15:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:20.114 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-8c344e67-0482-40ce-b72c-ef9b65d68fbc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c344e67-0482-40ce-b72c-ef9b65d68fbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '088ea8df069043ee8ed156bf735134b7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e980c4c-0409-4cbd-bd81-e2db3d5bcca5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=42458b72-69f7-437c-a516-669c4984e830) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:20.116 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 42458b72-69f7-437c-a516-669c4984e830 in datapath 8c344e67-0482-40ce-b72c-ef9b65d68fbc unbound from our chassis
Dec 06 10:15:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:20.120 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c344e67-0482-40ce-b72c-ef9b65d68fbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:20.122 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[081bc805-758f-495b-8cd5-765d838d23ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:20 np0005548788.localdomain ceph-mon[293643]: pgmap v130: 177 pgs: 177 active+clean; 352 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 9.0 MiB/s rd, 8.5 MiB/s wr, 280 op/s
Dec 06 10:15:20 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/46663597' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:20.871 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "89419cdc-1b37-4fdd-ad4b-013514e141a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:20.872 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:20.896 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 06 10:15:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:20.968 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:20.969 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:20.974 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 06 10:15:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:20.974 281009 INFO nova.compute.claims [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Claim successful on node np0005548788.localdomain
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.073 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e113 do_prune osdmap full prune enabled
Dec 06 10:15:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e114 e114: 6 total, 6 up, 6 in
Dec 06 10:15:21 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e114: 6 total, 6 up, 6 in
Dec 06 10:15:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1244880858' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.537 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.545 281009 DEBUG nova.compute.provider_tree [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.568 281009 DEBUG nova.scheduler.client.report [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.600 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.602 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.683 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.685 281009 DEBUG nova.network.neutron [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.703 281009 INFO nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.723 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.794 281009 WARNING oslo_policy.policy [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.794 281009 WARNING oslo_policy.policy [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.798 281009 DEBUG nova.policy [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'da7bbd24eb95438897585b10577ea2e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da995d8e002548889747013c0eeca935', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.838 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.840 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.841 281009 INFO nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Creating image(s)
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.881 281009 DEBUG nova.storage.rbd_utils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] rbd image 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.922 281009 DEBUG nova.storage.rbd_utils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] rbd image 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.963 281009 DEBUG nova.storage.rbd_utils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] rbd image 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:15:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:21.969 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.049 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.050 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "cb68b180567fda17719a7393615b2f958ad3226e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.051 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "cb68b180567fda17719a7393615b2f958ad3226e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.051 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "cb68b180567fda17719a7393615b2f958ad3226e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.094 281009 DEBUG nova.storage.rbd_utils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] rbd image 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.100 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:22 np0005548788.localdomain ceph-mon[293643]: pgmap v131: 177 pgs: 177 active+clean; 352 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 7.6 MiB/s wr, 250 op/s
Dec 06 10:15:22 np0005548788.localdomain ceph-mon[293643]: osdmap e114: 6 total, 6 up, 6 in
Dec 06 10:15:22 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1244880858' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.653 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/cb68b180567fda17719a7393615b2f958ad3226e 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:22 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:22.679 2 INFO neutron.agent.securitygroups_rpc [req-32f9c27c-7e39-487b-9f96-37ea07c2a545 req-64092713-96b8-4823-87de-00cf06a3e614 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Security group member updated ['581a4637-eff2-45f4-92f3-d575b736a840']
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.752 281009 DEBUG nova.storage.rbd_utils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] resizing rbd image 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.950 281009 DEBUG nova.objects.instance [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lazy-loading 'migration_context' on Instance uuid 89419cdc-1b37-4fdd-ad4b-013514e141a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.965 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.966 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Ensure instance console log exists: /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.967 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.967 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:22.968 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.010 281009 DEBUG nova.network.neutron [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Successfully created port: 22b2d742-fd5b-4bf4-898c-5da61dccc8af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 06 10:15:23 np0005548788.localdomain dnsmasq[311507]: exiting on receipt of SIGTERM
Dec 06 10:15:23 np0005548788.localdomain podman[312550]: 2025-12-06 10:15:23.015296591 +0000 UTC m=+0.057695424 container kill 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:15:23 np0005548788.localdomain systemd[1]: libpod-5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330.scope: Deactivated successfully.
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.068 281009 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765016108.0677137, 3d34a856-7613-4158-b859-fb3089fe3bc7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.069 281009 INFO nova.compute.manager [-] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] VM Stopped (Lifecycle Event)
Dec 06 10:15:23 np0005548788.localdomain podman[312565]: 2025-12-06 10:15:23.075957965 +0000 UTC m=+0.042558259 container died 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.089 281009 DEBUG nova.compute.manager [None req-9a9351b1-00b0-4d84-b0ad-2aac25365a71 - - - - - -] [instance: 3d34a856-7613-4158-b859-fb3089fe3bc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:23 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-439e3fd40f447976b905f7f7b88b6b6c3a05b830bbf38a0c7e415d883268a6b5-merged.mount: Deactivated successfully.
Dec 06 10:15:23 np0005548788.localdomain podman[312565]: 2025-12-06 10:15:23.133311178 +0000 UTC m=+0.099911452 container remove 5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c344e67-0482-40ce-b72c-ef9b65d68fbc, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:15:23 np0005548788.localdomain systemd[1]: libpod-conmon-5bcdefd210daf8e2a5669b053cb73ce9b8543b986898c6dfcb01044febe7d330.scope: Deactivated successfully.
Dec 06 10:15:23 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:23.178 262572 INFO neutron.agent.dhcp.agent [None req-2a95302a-93ea-4a3c-8f99-7e5340e228eb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:23 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:23.178 262572 INFO neutron.agent.dhcp.agent [None req-2a95302a-93ea-4a3c-8f99-7e5340e228eb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.894 281009 DEBUG nova.network.neutron [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Successfully updated port: 22b2d742-fd5b-4bf4-898c-5da61dccc8af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.917 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.918 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquired lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.918 281009 DEBUG nova.network.neutron [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 10:15:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:23.974 281009 DEBUG nova.network.neutron [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 06 10:15:24 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d8c344e67\x2d0482\x2d40ce\x2db72c\x2def9b65d68fbc.mount: Deactivated successfully.
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.060 281009 DEBUG nova.compute.manager [req-d024d82c-14ec-4c33-a1de-9fdc91fb9cb0 req-0dad440c-2135-4e07-aee2-072f12d9bb93 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received event network-changed-22b2d742-fd5b-4bf4-898c-5da61dccc8af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.061 281009 DEBUG nova.compute.manager [req-d024d82c-14ec-4c33-a1de-9fdc91fb9cb0 req-0dad440c-2135-4e07-aee2-072f12d9bb93 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Refreshing instance network info cache due to event network-changed-22b2d742-fd5b-4bf4-898c-5da61dccc8af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.061 281009 DEBUG oslo_concurrency.lockutils [req-d024d82c-14ec-4c33-a1de-9fdc91fb9cb0 req-0dad440c-2135-4e07-aee2-072f12d9bb93 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:24 np0005548788.localdomain ceph-mon[293643]: pgmap v133: 177 pgs: 177 active+clean; 399 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.4 MiB/s rd, 11 MiB/s wr, 345 op/s
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.515 281009 DEBUG nova.network.neutron [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Updating instance_info_cache with network_info: [{"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.560 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Releasing lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.561 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Instance network_info: |[{"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.561 281009 DEBUG oslo_concurrency.lockutils [req-d024d82c-14ec-4c33-a1de-9fdc91fb9cb0 req-0dad440c-2135-4e07-aee2-072f12d9bb93 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquired lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.561 281009 DEBUG nova.network.neutron [req-d024d82c-14ec-4c33-a1de-9fdc91fb9cb0 req-0dad440c-2135-4e07-aee2-072f12d9bb93 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Refreshing network info cache for port 22b2d742-fd5b-4bf4-898c-5da61dccc8af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.566 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Start _get_guest_xml network_info=[{"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:13:11Z,direct_url=<?>,disk_format='qcow2',id=6a944ab6-8965-4055-b7fc-af6e395005ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3d603431c0bb4967bafc7a0aa6108bfe',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:13:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'size': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.573 281009 WARNING nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.577 281009 DEBUG nova.virt.libvirt.host [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Searching host: 'np0005548788.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.577 281009 DEBUG nova.virt.libvirt.host [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.580 281009 DEBUG nova.virt.libvirt.host [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Searching host: 'np0005548788.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.581 281009 DEBUG nova.virt.libvirt.host [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.581 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.582 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:13:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a0a7498e-22eb-495c-a2e3-89ba9e483bf6',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:13:11Z,direct_url=<?>,disk_format='qcow2',id=6a944ab6-8965-4055-b7fc-af6e395005ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3d603431c0bb4967bafc7a0aa6108bfe',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-06T10:13:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.582 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.583 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.583 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.584 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.584 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.585 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.585 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.585 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.586 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.586 281009 DEBUG nova.virt.hardware [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 06 10:15:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:24.591 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:15:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2447503790' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.020 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.055 281009 DEBUG nova.storage.rbd_utils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] rbd image 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.060 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:25 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:25.087 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:23Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6854250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6854790>], id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62, ip_allocation=immediate, mac_address=fa:16:3e:0e:f5:37, name=tempest-parent-876689022, network_id=47d636a7-c520-4320-aa94-bfb41f418584, port_security_enabled=True, project_id=7897d6398eb64eb29c66df8db792e581, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=13, security_groups=['bfad329a-0ea3-4b02-8e91-9d15749f8c9b'], standard_attr_id=435, status=DOWN, tags=[], tenant_id=7897d6398eb64eb29c66df8db792e581, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6854760>], trunk_id=e769d97e-a772-49f9-b3b5-017ea160d521, updated_at=2025-12-06T10:15:24Z on network 47d636a7-c520-4320-aa94-bfb41f418584
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.252 281009 DEBUG nova.network.neutron [req-d024d82c-14ec-4c33-a1de-9fdc91fb9cb0 req-0dad440c-2135-4e07-aee2-072f12d9bb93 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Updated VIF entry in instance network info cache for port 22b2d742-fd5b-4bf4-898c-5da61dccc8af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.253 281009 DEBUG nova.network.neutron [req-d024d82c-14ec-4c33-a1de-9fdc91fb9cb0 req-0dad440c-2135-4e07-aee2-072f12d9bb93 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Updating instance_info_cache with network_info: [{"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.273 281009 DEBUG oslo_concurrency.lockutils [req-d024d82c-14ec-4c33-a1de-9fdc91fb9cb0 req-0dad440c-2135-4e07-aee2-072f12d9bb93 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Releasing lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:25 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1760790147' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:25 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2447503790' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:25 np0005548788.localdomain systemd[1]: tmp-crun.oA0mkP.mount: Deactivated successfully.
Dec 06 10:15:25 np0005548788.localdomain dnsmasq[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/addn_hosts - 2 addresses
Dec 06 10:15:25 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/host
Dec 06 10:15:25 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/opts
Dec 06 10:15:25 np0005548788.localdomain podman[312663]: 2025-12-06 10:15:25.352021841 +0000 UTC m=+0.073958444 container kill 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:15:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:15:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2463120775' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.515 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.518 281009 DEBUG nova.virt.libvirt.vif [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:15:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005548788.localdomain',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=9,image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOFbbRfjULcygMcOe9ruBCDT26FUItJ0QaOdMYjK7r+vTQZdzo5MJf7E5zeeYjA9Lq0uCGxe80r602PlTcDAghr7yHc2AbveusYZlzoK21BzQDiZ1oDD95ZIQiYc0Nj+wQ==',key_name='tempest-keypair-1299567895',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005548788.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548788.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da995d8e002548889747013c0eeca935',ramdisk_id='',reservation_id='r-25vgqb6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-1001740243',owner_user_name='tempest-ServersV294TestFqdnHostnames-1001740243-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:15:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='da7bbd24eb95438897585b10577ea2e0',uuid=89419cdc-1b37-4fdd-ad4b-013514e141a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.519 281009 DEBUG nova.network.os_vif_util [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Converting VIF {"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.521 281009 DEBUG nova.network.os_vif_util [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:0c:e8,bridge_name='br-int',has_traffic_filtering=True,id=22b2d742-fd5b-4bf4-898c-5da61dccc8af,network=Network(deb7774c-e96b-4e7f-88d7-ed9d740915f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b2d742-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.526 281009 DEBUG nova.objects.instance [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lazy-loading 'pci_devices' on Instance uuid 89419cdc-1b37-4fdd-ad4b-013514e141a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.541 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] End _get_guest_xml xml=<domain type="kvm">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <uuid>89419cdc-1b37-4fdd-ad4b-013514e141a9</uuid>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <name>instance-00000009</name>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <memory>131072</memory>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <vcpu>1</vcpu>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <metadata>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <nova:name>guest-instance-1</nova:name>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <nova:creationTime>2025-12-06 10:15:24</nova:creationTime>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <nova:flavor name="m1.nano">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <nova:memory>128</nova:memory>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <nova:disk>1</nova:disk>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <nova:swap>0</nova:swap>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <nova:ephemeral>0</nova:ephemeral>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <nova:vcpus>1</nova:vcpus>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       </nova:flavor>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <nova:owner>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <nova:user uuid="da7bbd24eb95438897585b10577ea2e0">tempest-ServersV294TestFqdnHostnames-1001740243-project-member</nova:user>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <nova:project uuid="da995d8e002548889747013c0eeca935">tempest-ServersV294TestFqdnHostnames-1001740243</nova:project>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       </nova:owner>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <nova:root type="image" uuid="6a944ab6-8965-4055-b7fc-af6e395005ea"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <nova:ports>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <nova:port uuid="22b2d742-fd5b-4bf4-898c-5da61dccc8af">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         </nova:port>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       </nova:ports>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </nova:instance>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   </metadata>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <sysinfo type="smbios">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <system>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <entry name="manufacturer">RDO</entry>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <entry name="product">OpenStack Compute</entry>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <entry name="serial">89419cdc-1b37-4fdd-ad4b-013514e141a9</entry>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <entry name="uuid">89419cdc-1b37-4fdd-ad4b-013514e141a9</entry>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <entry name="family">Virtual Machine</entry>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </system>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   </sysinfo>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <os>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <boot dev="hd"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <smbios mode="sysinfo"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   </os>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <features>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <acpi/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <apic/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <vmcoreinfo/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   </features>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <clock offset="utc">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <timer name="pit" tickpolicy="delay"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <timer name="hpet" present="no"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   </clock>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <cpu mode="host-model" match="exact">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <topology sockets="1" cores="1" threads="1"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   </cpu>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   <devices>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <disk type="network" device="disk">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <driver type="raw" cache="none"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <source protocol="rbd" name="vms/89419cdc-1b37-4fdd-ad4b-013514e141a9_disk">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.103" port="6789"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.104" port="6789"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.105" port="6789"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       </source>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <auth username="openstack">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <secret type="ceph" uuid="1939e851-b10c-5c3b-9bb7-8e7f380233e8"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       </auth>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <target dev="vda" bus="virtio"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <disk type="network" device="cdrom">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <driver type="raw" cache="none"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <source protocol="rbd" name="vms/89419cdc-1b37-4fdd-ad4b-013514e141a9_disk.config">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.103" port="6789"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.104" port="6789"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <host name="172.18.0.105" port="6789"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       </source>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <auth username="openstack">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:         <secret type="ceph" uuid="1939e851-b10c-5c3b-9bb7-8e7f380233e8"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       </auth>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <target dev="sda" bus="sata"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </disk>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <interface type="ethernet">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <mac address="fa:16:3e:df:0c:e8"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <model type="virtio"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <driver name="vhost" rx_queue_size="512"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <mtu size="1442"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <target dev="tap22b2d742-fd"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </interface>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <serial type="pty">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <log file="/var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9/console.log" append="off"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </serial>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <video>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <model type="virtio"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </video>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <input type="tablet" bus="usb"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <rng model="virtio">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <backend model="random">/dev/urandom</backend>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </rng>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <controller type="usb" index="0"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     <memballoon model="virtio">
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:       <stats period="10"/>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:     </memballoon>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:   </devices>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: </domain>
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.543 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Preparing to wait for external event network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.544 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.544 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.545 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.546 281009 DEBUG nova.virt.libvirt.vif [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:15:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005548788.localdomain',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=9,image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOFbbRfjULcygMcOe9ruBCDT26FUItJ0QaOdMYjK7r+vTQZdzo5MJf7E5zeeYjA9Lq0uCGxe80r602PlTcDAghr7yHc2AbveusYZlzoK21BzQDiZ1oDD95ZIQiYc0Nj+wQ==',key_name='tempest-keypair-1299567895',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005548788.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548788.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da995d8e002548889747013c0eeca935',ramdisk_id='',reservation_id='r-25vgqb6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-1001740243',owner_user_name='tempest-ServersV294TestFqdnHostnames-1001740243-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:15:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='da7bbd24eb95438897585b10577ea2e0',uuid=89419cdc-1b37-4fdd-ad4b-013514e141a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.547 281009 DEBUG nova.network.os_vif_util [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Converting VIF {"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.548 281009 DEBUG nova.network.os_vif_util [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:0c:e8,bridge_name='br-int',has_traffic_filtering=True,id=22b2d742-fd5b-4bf4-898c-5da61dccc8af,network=Network(deb7774c-e96b-4e7f-88d7-ed9d740915f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b2d742-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.549 281009 DEBUG os_vif [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:0c:e8,bridge_name='br-int',has_traffic_filtering=True,id=22b2d742-fd5b-4bf4-898c-5da61dccc8af,network=Network(deb7774c-e96b-4e7f-88d7-ed9d740915f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b2d742-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 06 10:15:25 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:25.594 262572 INFO neutron.agent.dhcp.agent [None req-f5593e92-5231-455e-bf1f-177b176a1c1b - - - - - -] DHCP configuration for ports {'e87832d3-ffc3-44e0-9f77-cd2eb6073d62'} is completed
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.634 281009 DEBUG ovsdbapp.backend.ovs_idl [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.634 281009 DEBUG ovsdbapp.backend.ovs_idl [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.635 281009 DEBUG ovsdbapp.backend.ovs_idl [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.635 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.636 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.636 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.637 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.638 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.642 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.657 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.658 281009 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.658 281009 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 10:15:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:25.659 281009 INFO oslo.privsep.daemon [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpgso4mnb5/privsep.sock']
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.293 281009 INFO oslo.privsep.daemon [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Spawned new privsep daemon via rootwrap
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.184 312692 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.187 312692 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.189 312692 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.189 312692 INFO oslo.privsep.daemon [-] privsep daemon running as pid 312692
Dec 06 10:15:26 np0005548788.localdomain ceph-mon[293643]: pgmap v134: 177 pgs: 177 active+clean; 399 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.4 MiB/s rd, 11 MiB/s wr, 343 op/s
Dec 06 10:15:26 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2463120775' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.580 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.581 281009 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22b2d742-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.582 281009 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap22b2d742-fd, col_values=(('external_ids', {'iface-id': '22b2d742-fd5b-4bf4-898c-5da61dccc8af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:0c:e8', 'vm-uuid': '89419cdc-1b37-4fdd-ad4b-013514e141a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.584 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.588 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.590 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.591 281009 INFO os_vif [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:0c:e8,bridge_name='br-int',has_traffic_filtering=True,id=22b2d742-fd5b-4bf4-898c-5da61dccc8af,network=Network(deb7774c-e96b-4e7f-88d7-ed9d740915f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b2d742-fd')
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.665 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.666 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.666 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] No VIF found with MAC fa:16:3e:df:0c:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.667 281009 INFO nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Using config drive
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.703 281009 DEBUG nova.storage.rbd_utils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] rbd image 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.876 281009 INFO nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Creating config drive at /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9/disk.config
Dec 06 10:15:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:26.886 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2frpkzx2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e114 do_prune osdmap full prune enabled
Dec 06 10:15:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:26.981 2 INFO neutron.agent.securitygroups_rpc [None req-4bf7090f-619c-441c-8a74-44ff051b2a47 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Security group member updated ['bfad329a-0ea3-4b02-8e91-9d15749f8c9b']
Dec 06 10:15:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 e115: 6 total, 6 up, 6 in
Dec 06 10:15:26 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e115: 6 total, 6 up, 6 in
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.019 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2frpkzx2" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.087 281009 DEBUG nova.storage.rbd_utils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] rbd image 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.094 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9/disk.config 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.301 281009 DEBUG oslo_concurrency.processutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9/disk.config 89419cdc-1b37-4fdd-ad4b-013514e141a9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.302 281009 INFO nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Deleting local config drive /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9/disk.config because it was imported into RBD.
Dec 06 10:15:27 np0005548788.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 06 10:15:27 np0005548788.localdomain kernel: device tap22b2d742-fd entered promiscuous mode
Dec 06 10:15:27 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016127.3783] manager: (tap22b2d742-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/18)
Dec 06 10:15:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:27Z|00053|binding|INFO|Claiming lport 22b2d742-fd5b-4bf4-898c-5da61dccc8af for this chassis.
Dec 06 10:15:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:27Z|00054|binding|INFO|22b2d742-fd5b-4bf4-898c-5da61dccc8af: Claiming fa:16:3e:df:0c:e8 10.100.0.7
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.379 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:27 np0005548788.localdomain systemd-udevd[312768]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.390 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:27 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016127.4030] device (tap22b2d742-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 10:15:27 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016127.4037] device (tap22b2d742-fd): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.402 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:0c:e8 10.100.0.7'], port_security=['fa:16:3e:df:0c:e8 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '89419cdc-1b37-4fdd-ad4b-013514e141a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da995d8e002548889747013c0eeca935', 'neutron:revision_number': '2', 'neutron:security_group_ids': '581a4637-eff2-45f4-92f3-d575b736a840', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cc41455-e125-49b5-8c35-a9f7e38c8e70, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=22b2d742-fd5b-4bf4-898c-5da61dccc8af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.405 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 22b2d742-fd5b-4bf4-898c-5da61dccc8af in datapath deb7774c-e96b-4e7f-88d7-ed9d740915f4 bound to our chassis
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.409 159620 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network deb7774c-e96b-4e7f-88d7-ed9d740915f4
Dec 06 10:15:27 np0005548788.localdomain systemd-machined[202859]: New machine qemu-3-instance-00000009.
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.440 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:27 np0005548788.localdomain systemd[1]: Started Virtual Machine qemu-3-instance-00000009.
Dec 06 10:15:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:27Z|00055|binding|INFO|Setting lport 22b2d742-fd5b-4bf4-898c-5da61dccc8af ovn-installed in OVS
Dec 06 10:15:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:27Z|00056|binding|INFO|Setting lport 22b2d742-fd5b-4bf4-898c-5da61dccc8af up in Southbound
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.461 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.756 281009 DEBUG nova.virt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Emitting event <LifecycleEvent: 1765016127.7553296, 89419cdc-1b37-4fdd-ad4b-013514e141a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.756 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] VM Started (Lifecycle Event)
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.784 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.789 281009 DEBUG nova.virt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Emitting event <LifecycleEvent: 1765016127.7554739, 89419cdc-1b37-4fdd-ad4b-013514e141a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.789 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] VM Paused (Lifecycle Event)
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.808 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[64a1df7b-4fed-419b-b99d-832a2577ee79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.809 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdeb7774c-e1 in ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.811 309209 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdeb7774c-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.811 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[89f6bf84-ea87-417d-accf-8819c5cd69d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.813 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[310fff03-96c4-4c6b-bd39-24a5efc5cdd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.821 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.826 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.839 159785 DEBUG oslo.privsep.daemon [-] privsep: reply[9b10730e-d9af-4e40-8bb9-fada9b9158cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.862 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[375fb693-5136-42fb-9d5d-b4fde6f8eb47]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:27.865 159620 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpc6ryaucd/privsep.sock']
Dec 06 10:15:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:27.950 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 06 10:15:27 np0005548788.localdomain ceph-mon[293643]: osdmap e115: 6 total, 6 up, 6 in
Dec 06 10:15:27 np0005548788.localdomain ceph-mon[293643]: pgmap v136: 177 pgs: 177 active+clean; 399 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 4.3 MiB/s rd, 8.5 MiB/s wr, 195 op/s
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.280 281009 DEBUG nova.compute.manager [req-d31cf036-a724-4e6b-86ce-201ad86927e3 req-01bfb46e-bbd7-4155-9bc5-caaa0d91d446 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received event network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.281 281009 DEBUG oslo_concurrency.lockutils [req-d31cf036-a724-4e6b-86ce-201ad86927e3 req-01bfb46e-bbd7-4155-9bc5-caaa0d91d446 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.281 281009 DEBUG oslo_concurrency.lockutils [req-d31cf036-a724-4e6b-86ce-201ad86927e3 req-01bfb46e-bbd7-4155-9bc5-caaa0d91d446 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.281 281009 DEBUG oslo_concurrency.lockutils [req-d31cf036-a724-4e6b-86ce-201ad86927e3 req-01bfb46e-bbd7-4155-9bc5-caaa0d91d446 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.282 281009 DEBUG nova.compute.manager [req-d31cf036-a724-4e6b-86ce-201ad86927e3 req-01bfb46e-bbd7-4155-9bc5-caaa0d91d446 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Processing event network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.283 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.286 281009 DEBUG nova.virt.driver [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] Emitting event <LifecycleEvent: 1765016128.2866282, 89419cdc-1b37-4fdd-ad4b-013514e141a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.287 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] VM Resumed (Lifecycle Event)
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.289 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.293 281009 INFO nova.virt.libvirt.driver [-] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Instance spawned successfully.
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.293 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.307 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.313 281009 DEBUG nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.319 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.320 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.320 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.321 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.322 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.322 281009 DEBUG nova.virt.libvirt.driver [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.333 281009 INFO nova.compute.manager [None req-1170b1b7-787e-4346-8f47-5c4d25bb2945 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.414 281009 INFO nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Took 6.58 seconds to spawn the instance on the hypervisor.
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.415 281009 DEBUG nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.472 281009 INFO nova.compute.manager [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Took 7.53 seconds to build instance.
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.488 281009 DEBUG oslo_concurrency.lockutils [None req-32f9c27c-7e39-487b-9f96-37ea07c2a545 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.540 159620 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.542 159620 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpc6ryaucd/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.430 312833 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.435 312833 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.439 312833 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.439 312833 INFO oslo.privsep.daemon [-] privsep daemon running as pid 312833
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.549 312833 DEBUG oslo.privsep.daemon [-] privsep: reply[7667d273-3404-4d3e-af75-904c3cc6b91d]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:28 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:28.683 2 INFO neutron.agent.securitygroups_rpc [None req-9fa949f8-0732-40f0-9fd9-bacbdfb578db ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Security group member updated ['bfad329a-0ea3-4b02-8e91-9d15749f8c9b']
Dec 06 10:15:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:28.901 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.901 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.990 312833 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.990 312833 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:28.990 312833 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:29 np0005548788.localdomain dnsmasq[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/addn_hosts - 1 addresses
Dec 06 10:15:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:15:29 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/host
Dec 06 10:15:29 np0005548788.localdomain podman[312855]: 2025-12-06 10:15:29.169886697 +0000 UTC m=+0.070256090 container kill 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:15:29 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/opts
Dec 06 10:15:29 np0005548788.localdomain podman[312867]: 2025-12-06 10:15:29.269501978 +0000 UTC m=+0.088810860 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:29 np0005548788.localdomain podman[312867]: 2025-12-06 10:15:29.341541492 +0000 UTC m=+0.160850294 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Dec 06 10:15:29 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:15:29 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:29.377 262572 INFO neutron.agent.linux.ip_lib [None req-9be662bf-2d20-4898-bcd4-8f0db5c8acf9 - - - - - -] Device tap14dccc5e-5a cannot be used as it has no MAC address
Dec 06 10:15:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:29.412 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:29 np0005548788.localdomain kernel: device tap14dccc5e-5a entered promiscuous mode
Dec 06 10:15:29 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016129.4256] manager: (tap14dccc5e-5a): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Dec 06 10:15:29 np0005548788.localdomain systemd-udevd[312766]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:29Z|00057|binding|INFO|Claiming lport 14dccc5e-5a3b-4c3d-b511-1adf5a4491e7 for this chassis.
Dec 06 10:15:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:29Z|00058|binding|INFO|14dccc5e-5a3b-4c3d-b511-1adf5a4491e7: Claiming unknown
Dec 06 10:15:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:29.427 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.438 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-5b56eac2-b5bc-4fdd-ab7f-704684288162', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b56eac2-b5bc-4fdd-ab7f-704684288162', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '47cdc09a0edc4e90a8790944545a6c24', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa1dc569-e77b-4fe3-aa3e-e067b39105f5, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=14dccc5e-5a3b-4c3d-b511-1adf5a4491e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:29 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14dccc5e-5a: No such device
Dec 06 10:15:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:29Z|00059|binding|INFO|Setting lport 14dccc5e-5a3b-4c3d-b511-1adf5a4491e7 ovn-installed in OVS
Dec 06 10:15:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:29Z|00060|binding|INFO|Setting lport 14dccc5e-5a3b-4c3d-b511-1adf5a4491e7 up in Southbound
Dec 06 10:15:29 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14dccc5e-5a: No such device
Dec 06 10:15:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:29.463 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:29.464 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:29 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14dccc5e-5a: No such device
Dec 06 10:15:29 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14dccc5e-5a: No such device
Dec 06 10:15:29 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14dccc5e-5a: No such device
Dec 06 10:15:29 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14dccc5e-5a: No such device
Dec 06 10:15:29 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14dccc5e-5a: No such device
Dec 06 10:15:29 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap14dccc5e-5a: No such device
Dec 06 10:15:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:29.539 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:29.570 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.589 312833 DEBUG oslo.privsep.daemon [-] privsep: reply[25c23e5b-2402-4e0c-aa12-a1d82893a391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.637 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a18c0e-1e6a-4dd9-ad2e-e8a5cc886098]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016129.6414] manager: (tapdeb7774c-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/20)
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.674 312833 DEBUG oslo.privsep.daemon [-] privsep: reply[991edddd-7249-4517-a632-68f7950ae287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.680 312833 DEBUG oslo.privsep.daemon [-] privsep: reply[7037aba4-8013-425e-bc84-e67c2c0c38c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapdeb7774c-e1: link becomes ready
Dec 06 10:15:29 np0005548788.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapdeb7774c-e0: link becomes ready
Dec 06 10:15:29 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016129.7105] device (tapdeb7774c-e0): carrier: link connected
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.717 312833 DEBUG oslo.privsep.daemon [-] privsep: reply[a153bb40-564e-4c9e-ab8f-aacd7e5ff07c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.745 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[f4efafb9-642f-4334-b8c9-f5873a0ef18f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdeb7774c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d8:09:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1254982, 'reachable_time': 41763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312961, 'error': None, 'target': 'ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.768 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[cf522017-87fa-4017-b498-a2b860430896]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:9cd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1254982, 'tstamp': 1254982}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312963, 'error': None, 'target': 'ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.791 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[4ac4cf94-b058-428c-bb8a-9177113a5c52]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdeb7774c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d8:09:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1254982, 'reachable_time': 41763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312964, 'error': None, 'target': 'ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.832 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[fb79d467-1015-4c42-92ef-d99eea9f4437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.903 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[ab07ec63-b42b-48bd-901c-ad78061808fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.906 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdeb7774c-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.907 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.908 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdeb7774c-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:29 np0005548788.localdomain kernel: device tapdeb7774c-e0 entered promiscuous mode
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.917 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdeb7774c-e0, col_values=(('external_ids', {'iface-id': '431aeba8-5962-4449-b69d-46c4360741a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:29Z|00061|binding|INFO|Releasing lport 431aeba8-5962-4449-b69d-46c4360741a7 from this chassis (sb_readonly=0)
Dec 06 10:15:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:29.912 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.931 159620 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/deb7774c-e96b-4e7f-88d7-ed9d740915f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/deb7774c-e96b-4e7f-88d7-ed9d740915f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.933 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[fd00892f-d71a-4125-a6b8-36d1b38c5a9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.935 159620 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: global
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     log         /dev/log local0 debug
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     log-tag     haproxy-metadata-proxy-deb7774c-e96b-4e7f-88d7-ed9d740915f4
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     user        root
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     group       root
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     maxconn     1024
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     pidfile     /var/lib/neutron/external/pids/deb7774c-e96b-4e7f-88d7-ed9d740915f4.pid.haproxy
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     daemon
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: defaults
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     log global
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     mode http
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     option httplog
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     option dontlognull
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     option http-server-close
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     option forwardfor
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     retries                 3
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     timeout http-request    30s
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     timeout connect         30s
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     timeout client          32s
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     timeout server          32s
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     timeout http-keep-alive 30s
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: listen listener
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     bind 169.254.169.254:80
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     server metadata /var/lib/neutron/metadata_proxy
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:     http-request add-header X-OVN-Network-ID deb7774c-e96b-4e7f-88d7-ed9d740915f4
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 06 10:15:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:29.937 159620 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'env', 'PROCESS_TAG=haproxy-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/deb7774c-e96b-4e7f-88d7-ed9d740915f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 06 10:15:30 np0005548788.localdomain ceph-mon[293643]: pgmap v137: 177 pgs: 177 active+clean; 238 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 8.5 MiB/s wr, 336 op/s
Dec 06 10:15:30 np0005548788.localdomain dnsmasq[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/addn_hosts - 0 addresses
Dec 06 10:15:30 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/host
Dec 06 10:15:30 np0005548788.localdomain dnsmasq-dhcp[309788]: read /var/lib/neutron/dhcp/47d636a7-c520-4320-aa94-bfb41f418584/opts
Dec 06 10:15:30 np0005548788.localdomain podman[313003]: 2025-12-06 10:15:30.236505734 +0000 UTC m=+0.064951566 container kill 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.330 281009 DEBUG nova.compute.manager [req-c01e3dc5-c1f3-47ec-b7bb-687aacf3c504 req-fac5ffde-ad87-49d4-9392-e39f28fe3939 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received event network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.331 281009 DEBUG oslo_concurrency.lockutils [req-c01e3dc5-c1f3-47ec-b7bb-687aacf3c504 req-fac5ffde-ad87-49d4-9392-e39f28fe3939 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.332 281009 DEBUG oslo_concurrency.lockutils [req-c01e3dc5-c1f3-47ec-b7bb-687aacf3c504 req-fac5ffde-ad87-49d4-9392-e39f28fe3939 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.332 281009 DEBUG oslo_concurrency.lockutils [req-c01e3dc5-c1f3-47ec-b7bb-687aacf3c504 req-fac5ffde-ad87-49d4-9392-e39f28fe3939 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.333 281009 DEBUG nova.compute.manager [req-c01e3dc5-c1f3-47ec-b7bb-687aacf3c504 req-fac5ffde-ad87-49d4-9392-e39f28fe3939 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] No waiting events found dispatching network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.333 281009 WARNING nova.compute.manager [req-c01e3dc5-c1f3-47ec-b7bb-687aacf3c504 req-fac5ffde-ad87-49d4-9392-e39f28fe3939 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received unexpected event network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af for instance with vm_state active and task_state None.
Dec 06 10:15:30 np0005548788.localdomain podman[313041]: 
Dec 06 10:15:30 np0005548788.localdomain podman[313041]: 2025-12-06 10:15:30.440362069 +0000 UTC m=+0.127364624 container create 5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:15:30 np0005548788.localdomain systemd[1]: Started libpod-conmon-5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e.scope.
Dec 06 10:15:30 np0005548788.localdomain podman[313041]: 2025-12-06 10:15:30.39414997 +0000 UTC m=+0.081152585 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.565 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:30 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:30 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28d66a5ffc6b19c86ac80f6c23e0880e0c18b611c2125fb44a1ef881841a0b2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:30 np0005548788.localdomain podman[313041]: 2025-12-06 10:15:30.586909502 +0000 UTC m=+0.273912057 container init 5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.598 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:30 np0005548788.localdomain podman[313041]: 2025-12-06 10:15:30.600617984 +0000 UTC m=+0.287620539 container start 5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:30 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:30Z|00062|binding|INFO|Releasing lport 431aeba8-5962-4449-b69d-46c4360741a7 from this chassis (sb_readonly=0)
Dec 06 10:15:30 np0005548788.localdomain neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4[313075]: [NOTICE]   (313086) : New worker (313094) forked
Dec 06 10:15:30 np0005548788.localdomain neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4[313075]: [NOTICE]   (313086) : Loading success.
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.626 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.690 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.691 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 14dccc5e-5a3b-4c3d-b511-1adf5a4491e7 in datapath 5b56eac2-b5bc-4fdd-ab7f-704684288162 unbound from our chassis
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.693 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8824a4a3-1549-4c3e-82ba-de34c9855071 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.693 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b56eac2-b5bc-4fdd-ab7f-704684288162, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.694 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[5ddd9aab-4047-45c0-8c81-07739dfd398b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:30 np0005548788.localdomain podman[313085]: 
Dec 06 10:15:30 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:30Z|00063|binding|INFO|Releasing lport 14d978a5-89e5-4bec-87c4-0261c015709d from this chassis (sb_readonly=0)
Dec 06 10:15:30 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:30Z|00064|binding|INFO|Setting lport 14d978a5-89e5-4bec-87c4-0261c015709d down in Southbound
Dec 06 10:15:30 np0005548788.localdomain kernel: device tap14d978a5-89 left promiscuous mode
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.745 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.755 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-47d636a7-c520-4320-aa94-bfb41f418584', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47d636a7-c520-4320-aa94-bfb41f418584', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7897d6398eb64eb29c66df8db792e581', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6898c302-0153-460c-9cb1-4c62ebc9ff31, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=14d978a5-89e5-4bec-87c4-0261c015709d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:30 np0005548788.localdomain podman[313085]: 2025-12-06 10:15:30.756662799 +0000 UTC m=+0.146036378 container create 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.756 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 14d978a5-89e5-4bec-87c4-0261c015709d in datapath 47d636a7-c520-4320-aa94-bfb41f418584 unbound from our chassis
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.758 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47d636a7-c520-4320-aa94-bfb41f418584, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:30.759 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[78c43466-8c52-459d-9852-8452bec6f659]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:30.771 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:30 np0005548788.localdomain podman[313085]: 2025-12-06 10:15:30.688055612 +0000 UTC m=+0.077429261 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:15:30 np0005548788.localdomain systemd[1]: Started libpod-conmon-19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2.scope.
Dec 06 10:15:30 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:30 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895af9c7de1f357be93a099708275929c3eeb61d37c8ffdb41cd22b32f6ac895/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:30 np0005548788.localdomain podman[313085]: 2025-12-06 10:15:30.834318476 +0000 UTC m=+0.223692065 container init 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:30 np0005548788.localdomain podman[313085]: 2025-12-06 10:15:30.844014634 +0000 UTC m=+0.233388223 container start 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:30 np0005548788.localdomain dnsmasq[313115]: started, version 2.85 cachesize 150
Dec 06 10:15:30 np0005548788.localdomain dnsmasq[313115]: DNS service limited to local subnets
Dec 06 10:15:30 np0005548788.localdomain dnsmasq[313115]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:15:30 np0005548788.localdomain dnsmasq[313115]: warning: no upstream servers configured
Dec 06 10:15:30 np0005548788.localdomain dnsmasq-dhcp[313115]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:15:30 np0005548788.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/addn_hosts - 0 addresses
Dec 06 10:15:30 np0005548788.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/host
Dec 06 10:15:30 np0005548788.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/opts
Dec 06 10:15:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:31.004 262572 INFO neutron.agent.dhcp.agent [None req-8aaddb77-7739-4274-9007-0f11f10d3763 - - - - - -] DHCP configuration for ports {'cd4b85dc-43c3-4a8f-b202-a93068ac6450'} is completed
Dec 06 10:15:31 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:31Z|00065|memory|INFO|peak resident set size grew 62% in last 2367.9 seconds, from 15040 kB to 24340 kB
Dec 06 10:15:31 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:31Z|00066|memory|INFO|idl-cells-OVN_Southbound:11526 idl-cells-Open_vSwitch:1212 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:392 lflow-cache-entries-cache-matches:287 lflow-cache-size-KB:1494 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:691 ofctrl_installed_flow_usage-KB:506 ofctrl_sb_flow_ref_usage-KB:257
Dec 06 10:15:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:31.585 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:31.599 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:30Z, description=, device_id=af0f743c-b34f-4641-9bca-6f879d4af6de, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6881ca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6899b20>], id=37cba1c8-c4ae-4977-8c18-d42db9accf2f, ip_allocation=immediate, mac_address=fa:16:3e:cb:56:b9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:26Z, description=, dns_domain=, id=5b56eac2-b5bc-4fdd-ab7f-704684288162, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-194938257-network, port_security_enabled=True, project_id=47cdc09a0edc4e90a8790944545a6c24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12113, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=742, status=ACTIVE, subnets=['27b56f7c-d6aa-43fd-8e03-02fc9510b288'], tags=[], tenant_id=47cdc09a0edc4e90a8790944545a6c24, updated_at=2025-12-06T10:15:27Z, vlan_transparent=None, network_id=5b56eac2-b5bc-4fdd-ab7f-704684288162, port_security_enabled=False, project_id=47cdc09a0edc4e90a8790944545a6c24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=777, status=DOWN, tags=[], tenant_id=47cdc09a0edc4e90a8790944545a6c24, updated_at=2025-12-06T10:15:31Z on network 5b56eac2-b5bc-4fdd-ab7f-704684288162
Dec 06 10:15:31 np0005548788.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/addn_hosts - 1 addresses
Dec 06 10:15:31 np0005548788.localdomain podman[313133]: 2025-12-06 10:15:31.81362582 +0000 UTC m=+0.047387947 container kill 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:15:31 np0005548788.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/host
Dec 06 10:15:31 np0005548788.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/opts
Dec 06 10:15:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:32 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:32.065 262572 INFO neutron.agent.dhcp.agent [None req-0496baea-52d6-490a-b894-0b1970e23635 - - - - - -] DHCP configuration for ports {'37cba1c8-c4ae-4977-8c18-d42db9accf2f'} is completed
Dec 06 10:15:32 np0005548788.localdomain ceph-mon[293643]: pgmap v138: 177 pgs: 177 active+clean; 238 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 4.5 MiB/s rd, 7.0 MiB/s wr, 275 op/s
Dec 06 10:15:32 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/936145217' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:32.392 281009 DEBUG nova.compute.manager [req-e156d46e-dc16-481a-b481-de893f1067a8 req-4b618e81-422d-48b0-a2bb-e671b0845436 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received event network-changed-22b2d742-fd5b-4bf4-898c-5da61dccc8af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:32.393 281009 DEBUG nova.compute.manager [req-e156d46e-dc16-481a-b481-de893f1067a8 req-4b618e81-422d-48b0-a2bb-e671b0845436 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Refreshing instance network info cache due to event network-changed-22b2d742-fd5b-4bf4-898c-5da61dccc8af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 06 10:15:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:32.393 281009 DEBUG oslo_concurrency.lockutils [req-e156d46e-dc16-481a-b481-de893f1067a8 req-4b618e81-422d-48b0-a2bb-e671b0845436 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:32.394 281009 DEBUG oslo_concurrency.lockutils [req-e156d46e-dc16-481a-b481-de893f1067a8 req-4b618e81-422d-48b0-a2bb-e671b0845436 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquired lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:32.394 281009 DEBUG nova.network.neutron [req-e156d46e-dc16-481a-b481-de893f1067a8 req-4b618e81-422d-48b0-a2bb-e671b0845436 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Refreshing network info cache for port 22b2d742-fd5b-4bf4-898c-5da61dccc8af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 06 10:15:33 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/843309173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:33.525 281009 DEBUG nova.network.neutron [req-e156d46e-dc16-481a-b481-de893f1067a8 req-4b618e81-422d-48b0-a2bb-e671b0845436 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Updated VIF entry in instance network info cache for port 22b2d742-fd5b-4bf4-898c-5da61dccc8af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 06 10:15:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:33.526 281009 DEBUG nova.network.neutron [req-e156d46e-dc16-481a-b481-de893f1067a8 req-4b618e81-422d-48b0-a2bb-e671b0845436 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Updating instance_info_cache with network_info: [{"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:33.555 281009 DEBUG oslo_concurrency.lockutils [req-e156d46e-dc16-481a-b481-de893f1067a8 req-4b618e81-422d-48b0-a2bb-e671b0845436 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Releasing lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:33 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:33.591 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:30Z, description=, device_id=af0f743c-b34f-4641-9bca-6f879d4af6de, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67ed460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67edac0>], id=37cba1c8-c4ae-4977-8c18-d42db9accf2f, ip_allocation=immediate, mac_address=fa:16:3e:cb:56:b9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:26Z, description=, dns_domain=, id=5b56eac2-b5bc-4fdd-ab7f-704684288162, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-194938257-network, port_security_enabled=True, project_id=47cdc09a0edc4e90a8790944545a6c24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12113, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=742, status=ACTIVE, subnets=['27b56f7c-d6aa-43fd-8e03-02fc9510b288'], tags=[], tenant_id=47cdc09a0edc4e90a8790944545a6c24, updated_at=2025-12-06T10:15:27Z, vlan_transparent=None, network_id=5b56eac2-b5bc-4fdd-ab7f-704684288162, port_security_enabled=False, project_id=47cdc09a0edc4e90a8790944545a6c24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=777, status=DOWN, tags=[], tenant_id=47cdc09a0edc4e90a8790944545a6c24, updated_at=2025-12-06T10:15:31Z on network 5b56eac2-b5bc-4fdd-ab7f-704684288162
Dec 06 10:15:33 np0005548788.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/addn_hosts - 1 addresses
Dec 06 10:15:33 np0005548788.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/host
Dec 06 10:15:33 np0005548788.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/opts
Dec 06 10:15:33 np0005548788.localdomain podman[313172]: 2025-12-06 10:15:33.823524287 +0000 UTC m=+0.050515364 container kill 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:15:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:15:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:15:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:15:33 np0005548788.localdomain podman[313187]: 2025-12-06 10:15:33.917750842 +0000 UTC m=+0.067604809 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:15:33 np0005548788.localdomain podman[313188]: 2025-12-06 10:15:33.978375345 +0000 UTC m=+0.116094848 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:15:33 np0005548788.localdomain podman[313188]: 2025-12-06 10:15:33.98927763 +0000 UTC m=+0.126997143 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., 
url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:15:33 np0005548788.localdomain podman[313187]: 2025-12-06 10:15:33.996794932 +0000 UTC m=+0.146648899 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:15:34 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:15:34 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:15:34 np0005548788.localdomain podman[313186]: 2025-12-06 10:15:33.957401511 +0000 UTC m=+0.105277297 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:34 np0005548788.localdomain podman[313186]: 2025-12-06 10:15:34.037125961 +0000 UTC m=+0.185001807 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:15:34 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:15:34 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:34.120 262572 INFO neutron.agent.dhcp.agent [None req-3d9e34d7-d138-45f5-b2e5-65217decc3f3 - - - - - -] DHCP configuration for ports {'37cba1c8-c4ae-4977-8c18-d42db9accf2f'} is completed
Dec 06 10:15:34 np0005548788.localdomain ceph-mon[293643]: pgmap v139: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 4.6 MiB/s rd, 20 KiB/s wr, 234 op/s
Dec 06 10:15:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:34.605 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:35 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:35Z|00067|binding|INFO|Releasing lport 431aeba8-5962-4449-b69d-46c4360741a7 from this chassis (sb_readonly=0)
Dec 06 10:15:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:35.487 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:36.081 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:36.082 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:36.082 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:15:36 np0005548788.localdomain ceph-mon[293643]: pgmap v140: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 4.6 MiB/s rd, 20 KiB/s wr, 234 op/s
Dec 06 10:15:36 np0005548788.localdomain systemd[1]: tmp-crun.BpP7RF.mount: Deactivated successfully.
Dec 06 10:15:36 np0005548788.localdomain dnsmasq[309788]: exiting on receipt of SIGTERM
Dec 06 10:15:36 np0005548788.localdomain systemd[1]: libpod-5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858.scope: Deactivated successfully.
Dec 06 10:15:36 np0005548788.localdomain podman[313268]: 2025-12-06 10:15:36.4877514 +0000 UTC m=+0.072129317 container kill 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:15:36 np0005548788.localdomain podman[313280]: 2025-12-06 10:15:36.563065955 +0000 UTC m=+0.058075286 container died 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:15:36 np0005548788.localdomain systemd[1]: tmp-crun.y7htI5.mount: Deactivated successfully.
Dec 06 10:15:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:36.587 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:15:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 3093 writes, 26K keys, 3093 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.08 MB/s
                                                           Cumulative WAL: 3093 writes, 3093 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 3093 writes, 26K keys, 3093 commit groups, 1.0 writes per commit group, ingest: 49.39 MB, 0.08 MB/s
                                                           Interval WAL: 3093 writes, 3093 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    147.7      0.25              0.09        12    0.021       0      0       0.0       0.0
                                                             L6      1/0   18.39 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   5.1    178.2    162.3      1.15              0.52        11    0.104    129K   5599       0.0       0.0
                                                            Sum      1/0   18.39 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   6.1    146.5    159.7      1.39              0.61        23    0.061    129K   5599       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   6.1    146.7    159.9      1.39              0.61        22    0.063    129K   5599       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    178.2    162.3      1.15              0.52        11    0.104    129K   5599       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    149.1      0.25              0.09        11    0.022       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.036, interval 0.036
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.22 GB write, 0.37 MB/s write, 0.20 GB read, 0.34 MB/s read, 1.4 seconds
                                                           Interval compaction: 0.22 GB write, 0.37 MB/s write, 0.20 GB read, 0.34 MB/s read, 1.4 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x564f1a09b350#2 capacity: 308.00 MB usage: 48.24 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000521 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(3336,47.39 MB,15.3875%) FilterBlock(23,375.05 KB,0.118915%) IndexBlock(23,494.45 KB,0.156774%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:15:36 np0005548788.localdomain podman[313280]: 2025-12-06 10:15:36.609913444 +0000 UTC m=+0.104922715 container cleanup 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:15:36 np0005548788.localdomain systemd[1]: libpod-conmon-5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858.scope: Deactivated successfully.
Dec 06 10:15:36 np0005548788.localdomain podman[313287]: 2025-12-06 10:15:36.63353624 +0000 UTC m=+0.113349424 container remove 5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:15:36 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:36.667 262572 INFO neutron.agent.dhcp.agent [None req-05df6b0e-0627-4dc8-bc88-e4d72f6a263e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:36.693 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:36 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:36.838 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:37.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:37.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-2febf276f78f79b0ded7a394bcb796cf54cbd4f8cda2019ce79a07034472eab4-merged.mount: Deactivated successfully.
Dec 06 10:15:37 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5682867e956dd47f48d4a6b2abb4a12883b9a1dafbc9e65f177aaef0518da858-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:37 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d47d636a7\x2dc520\x2d4320\x2daa94\x2dbfb41f418584.mount: Deactivated successfully.
Dec 06 10:15:38 np0005548788.localdomain ceph-mon[293643]: pgmap v141: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 4.6 MiB/s rd, 20 KiB/s wr, 232 op/s
Dec 06 10:15:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:15:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:15:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:15:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:15:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:15:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:15:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:15:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:15:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:15:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2236174487' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:15:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:15:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2236174487' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:15:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2236174487' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:15:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2236174487' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:15:39 np0005548788.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/addn_hosts - 0 addresses
Dec 06 10:15:39 np0005548788.localdomain podman[313327]: 2025-12-06 10:15:39.264902324 +0000 UTC m=+0.043961221 container kill 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:15:39 np0005548788.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/host
Dec 06 10:15:39 np0005548788.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/5b56eac2-b5bc-4fdd-ab7f-704684288162/opts
Dec 06 10:15:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:39.421 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:39 np0005548788.localdomain kernel: device tap14dccc5e-5a left promiscuous mode
Dec 06 10:15:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:39Z|00068|binding|INFO|Releasing lport 14dccc5e-5a3b-4c3d-b511-1adf5a4491e7 from this chassis (sb_readonly=0)
Dec 06 10:15:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:39Z|00069|binding|INFO|Setting lport 14dccc5e-5a3b-4c3d-b511-1adf5a4491e7 down in Southbound
Dec 06 10:15:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:39.432 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-5b56eac2-b5bc-4fdd-ab7f-704684288162', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b56eac2-b5bc-4fdd-ab7f-704684288162', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '47cdc09a0edc4e90a8790944545a6c24', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa1dc569-e77b-4fe3-aa3e-e067b39105f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=14dccc5e-5a3b-4c3d-b511-1adf5a4491e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:39.434 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 14dccc5e-5a3b-4c3d-b511-1adf5a4491e7 in datapath 5b56eac2-b5bc-4fdd-ab7f-704684288162 unbound from our chassis
Dec 06 10:15:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:39.437 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b56eac2-b5bc-4fdd-ab7f-704684288162, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:39.437 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[1dbe1e31-f498-4d22-a3fb-cbfb1ff2a6dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:39.446 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:39.447 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:39 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:39.489 2 INFO neutron.agent.securitygroups_rpc [None req-e5d6490d-2b46-4f4e-92e1-5479a93607f8 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:15:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:39.608 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:40.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:40.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:40 np0005548788.localdomain ceph-mon[293643]: pgmap v142: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 17 KiB/s wr, 200 op/s
Dec 06 10:15:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1736992612' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:40 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:40.227 2 INFO neutron.agent.securitygroups_rpc [None req-806a1120-e80b-4f72-b62c-6adbb0e69b26 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:15:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2671465232' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.085 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.086 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquired lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.086 281009 DEBUG nova.network.neutron [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.087 281009 DEBUG nova.objects.instance [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 89419cdc-1b37-4fdd-ad4b-013514e141a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:41Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:df:0c:e8 10.100.0.7
Dec 06 10:15:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:41Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:0c:e8 10.100.0.7
Dec 06 10:15:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:41Z|00070|binding|INFO|Releasing lport 431aeba8-5962-4449-b69d-46c4360741a7 from this chassis (sb_readonly=0)
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.541 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.589 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.659 281009 DEBUG nova.network.neutron [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Updating instance_info_cache with network_info: [{"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.676 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Releasing lock "refresh_cache-89419cdc-1b37-4fdd-ad4b-013514e141a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.676 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.677 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.696 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.697 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.697 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.698 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:15:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:41.698 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:42 np0005548788.localdomain ceph-mon[293643]: pgmap v143: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 3.0 MiB/s rd, 106 op/s
Dec 06 10:15:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2676620455' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1782327393' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.183 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.267 281009 DEBUG nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.267 281009 DEBUG nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.491 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.493 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11367MB free_disk=41.71154022216797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.494 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.494 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.583 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Instance 89419cdc-1b37-4fdd-ad4b-013514e141a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.584 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.584 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=640MB phys_disk=41GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.636 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:42 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:42Z|00071|binding|INFO|Releasing lport 431aeba8-5962-4449-b69d-46c4360741a7 from this chassis (sb_readonly=0)
Dec 06 10:15:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:42.763 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3377775268' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:43.082 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:43.089 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:15:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1782327393' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4064141538' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3377775268' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:43.167 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:15:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:43.198 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:15:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:43.198 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:43 np0005548788.localdomain podman[313409]: 2025-12-06 10:15:43.287614915 +0000 UTC m=+0.063928336 container kill 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:15:43 np0005548788.localdomain dnsmasq[313115]: exiting on receipt of SIGTERM
Dec 06 10:15:43 np0005548788.localdomain systemd[1]: libpod-19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2.scope: Deactivated successfully.
Dec 06 10:15:43 np0005548788.localdomain podman[313422]: 2025-12-06 10:15:43.355270944 +0000 UTC m=+0.054322370 container died 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:15:43 np0005548788.localdomain systemd[1]: tmp-crun.q24qCD.mount: Deactivated successfully.
Dec 06 10:15:43 np0005548788.localdomain podman[313422]: 2025-12-06 10:15:43.391301031 +0000 UTC m=+0.090352427 container cleanup 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:15:43 np0005548788.localdomain systemd[1]: libpod-conmon-19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2.scope: Deactivated successfully.
Dec 06 10:15:43 np0005548788.localdomain podman[313424]: 2025-12-06 10:15:43.437315026 +0000 UTC m=+0.123292960 container remove 19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b56eac2-b5bc-4fdd-ab7f-704684288162, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:15:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:43.526 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:43.727 262572 INFO neutron.agent.dhcp.agent [None req-2d6fe75d-4f2b-4b14-8279-22273bcfd1ce - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:43.819 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:44 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:44.043 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:44 np0005548788.localdomain ceph-mon[293643]: pgmap v144: 177 pgs: 177 active+clean; 304 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 4.3 MiB/s wr, 233 op/s
Dec 06 10:15:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-895af9c7de1f357be93a099708275929c3eeb61d37c8ffdb41cd22b32f6ac895-merged.mount: Deactivated successfully.
Dec 06 10:15:44 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19585e9f60fbf99057689876372499b8f4207ab9aeccb217294526930fbd2ea2-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:44 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d5b56eac2\x2db5bc\x2d4fdd\x2dab7f\x2d704684288162.mount: Deactivated successfully.
Dec 06 10:15:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:44.611 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:15:45 np0005548788.localdomain podman[313452]: 2025-12-06 10:15:45.255790539 +0000 UTC m=+0.075343036 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:15:45 np0005548788.localdomain podman[313452]: 2025-12-06 10:15:45.269181971 +0000 UTC m=+0.088734438 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:45 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:15:46 np0005548788.localdomain ceph-mon[293643]: pgmap v145: 177 pgs: 177 active+clean; 304 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 682 KiB/s rd, 4.3 MiB/s wr, 132 op/s
Dec 06 10:15:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:46.592 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:46.762 159749 DEBUG eventlet.wsgi.server [-] (159749) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:46.763 159749 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: Accept: */*
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: Connection: close
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: Content-Type: text/plain
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: Host: 169.254.169.254
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: User-Agent: curl/7.84.0
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: X-Forwarded-For: 10.100.0.7
Dec 06 10:15:46 np0005548788.localdomain ovn_metadata_agent[159615]: X-Ovn-Network-Id: deb7774c-e96b-4e7f-88d7-ed9d740915f4 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 10:15:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.439 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.440 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.440 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:47 np0005548788.localdomain haproxy-metadata-proxy-deb7774c-e96b-4e7f-88d7-ed9d740915f4[313094]: 10.100.0.7:41160 [06/Dec/2025:10:15:46.760] listener listener/metadata 0/0/0/766/766 200 1657 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.526 159749 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.527 159749 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1673 time: 0.7637908
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.659 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "89419cdc-1b37-4fdd-ad4b-013514e141a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.660 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.662 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.663 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.663 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.666 281009 INFO nova.compute.manager [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Terminating instance
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.669 281009 DEBUG nova.compute.manager [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 06 10:15:47 np0005548788.localdomain kernel: device tap22b2d742-fd left promiscuous mode
Dec 06 10:15:47 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016147.7316] device (tap22b2d742-fd): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Dec 06 10:15:47 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:47Z|00072|binding|INFO|Releasing lport 22b2d742-fd5b-4bf4-898c-5da61dccc8af from this chassis (sb_readonly=0)
Dec 06 10:15:47 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:47Z|00073|binding|INFO|Setting lport 22b2d742-fd5b-4bf4-898c-5da61dccc8af down in Southbound
Dec 06 10:15:47 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:47Z|00074|binding|INFO|Removing iface tap22b2d742-fd ovn-installed in OVS
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.785 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.799 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:0c:e8 10.100.0.7'], port_security=['fa:16:3e:df:0c:e8 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '89419cdc-1b37-4fdd-ad4b-013514e141a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da995d8e002548889747013c0eeca935', 'neutron:revision_number': '4', 'neutron:security_group_ids': '581a4637-eff2-45f4-92f3-d575b736a840', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain', 'neutron:port_fip': '192.168.122.226'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cc41455-e125-49b5-8c35-a9f7e38c8e70, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=22b2d742-fd5b-4bf4-898c-5da61dccc8af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.799 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 22b2d742-fd5b-4bf4-898c-5da61dccc8af in datapath deb7774c-e96b-4e7f-88d7-ed9d740915f4 unbound from our chassis
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.801 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network deb7774c-e96b-4e7f-88d7-ed9d740915f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.805 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.803 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[5cbec226-9dd1-448b-a5a0-64ed9eb66a95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:47.804 159620 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4 namespace which is not needed anymore
Dec 06 10:15:47 np0005548788.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec 06 10:15:47 np0005548788.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Consumed 13.609s CPU time.
Dec 06 10:15:47 np0005548788.localdomain systemd-machined[202859]: Machine qemu-3-instance-00000009 terminated.
Dec 06 10:15:47 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016147.8901] manager: (tap22b2d742-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Dec 06 10:15:47 np0005548788.localdomain systemd[1]: tmp-crun.hPxY0G.mount: Deactivated successfully.
Dec 06 10:15:47 np0005548788.localdomain podman[313478]: 2025-12-06 10:15:47.901140393 +0000 UTC m=+0.098750216 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.915 281009 INFO nova.virt.libvirt.driver [-] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Instance destroyed successfully.
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.916 281009 DEBUG nova.objects.instance [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lazy-loading 'resources' on Instance uuid 89419cdc-1b37-4fdd-ad4b-013514e141a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.931 281009 DEBUG nova.virt.libvirt.vif [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:15:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005548788.localdomain',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=9,image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOFbbRfjULcygMcOe9ruBCDT26FUItJ0QaOdMYjK7r+vTQZdzo5MJf7E5zeeYjA9Lq0uCGxe80r602PlTcDAghr7yHc2AbveusYZlzoK21BzQDiZ1oDD95ZIQiYc0Nj+wQ==',key_name='tempest-keypair-1299567895',keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:15:28Z,launched_on='np0005548788.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005548788.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da995d8e002548889747013c0eeca935',ramdisk_id='',reservation_id='r-25vgqb6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-1001740243',owner_user_name='tempest-ServersV294TestFqdnHostnames-1001740243-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:15:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='da7bbd24eb95438897585b10577ea2e0',uuid=89419cdc-1b37-4fdd-ad4b-013514e141a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.932 281009 DEBUG nova.network.os_vif_util [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Converting VIF {"id": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "address": "fa:16:3e:df:0c:e8", "network": {"id": "deb7774c-e96b-4e7f-88d7-ed9d740915f4", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1078460514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "da995d8e002548889747013c0eeca935", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b2d742-fd", "ovs_interfaceid": "22b2d742-fd5b-4bf4-898c-5da61dccc8af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.933 281009 DEBUG nova.network.os_vif_util [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:0c:e8,bridge_name='br-int',has_traffic_filtering=True,id=22b2d742-fd5b-4bf4-898c-5da61dccc8af,network=Network(deb7774c-e96b-4e7f-88d7-ed9d740915f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b2d742-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.933 281009 DEBUG os_vif [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:0c:e8,bridge_name='br-int',has_traffic_filtering=True,id=22b2d742-fd5b-4bf4-898c-5da61dccc8af,network=Network(deb7774c-e96b-4e7f-88d7-ed9d740915f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b2d742-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.937 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.938 281009 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22b2d742-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.940 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.942 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:47.946 281009 INFO os_vif [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:0c:e8,bridge_name='br-int',has_traffic_filtering=True,id=22b2d742-fd5b-4bf4-898c-5da61dccc8af,network=Network(deb7774c-e96b-4e7f-88d7-ed9d740915f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b2d742-fd')
Dec 06 10:15:47 np0005548788.localdomain podman[313478]: 2025-12-06 10:15:47.990155668 +0000 UTC m=+0.187765501 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:15:48 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.021 281009 DEBUG nova.compute.manager [req-ae8303df-9d8a-4582-afb2-d4ba0c96695c req-62e64494-49a3-445d-b814-191a0dc21cb2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received event network-vif-unplugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.021 281009 DEBUG oslo_concurrency.lockutils [req-ae8303df-9d8a-4582-afb2-d4ba0c96695c req-62e64494-49a3-445d-b814-191a0dc21cb2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.022 281009 DEBUG oslo_concurrency.lockutils [req-ae8303df-9d8a-4582-afb2-d4ba0c96695c req-62e64494-49a3-445d-b814-191a0dc21cb2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.022 281009 DEBUG oslo_concurrency.lockutils [req-ae8303df-9d8a-4582-afb2-d4ba0c96695c req-62e64494-49a3-445d-b814-191a0dc21cb2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:48 np0005548788.localdomain ceph-mon[293643]: pgmap v146: 177 pgs: 177 active+clean; 304 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 682 KiB/s rd, 4.3 MiB/s wr, 132 op/s
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.023 281009 DEBUG nova.compute.manager [req-ae8303df-9d8a-4582-afb2-d4ba0c96695c req-62e64494-49a3-445d-b814-191a0dc21cb2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] No waiting events found dispatching network-vif-unplugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.023 281009 DEBUG nova.compute.manager [req-ae8303df-9d8a-4582-afb2-d4ba0c96695c req-62e64494-49a3-445d-b814-191a0dc21cb2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received event network-vif-unplugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 06 10:15:48 np0005548788.localdomain neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4[313075]: [NOTICE]   (313086) : haproxy version is 2.8.14-c23fe91
Dec 06 10:15:48 np0005548788.localdomain neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4[313075]: [NOTICE]   (313086) : path to executable is /usr/sbin/haproxy
Dec 06 10:15:48 np0005548788.localdomain neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4[313075]: [WARNING]  (313086) : Exiting Master process...
Dec 06 10:15:48 np0005548788.localdomain neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4[313075]: [ALERT]    (313086) : Current worker (313094) exited with code 143 (Terminated)
Dec 06 10:15:48 np0005548788.localdomain neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4[313075]: [WARNING]  (313086) : All workers exited. Exiting... (0)
Dec 06 10:15:48 np0005548788.localdomain systemd[1]: libpod-5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e.scope: Deactivated successfully.
Dec 06 10:15:48 np0005548788.localdomain podman[313526]: 2025-12-06 10:15:48.040558737 +0000 UTC m=+0.081515326 container died 5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:48 np0005548788.localdomain podman[313526]: 2025-12-06 10:15:48.087037865 +0000 UTC m=+0.127994434 container cleanup 5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:15:48 np0005548788.localdomain podman[313570]: 2025-12-06 10:15:48.163457934 +0000 UTC m=+0.060347716 container remove 5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.167 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[7979724c-64a1-49e3-93e7-99d12ea991fb]: (4, ('Sat Dec  6 10:15:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4 (5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e)\n5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e\nSat Dec  6 10:15:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4 (5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e)\n5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.169 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4083e5-9b17-4d2a-a80e-173482238467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.170 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdeb7774c-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.172 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548788.localdomain kernel: device tapdeb7774c-e0 left promiscuous mode
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.181 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.184 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[e875b34f-6b29-410d-9b8d-cb6998a14a03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.195 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[f30a22e7-73ba-41ed-a658-0e9c51decbee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:48 np0005548788.localdomain systemd[1]: libpod-conmon-5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e.scope: Deactivated successfully.
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.198 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d08931-76de-44bc-b16c-bd8f9f10d85f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.219 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e2badd-898e-4d60-9016-a40a7d73468a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1254969, 'reachable_time': 40798, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313586, 'error': None, 'target': 'ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.230 159785 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-deb7774c-e96b-4e7f-88d7-ed9d740915f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 06 10:15:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:48.231 159785 DEBUG oslo.privsep.daemon [-] privsep: reply[853a630d-8048-424c-960f-0ce41cf5d0f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:48 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:48.510 2 INFO neutron.agent.securitygroups_rpc [None req-0ed3e916-bdef-45c7-9c1d-50729e74f02a 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.575 281009 INFO nova.virt.libvirt.driver [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Deleting instance files /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9_del
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.576 281009 INFO nova.virt.libvirt.driver [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Deletion of /var/lib/nova/instances/89419cdc-1b37-4fdd-ad4b-013514e141a9_del complete
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.629 281009 INFO nova.compute.manager [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Took 0.96 seconds to destroy the instance on the hypervisor.
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.630 281009 DEBUG oslo.service.loopingcall [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.630 281009 DEBUG nova.compute.manager [-] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 06 10:15:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:48.631 281009 DEBUG nova.network.neutron [-] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 06 10:15:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:15:48 np0005548788.localdomain podman[313589]: 2025-12-06 10:15:48.774061308 +0000 UTC m=+0.087126968 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:15:48 np0005548788.localdomain podman[313589]: 2025-12-06 10:15:48.779528076 +0000 UTC m=+0.092593716 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 06 10:15:48 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:15:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-28d66a5ffc6b19c86ac80f6c23e0880e0c18b611c2125fb44a1ef881841a0b2c-merged.mount: Deactivated successfully.
Dec 06 10:15:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5cf4e44f9e6e9e0273df1ed88bfbbd9b992526ef86eb9e6f58f42136e5753c7e-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:48 np0005548788.localdomain systemd[1]: run-netns-ovnmeta\x2ddeb7774c\x2de96b\x2d4e7f\x2d88d7\x2ded9d740915f4.mount: Deactivated successfully.
Dec 06 10:15:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:15:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:15:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:49.615 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:15:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:15:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:15:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19210 "" "Go-http-client/1.1"
Dec 06 10:15:49 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:49.995 2 INFO neutron.agent.securitygroups_rpc [req-da70e705-23ca-45d2-aa6c-68d8abc979e1 req-8ff8aa52-7146-4285-bb8a-51bbd99a36a5 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Security group member updated ['581a4637-eff2-45f4-92f3-d575b736a840']
Dec 06 10:15:50 np0005548788.localdomain ceph-mon[293643]: pgmap v147: 177 pgs: 177 active+clean; 304 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 683 KiB/s rd, 4.3 MiB/s wr, 134 op/s
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.196 281009 DEBUG nova.compute.manager [req-2e6d3325-1150-4460-89b2-829a5cb5f773 req-0a024ac4-b250-4663-a1a5-fc8f32a2001f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received event network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.197 281009 DEBUG oslo_concurrency.lockutils [req-2e6d3325-1150-4460-89b2-829a5cb5f773 req-0a024ac4-b250-4663-a1a5-fc8f32a2001f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.197 281009 DEBUG oslo_concurrency.lockutils [req-2e6d3325-1150-4460-89b2-829a5cb5f773 req-0a024ac4-b250-4663-a1a5-fc8f32a2001f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.197 281009 DEBUG oslo_concurrency.lockutils [req-2e6d3325-1150-4460-89b2-829a5cb5f773 req-0a024ac4-b250-4663-a1a5-fc8f32a2001f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.198 281009 DEBUG nova.compute.manager [req-2e6d3325-1150-4460-89b2-829a5cb5f773 req-0a024ac4-b250-4663-a1a5-fc8f32a2001f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] No waiting events found dispatching network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.198 281009 WARNING nova.compute.manager [req-2e6d3325-1150-4460-89b2-829a5cb5f773 req-0a024ac4-b250-4663-a1a5-fc8f32a2001f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received unexpected event network-vif-plugged-22b2d742-fd5b-4bf4-898c-5da61dccc8af for instance with vm_state active and task_state deleting.
Dec 06 10:15:50 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:50.504 262572 INFO neutron.agent.linux.ip_lib [None req-b2842986-d7bd-4a82-9984-e06004db585b - - - - - -] Device tap1ef455c8-89 cannot be used as it has no MAC address
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.527 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:50 np0005548788.localdomain kernel: device tap1ef455c8-89 entered promiscuous mode
Dec 06 10:15:50 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016150.5329] manager: (tap1ef455c8-89): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Dec 06 10:15:50 np0005548788.localdomain systemd-udevd[313472]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:50Z|00075|binding|INFO|Claiming lport 1ef455c8-89fb-4d7b-8888-aa80d3de2436 for this chassis.
Dec 06 10:15:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:50Z|00076|binding|INFO|1ef455c8-89fb-4d7b-8888-aa80d3de2436: Claiming unknown
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.537 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:50.549 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-9a4364cd-0c9d-444a-a049-bf1e108774d5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a4364cd-0c9d-444a-a049-bf1e108774d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7f7a421249944a7a68a80478e32eed0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6f66b3bf-6539-447b-8759-20d6a34c5753, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=1ef455c8-89fb-4d7b-8888-aa80d3de2436) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:50.551 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 1ef455c8-89fb-4d7b-8888-aa80d3de2436 in datapath 9a4364cd-0c9d-444a-a049-bf1e108774d5 bound to our chassis
Dec 06 10:15:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:50.554 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1587d092-eeb1-4654-a2b1-fba4fecdd74c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:15:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:50.554 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a4364cd-0c9d-444a-a049-bf1e108774d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:50.555 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c60b2ed-fde9-476e-a288-811dcfea5307]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:50 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap1ef455c8-89: No such device
Dec 06 10:15:50 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap1ef455c8-89: No such device
Dec 06 10:15:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:50Z|00077|binding|INFO|Setting lport 1ef455c8-89fb-4d7b-8888-aa80d3de2436 ovn-installed in OVS
Dec 06 10:15:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:50Z|00078|binding|INFO|Setting lport 1ef455c8-89fb-4d7b-8888-aa80d3de2436 up in Southbound
Dec 06 10:15:50 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap1ef455c8-89: No such device
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.576 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:50 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap1ef455c8-89: No such device
Dec 06 10:15:50 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap1ef455c8-89: No such device
Dec 06 10:15:50 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap1ef455c8-89: No such device
Dec 06 10:15:50 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap1ef455c8-89: No such device
Dec 06 10:15:50 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap1ef455c8-89: No such device
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.623 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.652 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.756 281009 DEBUG nova.network.neutron [-] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.791 281009 INFO nova.compute.manager [-] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Took 2.16 seconds to deallocate network for instance.
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.831 281009 DEBUG nova.compute.manager [req-fec2ee8f-8767-42ce-8214-0b3b294a43a1 req-8bba2f89-2312-4222-b58a-04abfb641a1f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Received event network-vif-deleted-22b2d742-fd5b-4bf4-898c-5da61dccc8af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.843 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.844 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:50.913 281009 DEBUG oslo_concurrency.processutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1067841479' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:51.373 281009 DEBUG oslo_concurrency.processutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:51.381 281009 DEBUG nova.compute.provider_tree [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:15:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:51.402 281009 DEBUG nova.scheduler.client.report [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:15:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:51.432 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:51.472 281009 INFO nova.scheduler.client.report [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Deleted allocations for instance 89419cdc-1b37-4fdd-ad4b-013514e141a9
Dec 06 10:15:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:51.550 281009 DEBUG oslo_concurrency.lockutils [None req-da70e705-23ca-45d2-aa6c-68d8abc979e1 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Lock "89419cdc-1b37-4fdd-ad4b-013514e141a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.890s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:51.553 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:51 np0005548788.localdomain podman[313706]: 
Dec 06 10:15:51 np0005548788.localdomain podman[313706]: 2025-12-06 10:15:51.72982627 +0000 UTC m=+0.091537764 container create 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 10:15:51 np0005548788.localdomain systemd[1]: Started libpod-conmon-3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08.scope.
Dec 06 10:15:51 np0005548788.localdomain podman[313706]: 2025-12-06 10:15:51.687486919 +0000 UTC m=+0.049198483 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:15:51 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:51 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/180c7d5604fc9a0ccf5893fd06dd2abf388ce2c705eb234ca354ebf3c71040b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:51 np0005548788.localdomain podman[313706]: 2025-12-06 10:15:51.807020512 +0000 UTC m=+0.168732006 container init 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:51 np0005548788.localdomain podman[313706]: 2025-12-06 10:15:51.815909295 +0000 UTC m=+0.177620799 container start 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:15:51 np0005548788.localdomain dnsmasq[313724]: started, version 2.85 cachesize 150
Dec 06 10:15:51 np0005548788.localdomain dnsmasq[313724]: DNS service limited to local subnets
Dec 06 10:15:51 np0005548788.localdomain dnsmasq[313724]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:15:51 np0005548788.localdomain dnsmasq[313724]: warning: no upstream servers configured
Dec 06 10:15:51 np0005548788.localdomain dnsmasq-dhcp[313724]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:15:51 np0005548788.localdomain dnsmasq[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/addn_hosts - 0 addresses
Dec 06 10:15:51 np0005548788.localdomain dnsmasq-dhcp[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/host
Dec 06 10:15:51 np0005548788.localdomain dnsmasq-dhcp[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/opts
Dec 06 10:15:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:52 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:52.012 262572 INFO neutron.agent.dhcp.agent [None req-ea474030-4cfa-4e1c-8476-384e2627641c - - - - - -] DHCP configuration for ports {'9f12c7ec-60d6-4ed7-b078-e3dd3f1c3d8c'} is completed
Dec 06 10:15:52 np0005548788.localdomain ceph-mon[293643]: pgmap v148: 177 pgs: 177 active+clean; 304 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 679 KiB/s rd, 4.3 MiB/s wr, 129 op/s
Dec 06 10:15:52 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1067841479' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:52 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:52.593 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:52Z, description=, device_id=dafd896d-42a7-4e64-be65-9942f12d900d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c706e280>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c706e040>], id=be9a8eed-98c4-4724-b051-713d7bb947ff, ip_allocation=immediate, mac_address=fa:16:3e:22:e6:83, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:47Z, description=, dns_domain=, id=9a4364cd-0c9d-444a-a049-bf1e108774d5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-1017775685-network, port_security_enabled=True, project_id=f7f7a421249944a7a68a80478e32eed0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20962, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=834, status=ACTIVE, subnets=['8e21912d-daa1-4665-bde5-e05c599762eb'], tags=[], tenant_id=f7f7a421249944a7a68a80478e32eed0, updated_at=2025-12-06T10:15:48Z, vlan_transparent=None, network_id=9a4364cd-0c9d-444a-a049-bf1e108774d5, port_security_enabled=False, project_id=f7f7a421249944a7a68a80478e32eed0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=880, status=DOWN, tags=[], tenant_id=f7f7a421249944a7a68a80478e32eed0, updated_at=2025-12-06T10:15:52Z on network 9a4364cd-0c9d-444a-a049-bf1e108774d5
Dec 06 10:15:52 np0005548788.localdomain dnsmasq[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/addn_hosts - 1 addresses
Dec 06 10:15:52 np0005548788.localdomain dnsmasq-dhcp[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/host
Dec 06 10:15:52 np0005548788.localdomain dnsmasq-dhcp[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/opts
Dec 06 10:15:52 np0005548788.localdomain podman[313741]: 2025-12-06 10:15:52.812566863 +0000 UTC m=+0.067575797 container kill 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:52.940 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:52 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:52.950 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:52Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c687d1f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c688e1f0>], id=feb6a13d-305a-4541-a50e-4988833ecf82, ip_allocation=immediate, mac_address=fa:16:3e:e5:ea:4a, name=tempest-parent-1146072664, network_id=45604602-bc87-4608-9881-9568cbf90870, port_security_enabled=True, project_id=9167331b2c424ef6961b096b551f8434, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=13, security_groups=['4c82b56e-0fc5-4c7f-8922-ceb8236815fd'], standard_attr_id=627, status=DOWN, tags=[], tenant_id=9167331b2c424ef6961b096b551f8434, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c687d0a0>], trunk_id=113740e8-6296-4106-ae10-22f16d519315, updated_at=2025-12-06T10:15:51Z on network 45604602-bc87-4608-9881-9568cbf90870
Dec 06 10:15:53 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:53.082 262572 INFO neutron.agent.dhcp.agent [None req-2367e758-87bf-49a3-8f33-24affb2eeb11 - - - - - -] DHCP configuration for ports {'be9a8eed-98c4-4724-b051-713d7bb947ff'} is completed
Dec 06 10:15:53 np0005548788.localdomain dnsmasq[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/addn_hosts - 2 addresses
Dec 06 10:15:53 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/host
Dec 06 10:15:53 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/opts
Dec 06 10:15:53 np0005548788.localdomain podman[313778]: 2025-12-06 10:15:53.226953748 +0000 UTC m=+0.067290369 container kill f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:15:53 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:53.403 262572 INFO neutron.agent.dhcp.agent [None req-4cbbaf89-e87b-452b-ba04-f1bc9b95ac05 - - - - - -] DHCP configuration for ports {'feb6a13d-305a-4541-a50e-4988833ecf82'} is completed
Dec 06 10:15:53 np0005548788.localdomain sudo[313798]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:15:53 np0005548788.localdomain sudo[313798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:15:53 np0005548788.localdomain sudo[313798]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:53 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:53.971 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:52Z, description=, device_id=dafd896d-42a7-4e64-be65-9942f12d900d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6879d90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6879370>], id=be9a8eed-98c4-4724-b051-713d7bb947ff, ip_allocation=immediate, mac_address=fa:16:3e:22:e6:83, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:47Z, description=, dns_domain=, id=9a4364cd-0c9d-444a-a049-bf1e108774d5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-1017775685-network, port_security_enabled=True, project_id=f7f7a421249944a7a68a80478e32eed0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20962, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=834, status=ACTIVE, subnets=['8e21912d-daa1-4665-bde5-e05c599762eb'], tags=[], tenant_id=f7f7a421249944a7a68a80478e32eed0, updated_at=2025-12-06T10:15:48Z, vlan_transparent=None, network_id=9a4364cd-0c9d-444a-a049-bf1e108774d5, port_security_enabled=False, project_id=f7f7a421249944a7a68a80478e32eed0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=880, status=DOWN, tags=[], tenant_id=f7f7a421249944a7a68a80478e32eed0, updated_at=2025-12-06T10:15:52Z on network 9a4364cd-0c9d-444a-a049-bf1e108774d5
Dec 06 10:15:54 np0005548788.localdomain sudo[313816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:15:54 np0005548788.localdomain sudo[313816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:15:54 np0005548788.localdomain dnsmasq[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/addn_hosts - 1 addresses
Dec 06 10:15:54 np0005548788.localdomain dnsmasq-dhcp[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/host
Dec 06 10:15:54 np0005548788.localdomain dnsmasq-dhcp[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/opts
Dec 06 10:15:54 np0005548788.localdomain podman[313851]: 2025-12-06 10:15:54.186244728 +0000 UTC m=+0.057154188 container kill 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:15:54 np0005548788.localdomain ceph-mon[293643]: pgmap v149: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 716 KiB/s rd, 4.3 MiB/s wr, 184 op/s
Dec 06 10:15:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:15:54.453 262572 INFO neutron.agent.dhcp.agent [None req-30e0f659-984f-4751-99ee-5fd0035ecf73 - - - - - -] DHCP configuration for ports {'be9a8eed-98c4-4724-b051-713d7bb947ff'} is completed
Dec 06 10:15:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:54.649 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:54 np0005548788.localdomain sudo[313816]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:15:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:15:54 np0005548788.localdomain sudo[313903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:15:54 np0005548788.localdomain sudo[313903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:15:54 np0005548788.localdomain sudo[313903]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:55.040 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:15:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:15:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:15:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:15:56 np0005548788.localdomain ceph-mon[293643]: pgmap v150: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 28 KiB/s wr, 56 op/s
Dec 06 10:15:56 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:56.498 2 INFO neutron.agent.securitygroups_rpc [None req-97ddf7c5-61a2-4ea7-a37a-afceb032745e 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:15:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:15:56 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:15:56.993 2 INFO neutron.agent.securitygroups_rpc [None req-e28bc6dc-5f9c-4334-81fe-cd06724fee5d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Security group member updated ['4c82b56e-0fc5-4c7f-8922-ceb8236815fd']
Dec 06 10:15:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:15:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:57.991 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:15:58 np0005548788.localdomain ceph-mon[293643]: pgmap v151: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 28 KiB/s wr, 56 op/s
Dec 06 10:15:58 np0005548788.localdomain systemd[1]: tmp-crun.6ePbXs.mount: Deactivated successfully.
Dec 06 10:15:58 np0005548788.localdomain dnsmasq[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/addn_hosts - 0 addresses
Dec 06 10:15:58 np0005548788.localdomain dnsmasq-dhcp[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/host
Dec 06 10:15:58 np0005548788.localdomain podman[313939]: 2025-12-06 10:15:58.864537076 +0000 UTC m=+0.073951785 container kill 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:58 np0005548788.localdomain dnsmasq-dhcp[313724]: read /var/lib/neutron/dhcp/9a4364cd-0c9d-444a-a049-bf1e108774d5/opts
Dec 06 10:15:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:59.119 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:59 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:59Z|00079|binding|INFO|Releasing lport 1ef455c8-89fb-4d7b-8888-aa80d3de2436 from this chassis (sb_readonly=0)
Dec 06 10:15:59 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:15:59Z|00080|binding|INFO|Setting lport 1ef455c8-89fb-4d7b-8888-aa80d3de2436 down in Southbound
Dec 06 10:15:59 np0005548788.localdomain kernel: device tap1ef455c8-89 left promiscuous mode
Dec 06 10:15:59 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:59.144 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-9a4364cd-0c9d-444a-a049-bf1e108774d5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a4364cd-0c9d-444a-a049-bf1e108774d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7f7a421249944a7a68a80478e32eed0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6f66b3bf-6539-447b-8759-20d6a34c5753, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=1ef455c8-89fb-4d7b-8888-aa80d3de2436) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:59 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:59.145 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 1ef455c8-89fb-4d7b-8888-aa80d3de2436 in datapath 9a4364cd-0c9d-444a-a049-bf1e108774d5 unbound from our chassis
Dec 06 10:15:59 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:59.147 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a4364cd-0c9d-444a-a049-bf1e108774d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:59 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:15:59.147 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[d59b9965-298f-4f58-b5e6-f3a8bbf61b68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:59.151 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:15:59.654 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:00 np0005548788.localdomain ceph-mon[293643]: pgmap v152: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 28 KiB/s wr, 56 op/s
Dec 06 10:16:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:16:00 np0005548788.localdomain systemd[1]: tmp-crun.1cZqjK.mount: Deactivated successfully.
Dec 06 10:16:00 np0005548788.localdomain podman[313962]: 2025-12-06 10:16:00.279265691 +0000 UTC m=+0.098153267 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:16:00 np0005548788.localdomain podman[313962]: 2025-12-06 10:16:00.35701894 +0000 UTC m=+0.175906526 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:16:00 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:00.366 2 INFO neutron.agent.securitygroups_rpc [None req-96bdfd29-c14f-4ef8-b3b0-32d637d65e93 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:00 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:16:01 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:01.609 2 INFO neutron.agent.securitygroups_rpc [None req-7d84f32e-96fa-49ab-97a7-a8cf557247b9 b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Security group member updated ['4c82b56e-0fc5-4c7f-8922-ceb8236815fd']
Dec 06 10:16:01 np0005548788.localdomain podman[314004]: 2025-12-06 10:16:01.873750451 +0000 UTC m=+0.062236433 container kill f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:16:01 np0005548788.localdomain dnsmasq[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/addn_hosts - 1 addresses
Dec 06 10:16:01 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/host
Dec 06 10:16:01 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/opts
Dec 06 10:16:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:02 np0005548788.localdomain ceph-mon[293643]: pgmap v153: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Dec 06 10:16:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:02.907 281009 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765016147.9053376, 89419cdc-1b37-4fdd-ad4b-013514e141a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:16:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:02.908 281009 INFO nova.compute.manager [-] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] VM Stopped (Lifecycle Event)
Dec 06 10:16:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:02.930 281009 DEBUG nova.compute.manager [None req-0a8ac773-aeb3-4392-afee-06536518d471 - - - - - -] [instance: 89419cdc-1b37-4fdd-ad4b-013514e141a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:16:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:02.994 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:03.271 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:03.386 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:16:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:16:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:16:04 np0005548788.localdomain ceph-mon[293643]: pgmap v154: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Dec 06 10:16:04 np0005548788.localdomain podman[314026]: 2025-12-06 10:16:04.255301158 +0000 UTC m=+0.076024567 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm)
Dec 06 10:16:04 np0005548788.localdomain podman[314026]: 2025-12-06 10:16:04.268014308 +0000 UTC m=+0.088737777 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, config_id=edpm)
Dec 06 10:16:04 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:16:04 np0005548788.localdomain systemd[1]: tmp-crun.IEbgkv.mount: Deactivated successfully.
Dec 06 10:16:04 np0005548788.localdomain podman[314024]: 2025-12-06 10:16:04.319414618 +0000 UTC m=+0.149701202 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:16:04 np0005548788.localdomain podman[314024]: 2025-12-06 10:16:04.354616809 +0000 UTC m=+0.184903463 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:16:04 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:16:04 np0005548788.localdomain podman[314025]: 2025-12-06 10:16:04.381890018 +0000 UTC m=+0.206085844 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:16:04 np0005548788.localdomain podman[314025]: 2025-12-06 10:16:04.397744625 +0000 UTC m=+0.221940481 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:16:04 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:16:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:04.656 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:04 np0005548788.localdomain dnsmasq[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/addn_hosts - 0 addresses
Dec 06 10:16:04 np0005548788.localdomain podman[314102]: 2025-12-06 10:16:04.815339598 +0000 UTC m=+0.067958809 container kill f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:16:04 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/host
Dec 06 10:16:04 np0005548788.localdomain dnsmasq-dhcp[310698]: read /var/lib/neutron/dhcp/45604602-bc87-4608-9881-9568cbf90870/opts
Dec 06 10:16:04 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:04Z|00081|ovn_bfd|INFO|Disabled BFD on interface ovn-afa07b-0
Dec 06 10:16:04 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:04Z|00082|ovn_bfd|INFO|Disabled BFD on interface ovn-bd2a75-0
Dec 06 10:16:04 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:04Z|00083|ovn_bfd|INFO|Disabled BFD on interface ovn-ca3c1f-0
Dec 06 10:16:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:04.891 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:04.895 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:04.914 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:05.313 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:05 np0005548788.localdomain kernel: device tap791cf6d4-c9 left promiscuous mode
Dec 06 10:16:05 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:05Z|00084|binding|INFO|Releasing lport 791cf6d4-c971-4e4f-960d-1a47244446a2 from this chassis (sb_readonly=0)
Dec 06 10:16:05 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:05Z|00085|binding|INFO|Setting lport 791cf6d4-c971-4e4f-960d-1a47244446a2 down in Southbound
Dec 06 10:16:05 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:05.322 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-45604602-bc87-4608-9881-9568cbf90870', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45604602-bc87-4608-9881-9568cbf90870', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9167331b2c424ef6961b096b551f8434', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d40d335f-7e85-43c3-894d-993c12735497, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=791cf6d4-c971-4e4f-960d-1a47244446a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:05 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:05.324 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 791cf6d4-c971-4e4f-960d-1a47244446a2 in datapath 45604602-bc87-4608-9881-9568cbf90870 unbound from our chassis
Dec 06 10:16:05 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:05.328 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45604602-bc87-4608-9881-9568cbf90870, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:16:05 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:05.329 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[93a3457d-55aa-46c3-90cf-488324038fdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:05.339 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:05 np0005548788.localdomain dnsmasq[313724]: exiting on receipt of SIGTERM
Dec 06 10:16:05 np0005548788.localdomain podman[314140]: 2025-12-06 10:16:05.345925823 +0000 UTC m=+0.103300345 container kill 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:05 np0005548788.localdomain systemd[1]: libpod-3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08.scope: Deactivated successfully.
Dec 06 10:16:05 np0005548788.localdomain podman[314153]: 2025-12-06 10:16:05.408834567 +0000 UTC m=+0.051953958 container died 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:16:05 np0005548788.localdomain podman[314153]: 2025-12-06 10:16:05.497052157 +0000 UTC m=+0.140171568 container cleanup 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:05 np0005548788.localdomain systemd[1]: libpod-conmon-3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08.scope: Deactivated successfully.
Dec 06 10:16:05 np0005548788.localdomain podman[314160]: 2025-12-06 10:16:05.521749517 +0000 UTC m=+0.150041153 container remove 3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a4364cd-0c9d-444a-a049-bf1e108774d5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:16:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:05.785 262572 INFO neutron.agent.dhcp.agent [None req-b229a502-3833-4e22-b04d-83b0588e527d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:05 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:05.826 2 INFO neutron.agent.securitygroups_rpc [None req-be960e3b-e920-4ec4-8e87-e409a0af324a 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:05.886 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:06 np0005548788.localdomain ceph-mon[293643]: pgmap v155: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-180c7d5604fc9a0ccf5893fd06dd2abf388ce2c705eb234ca354ebf3c71040b2-merged.mount: Deactivated successfully.
Dec 06 10:16:06 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3bbe94708e6d8e0f4f3b3c9177c4cb9e68501907fa44ed50f03cf296201eed08-userdata-shm.mount: Deactivated successfully.
Dec 06 10:16:06 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d9a4364cd\x2d0c9d\x2d444a\x2da049\x2dbf1e108774d5.mount: Deactivated successfully.
Dec 06 10:16:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:07 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:07.167 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:08.032 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:08 np0005548788.localdomain ceph-mon[293643]: pgmap v156: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:16:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:16:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:16:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:16:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:16:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:16:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:16:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:09.693 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:10 np0005548788.localdomain ceph-mon[293643]: pgmap v157: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:11.325 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:11.325 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:11.327 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:16:11 np0005548788.localdomain dnsmasq[310698]: exiting on receipt of SIGTERM
Dec 06 10:16:11 np0005548788.localdomain podman[314203]: 2025-12-06 10:16:11.588432841 +0000 UTC m=+0.066864356 container kill f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:16:11 np0005548788.localdomain systemd[1]: tmp-crun.G4OLFZ.mount: Deactivated successfully.
Dec 06 10:16:11 np0005548788.localdomain systemd[1]: libpod-f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd.scope: Deactivated successfully.
Dec 06 10:16:11 np0005548788.localdomain podman[314215]: 2025-12-06 10:16:11.645465253 +0000 UTC m=+0.042291580 container died f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:16:11 np0005548788.localdomain podman[314215]: 2025-12-06 10:16:11.678988723 +0000 UTC m=+0.075815040 container cleanup f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:16:11 np0005548788.localdomain systemd[1]: libpod-conmon-f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd.scope: Deactivated successfully.
Dec 06 10:16:11 np0005548788.localdomain podman[314222]: 2025-12-06 10:16:11.748342935 +0000 UTC m=+0.132798403 container remove f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45604602-bc87-4608-9881-9568cbf90870, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:16:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:11.774 262572 INFO neutron.agent.dhcp.agent [None req-ee1a7f8f-e72c-4bf3-973b-ad9251d03026 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:11.850 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:12.160 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:12 np0005548788.localdomain ceph-mon[293643]: pgmap v158: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-44745714ded2a5c6c27641d7523732339da4255f252ccd80706818d6e03efb3f-merged.mount: Deactivated successfully.
Dec 06 10:16:12 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f705d94bb9386d029b4c2c2c98db88f32ef32dc19fee50482d942a9a216d82dd-userdata-shm.mount: Deactivated successfully.
Dec 06 10:16:12 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d45604602\x2dbc87\x2d4608\x2d9881\x2d9568cbf90870.mount: Deactivated successfully.
Dec 06 10:16:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:13.034 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:14.000 262572 INFO neutron.agent.linux.ip_lib [None req-26aeaf59-4185-4e92-b397-77090aea4b31 - - - - - -] Device tap9e965e19-7a cannot be used as it has no MAC address
Dec 06 10:16:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:14.028 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:14 np0005548788.localdomain kernel: device tap9e965e19-7a entered promiscuous mode
Dec 06 10:16:14 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016174.0405] manager: (tap9e965e19-7a): new Generic device (/org/freedesktop/NetworkManager/Devices/23)
Dec 06 10:16:14 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:14Z|00086|binding|INFO|Claiming lport 9e965e19-7a0c-4962-8985-a11388c56b05 for this chassis.
Dec 06 10:16:14 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:14Z|00087|binding|INFO|9e965e19-7a0c-4962-8985-a11388c56b05: Claiming unknown
Dec 06 10:16:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:14.041 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:14 np0005548788.localdomain systemd-udevd[314251]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:16:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:14.053 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-ab15e875-2913-4eec-9b0f-c69b1847b96b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab15e875-2913-4eec-9b0f-c69b1847b96b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a22ced63e346459ab637424ae7833af7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1d1d55b-61d9-4e60-9a48-af1eaa336718, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=9e965e19-7a0c-4962-8985-a11388c56b05) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:14.054 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 9e965e19-7a0c-4962-8985-a11388c56b05 in datapath ab15e875-2913-4eec-9b0f-c69b1847b96b bound to our chassis
Dec 06 10:16:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:14.057 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ab15e875-2913-4eec-9b0f-c69b1847b96b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:16:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:14.058 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[455f011f-05d3-4bd5-b007-fbaa5441d678]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:14 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap9e965e19-7a: No such device
Dec 06 10:16:14 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:14Z|00088|binding|INFO|Setting lport 9e965e19-7a0c-4962-8985-a11388c56b05 ovn-installed in OVS
Dec 06 10:16:14 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:14Z|00089|binding|INFO|Setting lport 9e965e19-7a0c-4962-8985-a11388c56b05 up in Southbound
Dec 06 10:16:14 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap9e965e19-7a: No such device
Dec 06 10:16:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:14.081 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:14 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap9e965e19-7a: No such device
Dec 06 10:16:14 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap9e965e19-7a: No such device
Dec 06 10:16:14 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap9e965e19-7a: No such device
Dec 06 10:16:14 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap9e965e19-7a: No such device
Dec 06 10:16:14 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap9e965e19-7a: No such device
Dec 06 10:16:14 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap9e965e19-7a: No such device
Dec 06 10:16:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:14.128 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:14.158 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:14 np0005548788.localdomain ceph-mon[293643]: pgmap v159: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:14 np0005548788.localdomain snmpd[67478]: empty variable list in _query
Dec 06 10:16:14 np0005548788.localdomain snmpd[67478]: empty variable list in _query
Dec 06 10:16:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:14.732 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:15 np0005548788.localdomain podman[314322]: 
Dec 06 10:16:15 np0005548788.localdomain podman[314322]: 2025-12-06 10:16:15.153026544 +0000 UTC m=+0.102250882 container create 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:16:15 np0005548788.localdomain podman[314322]: 2025-12-06 10:16:15.100335675 +0000 UTC m=+0.049560033 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:16:15 np0005548788.localdomain systemd[1]: Started libpod-conmon-767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585.scope.
Dec 06 10:16:15 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:16:15 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef16a981332c5405a0296fd90089109d52bd1f32e045484279b81ab5decd3917/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:16:15 np0005548788.localdomain podman[314322]: 2025-12-06 10:16:15.232976442 +0000 UTC m=+0.182200770 container init 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:16:15 np0005548788.localdomain podman[314322]: 2025-12-06 10:16:15.243594847 +0000 UTC m=+0.192819175 container start 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:16:15 np0005548788.localdomain dnsmasq[314340]: started, version 2.85 cachesize 150
Dec 06 10:16:15 np0005548788.localdomain dnsmasq[314340]: DNS service limited to local subnets
Dec 06 10:16:15 np0005548788.localdomain dnsmasq[314340]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:16:15 np0005548788.localdomain dnsmasq[314340]: warning: no upstream servers configured
Dec 06 10:16:15 np0005548788.localdomain dnsmasq-dhcp[314340]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:16:15 np0005548788.localdomain dnsmasq[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/addn_hosts - 0 addresses
Dec 06 10:16:15 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/host
Dec 06 10:16:15 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/opts
Dec 06 10:16:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:15.328 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:16:15 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:15.380 262572 INFO neutron.agent.dhcp.agent [None req-3aea201c-29cf-43c1-8f7e-eb35ffa01f8b - - - - - -] DHCP configuration for ports {'28981b66-e562-422a-9fee-061d4ebef66f'} is completed
Dec 06 10:16:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:16:16 np0005548788.localdomain ceph-mon[293643]: pgmap v160: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:16 np0005548788.localdomain podman[314341]: 2025-12-06 10:16:16.265254065 +0000 UTC m=+0.086096128 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:16:16 np0005548788.localdomain podman[314341]: 2025-12-06 10:16:16.280985438 +0000 UTC m=+0.101827511 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible)
Dec 06 10:16:16 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:16:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:18.037 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:18 np0005548788.localdomain ceph-mon[293643]: pgmap v161: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:16:18 np0005548788.localdomain podman[314361]: 2025-12-06 10:16:18.242316802 +0000 UTC m=+0.071943362 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:16:18 np0005548788.localdomain podman[314361]: 2025-12-06 10:16:18.249474412 +0000 UTC m=+0.079101012 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:16:18 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:16:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:16:19 np0005548788.localdomain podman[314383]: 2025-12-06 10:16:19.268328573 +0000 UTC m=+0.088876993 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:19 np0005548788.localdomain podman[314383]: 2025-12-06 10:16:19.278780474 +0000 UTC m=+0.099328864 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:16:19 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:16:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:16:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:16:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:16:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:16:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:16:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19202 "" "Go-http-client/1.1"
Dec 06 10:16:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:19.734 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:19 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:19.795 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:19Z, description=, device_id=1a41ced9-29be-4992-bdce-4aa27040262d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6899fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6899730>], id=eb99daaf-41c7-49c1-bb7b-7299afb74248, ip_allocation=immediate, mac_address=fa:16:3e:b2:00:ca, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:11Z, description=, dns_domain=, id=ab15e875-2913-4eec-9b0f-c69b1847b96b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1180868772, port_security_enabled=True, project_id=a22ced63e346459ab637424ae7833af7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31602, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=947, status=ACTIVE, subnets=['46565012-aeac-4688-8ff5-8ecb6b53d353'], tags=[], tenant_id=a22ced63e346459ab637424ae7833af7, updated_at=2025-12-06T10:16:12Z, vlan_transparent=None, network_id=ab15e875-2913-4eec-9b0f-c69b1847b96b, port_security_enabled=False, project_id=a22ced63e346459ab637424ae7833af7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=984, status=DOWN, tags=[], tenant_id=a22ced63e346459ab637424ae7833af7, updated_at=2025-12-06T10:16:19Z on network ab15e875-2913-4eec-9b0f-c69b1847b96b
Dec 06 10:16:20 np0005548788.localdomain dnsmasq[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/addn_hosts - 1 addresses
Dec 06 10:16:20 np0005548788.localdomain podman[314418]: 2025-12-06 10:16:20.043775592 +0000 UTC m=+0.070164757 container kill 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:16:20 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/host
Dec 06 10:16:20 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/opts
Dec 06 10:16:20 np0005548788.localdomain ceph-mon[293643]: pgmap v162: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:20 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:20.370 262572 INFO neutron.agent.dhcp.agent [None req-6a1e1600-d5ad-4bf5-95df-84f611a3c9ae - - - - - -] DHCP configuration for ports {'eb99daaf-41c7-49c1-bb7b-7299afb74248'} is completed
Dec 06 10:16:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:21.835 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:19Z, description=, device_id=1a41ced9-29be-4992-bdce-4aa27040262d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c7106850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c7106b50>], id=eb99daaf-41c7-49c1-bb7b-7299afb74248, ip_allocation=immediate, mac_address=fa:16:3e:b2:00:ca, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:11Z, description=, dns_domain=, id=ab15e875-2913-4eec-9b0f-c69b1847b96b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1180868772, port_security_enabled=True, project_id=a22ced63e346459ab637424ae7833af7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31602, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=947, status=ACTIVE, subnets=['46565012-aeac-4688-8ff5-8ecb6b53d353'], tags=[], tenant_id=a22ced63e346459ab637424ae7833af7, updated_at=2025-12-06T10:16:12Z, vlan_transparent=None, network_id=ab15e875-2913-4eec-9b0f-c69b1847b96b, port_security_enabled=False, project_id=a22ced63e346459ab637424ae7833af7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=984, status=DOWN, tags=[], tenant_id=a22ced63e346459ab637424ae7833af7, updated_at=2025-12-06T10:16:19Z on network ab15e875-2913-4eec-9b0f-c69b1847b96b
Dec 06 10:16:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:22 np0005548788.localdomain dnsmasq[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/addn_hosts - 1 addresses
Dec 06 10:16:22 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/host
Dec 06 10:16:22 np0005548788.localdomain podman[314456]: 2025-12-06 10:16:22.057831105 +0000 UTC m=+0.065021219 container kill 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:22 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/opts
Dec 06 10:16:22 np0005548788.localdomain ceph-mon[293643]: pgmap v163: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:22 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:22.391 262572 INFO neutron.agent.dhcp.agent [None req-b31635d7-34c6-4591-99f5-7540e402fc10 - - - - - -] DHCP configuration for ports {'eb99daaf-41c7-49c1-bb7b-7299afb74248'} is completed
Dec 06 10:16:22 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:22.878 2 INFO neutron.agent.securitygroups_rpc [None req-08283fcf-8c3f-4ce1-8201-1776fe09eb71 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:22 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:22.916 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:22Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c686baf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c686ba60>], id=24e5e254-d0e4-4f85-8bf8-e66fa08c823b, ip_allocation=immediate, mac_address=fa:16:3e:53:11:6c, name=tempest-FloatingIPTestJSON-687576435, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:11Z, description=, dns_domain=, id=ab15e875-2913-4eec-9b0f-c69b1847b96b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1180868772, port_security_enabled=True, project_id=a22ced63e346459ab637424ae7833af7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31602, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=947, status=ACTIVE, subnets=['46565012-aeac-4688-8ff5-8ecb6b53d353'], tags=[], tenant_id=a22ced63e346459ab637424ae7833af7, updated_at=2025-12-06T10:16:12Z, vlan_transparent=None, network_id=ab15e875-2913-4eec-9b0f-c69b1847b96b, port_security_enabled=True, project_id=a22ced63e346459ab637424ae7833af7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['55c805cd-9bbe-4434-83af-206ee080e6b9'], standard_attr_id=1021, status=DOWN, tags=[], tenant_id=a22ced63e346459ab637424ae7833af7, updated_at=2025-12-06T10:16:22Z on network ab15e875-2913-4eec-9b0f-c69b1847b96b
Dec 06 10:16:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:23.041 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:23 np0005548788.localdomain podman[314493]: 2025-12-06 10:16:23.132890943 +0000 UTC m=+0.067107943 container kill 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:16:23 np0005548788.localdomain dnsmasq[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/addn_hosts - 2 addresses
Dec 06 10:16:23 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/host
Dec 06 10:16:23 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/opts
Dec 06 10:16:23 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:23.337 262572 INFO neutron.agent.dhcp.agent [None req-507bfc49-a301-40bf-afd3-174ca72fc843 - - - - - -] DHCP configuration for ports {'24e5e254-d0e4-4f85-8bf8-e66fa08c823b'} is completed
Dec 06 10:16:24 np0005548788.localdomain ceph-mon[293643]: pgmap v164: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:24 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:24.390 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:24.773 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:25 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:25.247 2 INFO neutron.agent.securitygroups_rpc [None req-2f0fe649-a0ce-475a-a444-c6db3fc27153 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:25 np0005548788.localdomain dnsmasq[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/addn_hosts - 1 addresses
Dec 06 10:16:25 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/host
Dec 06 10:16:25 np0005548788.localdomain podman[314531]: 2025-12-06 10:16:25.543269345 +0000 UTC m=+0.083855988 container kill 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:16:25 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/opts
Dec 06 10:16:26 np0005548788.localdomain dnsmasq[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/addn_hosts - 0 addresses
Dec 06 10:16:26 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/host
Dec 06 10:16:26 np0005548788.localdomain dnsmasq-dhcp[314340]: read /var/lib/neutron/dhcp/ab15e875-2913-4eec-9b0f-c69b1847b96b/opts
Dec 06 10:16:26 np0005548788.localdomain podman[314569]: 2025-12-06 10:16:26.109348841 +0000 UTC m=+0.059073526 container kill 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:16:26 np0005548788.localdomain ceph-mon[293643]: pgmap v165: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:26 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:26Z|00090|binding|INFO|Releasing lport 9e965e19-7a0c-4962-8985-a11388c56b05 from this chassis (sb_readonly=0)
Dec 06 10:16:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:26.313 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:26 np0005548788.localdomain kernel: device tap9e965e19-7a left promiscuous mode
Dec 06 10:16:26 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:26Z|00091|binding|INFO|Setting lport 9e965e19-7a0c-4962-8985-a11388c56b05 down in Southbound
Dec 06 10:16:26 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:26.324 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-ab15e875-2913-4eec-9b0f-c69b1847b96b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab15e875-2913-4eec-9b0f-c69b1847b96b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a22ced63e346459ab637424ae7833af7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1d1d55b-61d9-4e60-9a48-af1eaa336718, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=9e965e19-7a0c-4962-8985-a11388c56b05) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:26 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:26.326 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 9e965e19-7a0c-4962-8985-a11388c56b05 in datapath ab15e875-2913-4eec-9b0f-c69b1847b96b unbound from our chassis
Dec 06 10:16:26 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:26.328 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ab15e875-2913-4eec-9b0f-c69b1847b96b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:16:26 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:26.329 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[1c445573-06f5-4e41-be71-a43979289250]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:26.337 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:27 np0005548788.localdomain systemd[1]: tmp-crun.9NlYqr.mount: Deactivated successfully.
Dec 06 10:16:27 np0005548788.localdomain dnsmasq[314340]: exiting on receipt of SIGTERM
Dec 06 10:16:27 np0005548788.localdomain podman[314608]: 2025-12-06 10:16:27.924324037 +0000 UTC m=+0.076457981 container kill 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:16:27 np0005548788.localdomain systemd[1]: libpod-767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585.scope: Deactivated successfully.
Dec 06 10:16:27 np0005548788.localdomain podman[314621]: 2025-12-06 10:16:27.985565639 +0000 UTC m=+0.047079558 container died 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:28 np0005548788.localdomain podman[314621]: 2025-12-06 10:16:28.010419213 +0000 UTC m=+0.071933122 container cleanup 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:16:28 np0005548788.localdomain systemd[1]: libpod-conmon-767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585.scope: Deactivated successfully.
Dec 06 10:16:28 np0005548788.localdomain ceph-mon[293643]: pgmap v166: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:28 np0005548788.localdomain podman[314623]: 2025-12-06 10:16:28.092100243 +0000 UTC m=+0.149837557 container remove 767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab15e875-2913-4eec-9b0f-c69b1847b96b, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:28.093 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:28 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:28.119 262572 INFO neutron.agent.dhcp.agent [None req-a5ac3df4-ffc3-47e2-97e9-bbfde7bd93a5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:28 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:28.147 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:28.434 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ef16a981332c5405a0296fd90089109d52bd1f32e045484279b81ab5decd3917-merged.mount: Deactivated successfully.
Dec 06 10:16:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-767ee42377d71aedffde6ca7f7c97fe8f791d40737ebcc5974cd5b7ff8645585-userdata-shm.mount: Deactivated successfully.
Dec 06 10:16:28 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2dab15e875\x2d2913\x2d4eec\x2d9b0f\x2dc69b1847b96b.mount: Deactivated successfully.
Dec 06 10:16:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:29.311 2 INFO neutron.agent.securitygroups_rpc [None req-64ece17b-51fa-4f7d-ac9f-f7ae51f6ef1a 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:29.777 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:29.841 2 INFO neutron.agent.securitygroups_rpc [None req-e4d175d7-f151-45a2-bfa9-dd114b2ac98c 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:30 np0005548788.localdomain ceph-mon[293643]: pgmap v167: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:16:31 np0005548788.localdomain podman[314649]: 2025-12-06 10:16:31.253838735 +0000 UTC m=+0.079781123 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:16:31 np0005548788.localdomain podman[314649]: 2025-12-06 10:16:31.352432025 +0000 UTC m=+0.178374393 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:31 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:16:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:32 np0005548788.localdomain ceph-mon[293643]: pgmap v168: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:32 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2582979284' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:33.096 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:33 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/4066488691' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:33 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:33.201 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:34 np0005548788.localdomain ceph-mon[293643]: pgmap v169: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:34.780 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:35.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:35.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:16:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:16:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:16:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:16:35 np0005548788.localdomain podman[314674]: 2025-12-06 10:16:35.382045257 +0000 UTC m=+0.203036400 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm)
Dec 06 10:16:35 np0005548788.localdomain podman[314675]: 2025-12-06 10:16:35.428192816 +0000 UTC m=+0.243618808 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:16:35 np0005548788.localdomain podman[314675]: 2025-12-06 10:16:35.437996537 +0000 UTC m=+0.253422489 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:16:35 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:16:35 np0005548788.localdomain podman[314676]: 2025-12-06 10:16:35.533038288 +0000 UTC m=+0.346768878 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Dec 06 10:16:35 np0005548788.localdomain podman[314674]: 2025-12-06 10:16:35.547536603 +0000 UTC m=+0.368527766 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec 06 10:16:35 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:16:35 np0005548788.localdomain podman[314676]: 2025-12-06 10:16:35.600165171 +0000 UTC m=+0.413895821 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:16:35 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:16:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:36.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:36 np0005548788.localdomain ceph-mon[293643]: pgmap v170: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:37.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:37.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:38 np0005548788.localdomain ceph-mon[293643]: pgmap v171: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:38.099 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:16:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:16:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:16:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:16:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:16:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:16:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:16:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:16:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1196607636' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:16:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1196607636' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:16:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:39.783 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:40 np0005548788.localdomain ceph-mon[293643]: pgmap v172: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.031 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.032 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.061 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.062 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.063 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.063 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.064 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:16:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2751703348' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:16:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1359574343' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.520 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:16:41 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:41.741 262572 INFO neutron.agent.linux.ip_lib [None req-4f1ec005-363a-4da4-9d60-d56207778068 - - - - - -] Device tapc8de0746-01 cannot be used as it has no MAC address
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.770 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:41 np0005548788.localdomain kernel: device tapc8de0746-01 entered promiscuous mode
Dec 06 10:16:41 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016201.7853] manager: (tapc8de0746-01): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.786 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:41Z|00092|binding|INFO|Claiming lport c8de0746-01bd-4a54-bcab-62064dbd640a for this chassis.
Dec 06 10:16:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:41Z|00093|binding|INFO|c8de0746-01bd-4a54-bcab-62064dbd640a: Claiming unknown
Dec 06 10:16:41 np0005548788.localdomain systemd-udevd[314769]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:16:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:41.796 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-c419a468-c335-4923-8d17-63d564d8a340', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c419a468-c335-4923-8d17-63d564d8a340', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cee3e0c1575f4b46bd60ec5b2e858b9d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f1034ba-84b2-40f7-8d01-6a7cd3ab432d, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=c8de0746-01bd-4a54-bcab-62064dbd640a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:41.798 159620 INFO neutron.agent.ovn.metadata.agent [-] Port c8de0746-01bd-4a54-bcab-62064dbd640a in datapath c419a468-c335-4923-8d17-63d564d8a340 bound to our chassis
Dec 06 10:16:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:41.799 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c419a468-c335-4923-8d17-63d564d8a340 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:16:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:41.800 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[1293e290-f16e-4fa0-9bdc-8900f3b57334]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.807 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.809 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11581MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.809 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.810 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:41 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc8de0746-01: No such device
Dec 06 10:16:41 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc8de0746-01: No such device
Dec 06 10:16:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:41Z|00094|binding|INFO|Setting lport c8de0746-01bd-4a54-bcab-62064dbd640a ovn-installed in OVS
Dec 06 10:16:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:41Z|00095|binding|INFO|Setting lport c8de0746-01bd-4a54-bcab-62064dbd640a up in Southbound
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.823 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:41 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc8de0746-01: No such device
Dec 06 10:16:41 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc8de0746-01: No such device
Dec 06 10:16:41 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc8de0746-01: No such device
Dec 06 10:16:41 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc8de0746-01: No such device
Dec 06 10:16:41 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc8de0746-01: No such device
Dec 06 10:16:41 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc8de0746-01: No such device
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.865 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:41.895 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:42.089 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:16:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:42.089 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:16:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:42.128 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:16:42 np0005548788.localdomain ceph-mon[293643]: pgmap v173: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1359574343' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2821502390' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:16:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2914913536' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:42.618 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:16:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:42.623 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:16:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:42.645 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:16:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:42.690 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:16:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:42.690 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:42 np0005548788.localdomain podman[314862]: 
Dec 06 10:16:42 np0005548788.localdomain podman[314862]: 2025-12-06 10:16:42.802725722 +0000 UTC m=+0.089734379 container create 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:16:42 np0005548788.localdomain systemd[1]: Started libpod-conmon-14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986.scope.
Dec 06 10:16:42 np0005548788.localdomain podman[314862]: 2025-12-06 10:16:42.758278896 +0000 UTC m=+0.045287593 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:16:42 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:16:42 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59ddd2c4d495b372ae7c944c6d86eea0d62ca3467e8e1b4e0f711c9029bd93c4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:16:42 np0005548788.localdomain podman[314862]: 2025-12-06 10:16:42.890720926 +0000 UTC m=+0.177729573 container init 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:42 np0005548788.localdomain podman[314862]: 2025-12-06 10:16:42.906280534 +0000 UTC m=+0.193289201 container start 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:16:42 np0005548788.localdomain dnsmasq[314880]: started, version 2.85 cachesize 150
Dec 06 10:16:42 np0005548788.localdomain dnsmasq[314880]: DNS service limited to local subnets
Dec 06 10:16:42 np0005548788.localdomain dnsmasq[314880]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:16:42 np0005548788.localdomain dnsmasq[314880]: warning: no upstream servers configured
Dec 06 10:16:42 np0005548788.localdomain dnsmasq-dhcp[314880]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:16:42 np0005548788.localdomain dnsmasq[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/addn_hosts - 0 addresses
Dec 06 10:16:42 np0005548788.localdomain dnsmasq-dhcp[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/host
Dec 06 10:16:42 np0005548788.localdomain dnsmasq-dhcp[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/opts
Dec 06 10:16:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:43.100 262572 INFO neutron.agent.dhcp.agent [None req-6eda20f4-1bc9-45a9-9667-1a8d6c7628d4 - - - - - -] DHCP configuration for ports {'3e66b580-e255-4b88-88cc-4f9599773ade'} is completed
Dec 06 10:16:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:43.140 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2914913536' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:43.665 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:43.691 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:43 np0005548788.localdomain systemd[1]: tmp-crun.AHfWNI.mount: Deactivated successfully.
Dec 06 10:16:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:44.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:44 np0005548788.localdomain ceph-mon[293643]: pgmap v174: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:44.813 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:46 np0005548788.localdomain ceph-mon[293643]: pgmap v175: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:16:47 np0005548788.localdomain podman[314881]: 2025-12-06 10:16:47.252217598 +0000 UTC m=+0.072682184 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:47 np0005548788.localdomain podman[314881]: 2025-12-06 10:16:47.266437835 +0000 UTC m=+0.086902421 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:16:47 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:16:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:47.439 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:47.440 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:47.440 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:48 np0005548788.localdomain ceph-mon[293643]: pgmap v176: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:48.143 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:16:48 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:48.804 262572 INFO neutron.agent.linux.ip_lib [None req-a0ece95b-c0b7-48b6-b073-ee95bcdab26e - - - - - -] Device tapa7156b54-6e cannot be used as it has no MAC address
Dec 06 10:16:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:48.828 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:48 np0005548788.localdomain kernel: device tapa7156b54-6e entered promiscuous mode
Dec 06 10:16:48 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016208.8367] manager: (tapa7156b54-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/25)
Dec 06 10:16:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:48.838 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:48 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:48Z|00096|binding|INFO|Claiming lport a7156b54-6eeb-4afa-ad2a-9d4156b5f947 for this chassis.
Dec 06 10:16:48 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:48Z|00097|binding|INFO|a7156b54-6eeb-4afa-ad2a-9d4156b5f947: Claiming unknown
Dec 06 10:16:48 np0005548788.localdomain podman[314903]: 2025-12-06 10:16:48.84643017 +0000 UTC m=+0.102277614 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:16:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:48.844 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-45d80fd5-0998-4ce5-b225-353084cecb5e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45d80fd5-0998-4ce5-b225-353084cecb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cee3e0c1575f4b46bd60ec5b2e858b9d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c265b287-7d35-4e3f-977d-e3eb783c7748, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=a7156b54-6eeb-4afa-ad2a-9d4156b5f947) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:48.846 159620 INFO neutron.agent.ovn.metadata.agent [-] Port a7156b54-6eeb-4afa-ad2a-9d4156b5f947 in datapath 45d80fd5-0998-4ce5-b225-353084cecb5e bound to our chassis
Dec 06 10:16:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:48.847 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 45d80fd5-0998-4ce5-b225-353084cecb5e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:16:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:48.847 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a9924462-977e-4d61-8dd8-162a0c276473]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:48 np0005548788.localdomain systemd-udevd[314934]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:16:48 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:48Z|00098|binding|INFO|Setting lport a7156b54-6eeb-4afa-ad2a-9d4156b5f947 ovn-installed in OVS
Dec 06 10:16:48 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:48Z|00099|binding|INFO|Setting lport a7156b54-6eeb-4afa-ad2a-9d4156b5f947 up in Southbound
Dec 06 10:16:48 np0005548788.localdomain podman[314903]: 2025-12-06 10:16:48.875992768 +0000 UTC m=+0.131840192 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:16:48 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa7156b54-6e: No such device
Dec 06 10:16:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:48.876 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:48 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa7156b54-6e: No such device
Dec 06 10:16:48 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa7156b54-6e: No such device
Dec 06 10:16:48 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa7156b54-6e: No such device
Dec 06 10:16:48 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:16:48 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa7156b54-6e: No such device
Dec 06 10:16:48 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa7156b54-6e: No such device
Dec 06 10:16:48 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa7156b54-6e: No such device
Dec 06 10:16:48 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa7156b54-6e: No such device
Dec 06 10:16:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:48.913 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:48.944 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:16:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:16:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:16:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:16:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:16:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19209 "" "Go-http-client/1.1"
Dec 06 10:16:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:49.857 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:49 np0005548788.localdomain podman[315005]: 
Dec 06 10:16:49 np0005548788.localdomain podman[315005]: 2025-12-06 10:16:49.899061378 +0000 UTC m=+0.092708950 container create 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:16:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:16:49 np0005548788.localdomain systemd[1]: Started libpod-conmon-781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f.scope.
Dec 06 10:16:49 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:16:49 np0005548788.localdomain podman[315005]: 2025-12-06 10:16:49.851060963 +0000 UTC m=+0.044708565 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:16:49 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/796020a53917f7ae11644e03288af3994d36ef58fa672eac1ecd34f09d4171d8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:16:49 np0005548788.localdomain podman[315005]: 2025-12-06 10:16:49.962165148 +0000 UTC m=+0.155812720 container init 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:16:49 np0005548788.localdomain podman[315005]: 2025-12-06 10:16:49.971116562 +0000 UTC m=+0.164764134 container start 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:16:49 np0005548788.localdomain dnsmasq[315032]: started, version 2.85 cachesize 150
Dec 06 10:16:49 np0005548788.localdomain dnsmasq[315032]: DNS service limited to local subnets
Dec 06 10:16:49 np0005548788.localdomain dnsmasq[315032]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:16:49 np0005548788.localdomain dnsmasq[315032]: warning: no upstream servers configured
Dec 06 10:16:49 np0005548788.localdomain dnsmasq-dhcp[315032]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:16:49 np0005548788.localdomain dnsmasq[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/addn_hosts - 0 addresses
Dec 06 10:16:49 np0005548788.localdomain dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/host
Dec 06 10:16:49 np0005548788.localdomain dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/opts
Dec 06 10:16:50 np0005548788.localdomain podman[315019]: 2025-12-06 10:16:50.020878472 +0000 UTC m=+0.085423677 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:16:50 np0005548788.localdomain podman[315019]: 2025-12-06 10:16:50.053545395 +0000 UTC m=+0.118090590 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:16:50 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:16:50 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:50.104 262572 INFO neutron.agent.dhcp.agent [None req-56299ce2-9df2-4bfd-a93b-742bf582eb81 - - - - - -] DHCP configuration for ports {'4de9a399-f614-455f-a44b-678028492c3f'} is completed
Dec 06 10:16:50 np0005548788.localdomain ceph-mon[293643]: pgmap v177: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:51 np0005548788.localdomain sshd[315040]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:16:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e115 do_prune osdmap full prune enabled
Dec 06 10:16:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e116 e116: 6 total, 6 up, 6 in
Dec 06 10:16:52 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e116: 6 total, 6 up, 6 in
Dec 06 10:16:52 np0005548788.localdomain ceph-mon[293643]: pgmap v178: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:53.146 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:53 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:53.166 2 INFO neutron.agent.securitygroups_rpc [None req-26ae0ef5-9433-41c4-a064-a0d5d3110043 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e116 do_prune osdmap full prune enabled
Dec 06 10:16:53 np0005548788.localdomain ceph-mon[293643]: osdmap e116: 6 total, 6 up, 6 in
Dec 06 10:16:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e117 e117: 6 total, 6 up, 6 in
Dec 06 10:16:53 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e117: 6 total, 6 up, 6 in
Dec 06 10:16:53 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:53.765 262572 INFO neutron.agent.linux.ip_lib [None req-4d7f38c7-37f3-4e06-9932-c98205b190bc - - - - - -] Device tapaf61bab4-fb cannot be used as it has no MAC address
Dec 06 10:16:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:53.849 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:53 np0005548788.localdomain kernel: device tapaf61bab4-fb entered promiscuous mode
Dec 06 10:16:53 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016213.8619] manager: (tapaf61bab4-fb): new Generic device (/org/freedesktop/NetworkManager/Devices/26)
Dec 06 10:16:53 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:53Z|00100|binding|INFO|Claiming lport af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88 for this chassis.
Dec 06 10:16:53 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:53Z|00101|binding|INFO|af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88: Claiming unknown
Dec 06 10:16:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:53.864 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:53 np0005548788.localdomain systemd-udevd[315052]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:16:53 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf61bab4-fb: No such device
Dec 06 10:16:53 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf61bab4-fb: No such device
Dec 06 10:16:53 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:53Z|00102|binding|INFO|Setting lport af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88 ovn-installed in OVS
Dec 06 10:16:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:53.902 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:53.904 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:53 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf61bab4-fb: No such device
Dec 06 10:16:53 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf61bab4-fb: No such device
Dec 06 10:16:53 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf61bab4-fb: No such device
Dec 06 10:16:53 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf61bab4-fb: No such device
Dec 06 10:16:53 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf61bab4-fb: No such device
Dec 06 10:16:53 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf61bab4-fb: No such device
Dec 06 10:16:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:53.945 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:53.971 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e117 do_prune osdmap full prune enabled
Dec 06 10:16:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e118 e118: 6 total, 6 up, 6 in
Dec 06 10:16:54 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e118: 6 total, 6 up, 6 in
Dec 06 10:16:54 np0005548788.localdomain ceph-mon[293643]: pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 4.3 KiB/s wr, 35 op/s
Dec 06 10:16:54 np0005548788.localdomain ceph-mon[293643]: osdmap e117: 6 total, 6 up, 6 in
Dec 06 10:16:54 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:16:54Z|00103|binding|INFO|Setting lport af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88 up in Southbound
Dec 06 10:16:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:54.459 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-513cd954-584d-4b79-a078-541aa9ebb6a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-513cd954-584d-4b79-a078-541aa9ebb6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd6a02136413f4ad3ac51d2c4ffdad3d4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc8db9ff-0233-4f4c-859c-fa0093e57d19, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:54.461 159620 INFO neutron.agent.ovn.metadata.agent [-] Port af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88 in datapath 513cd954-584d-4b79-a078-541aa9ebb6a4 bound to our chassis
Dec 06 10:16:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:54.463 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 513cd954-584d-4b79-a078-541aa9ebb6a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:16:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:16:54.464 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[8bab1aab-e9e5-4d50-8e50-dfc5e278bf9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:54.717 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:53Z, description=, device_id=08567390-a1ad-4d3f-b970-4834bfb7fa3e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6856af0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6864580>], id=0eb91bd8-ba3d-4eb5-99fc-557e7d8d8d53, ip_allocation=immediate, mac_address=fa:16:3e:45:c2:6d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:43Z, description=, dns_domain=, id=45d80fd5-0998-4ce5-b225-353084cecb5e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1069567051, port_security_enabled=True, project_id=cee3e0c1575f4b46bd60ec5b2e858b9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7754, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1144, status=ACTIVE, subnets=['f0c1abad-b6ee-4f61-ba77-39e16363722c'], tags=[], tenant_id=cee3e0c1575f4b46bd60ec5b2e858b9d, updated_at=2025-12-06T10:16:46Z, vlan_transparent=None, network_id=45d80fd5-0998-4ce5-b225-353084cecb5e, port_security_enabled=False, project_id=cee3e0c1575f4b46bd60ec5b2e858b9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1222, status=DOWN, tags=[], tenant_id=cee3e0c1575f4b46bd60ec5b2e858b9d, updated_at=2025-12-06T10:16:53Z on network 45d80fd5-0998-4ce5-b225-353084cecb5e
Dec 06 10:16:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:54.859 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:54 np0005548788.localdomain dnsmasq[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/addn_hosts - 1 addresses
Dec 06 10:16:54 np0005548788.localdomain dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/host
Dec 06 10:16:54 np0005548788.localdomain dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/opts
Dec 06 10:16:54 np0005548788.localdomain podman[315116]: 2025-12-06 10:16:54.946507189 +0000 UTC m=+0.057910721 container kill 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:16:55 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:55.064 2 INFO neutron.agent.securitygroups_rpc [None req-6596b1da-4291-462f-a9bc-899ad3053051 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:55 np0005548788.localdomain podman[315154]: 
Dec 06 10:16:55 np0005548788.localdomain podman[315154]: 2025-12-06 10:16:55.106079553 +0000 UTC m=+0.104482892 container create 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:16:55 np0005548788.localdomain podman[315154]: 2025-12-06 10:16:55.047167732 +0000 UTC m=+0.045571131 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:16:55 np0005548788.localdomain systemd[1]: Started libpod-conmon-1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495.scope.
Dec 06 10:16:55 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:16:55 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eada68c6d22aaba3e1d24a6c568d378ea4e48dfefbd5678bd95bb7752e4e4d23/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:16:55 np0005548788.localdomain sudo[315173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:16:55 np0005548788.localdomain sudo[315173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:16:55 np0005548788.localdomain sudo[315173]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:55 np0005548788.localdomain podman[315154]: 2025-12-06 10:16:55.176903179 +0000 UTC m=+0.175306488 container init 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:55 np0005548788.localdomain podman[315154]: 2025-12-06 10:16:55.1844161 +0000 UTC m=+0.182819449 container start 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:55 np0005548788.localdomain dnsmasq[315197]: started, version 2.85 cachesize 150
Dec 06 10:16:55 np0005548788.localdomain dnsmasq[315197]: DNS service limited to local subnets
Dec 06 10:16:55 np0005548788.localdomain dnsmasq[315197]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:16:55 np0005548788.localdomain dnsmasq[315197]: warning: no upstream servers configured
Dec 06 10:16:55 np0005548788.localdomain dnsmasq-dhcp[315197]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:16:55 np0005548788.localdomain dnsmasq[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/addn_hosts - 0 addresses
Dec 06 10:16:55 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/host
Dec 06 10:16:55 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/opts
Dec 06 10:16:55 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:55.204 262572 INFO neutron.agent.dhcp.agent [None req-730910c3-54ed-467c-90be-3b3c00a4d239 - - - - - -] DHCP configuration for ports {'0eb91bd8-ba3d-4eb5-99fc-557e7d8d8d53'} is completed
Dec 06 10:16:55 np0005548788.localdomain sudo[315198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:16:55 np0005548788.localdomain sudo[315198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:16:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e118 do_prune osdmap full prune enabled
Dec 06 10:16:55 np0005548788.localdomain ceph-mon[293643]: osdmap e118: 6 total, 6 up, 6 in
Dec 06 10:16:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e119 e119: 6 total, 6 up, 6 in
Dec 06 10:16:55 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e119: 6 total, 6 up, 6 in
Dec 06 10:16:55 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:55.491 262572 INFO neutron.agent.dhcp.agent [None req-da59398b-479f-4358-b286-5508ed56942b - - - - - -] DHCP configuration for ports {'5118e0b3-37e4-4e73-8963-03cb6151fc03'} is completed
Dec 06 10:16:55 np0005548788.localdomain sshd[315040]: Received disconnect from 45.78.194.186 port 46724:11: Bye Bye [preauth]
Dec 06 10:16:55 np0005548788.localdomain sshd[315040]: Disconnected from authenticating user root 45.78.194.186 port 46724 [preauth]
Dec 06 10:16:55 np0005548788.localdomain sudo[315198]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:16:56 np0005548788.localdomain sudo[315247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:16:56 np0005548788.localdomain sudo[315247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:16:56 np0005548788.localdomain sudo[315247]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e119 do_prune osdmap full prune enabled
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e120 e120: 6 total, 6 up, 6 in
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e120: 6 total, 6 up, 6 in
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: pgmap v183: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 7.2 KiB/s wr, 59 op/s
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: osdmap e119: 6 total, 6 up, 6 in
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:16:56 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:16:56 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:56.534 2 INFO neutron.agent.securitygroups_rpc [None req-c7b28b51-59d5-4f0a-ad7d-a932cd0ad09d 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:16:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:16:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:57 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:57.214 2 INFO neutron.agent.securitygroups_rpc [None req-dca415e6-2c01-4081-a144-3151bae67c51 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:57 np0005548788.localdomain ceph-mon[293643]: osdmap e120: 6 total, 6 up, 6 in
Dec 06 10:16:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:16:58 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:58.039 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:53Z, description=, device_id=08567390-a1ad-4d3f-b970-4834bfb7fa3e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6856370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68564c0>], id=0eb91bd8-ba3d-4eb5-99fc-557e7d8d8d53, ip_allocation=immediate, mac_address=fa:16:3e:45:c2:6d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:43Z, description=, dns_domain=, id=45d80fd5-0998-4ce5-b225-353084cecb5e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1069567051, port_security_enabled=True, project_id=cee3e0c1575f4b46bd60ec5b2e858b9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7754, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1144, status=ACTIVE, subnets=['f0c1abad-b6ee-4f61-ba77-39e16363722c'], tags=[], tenant_id=cee3e0c1575f4b46bd60ec5b2e858b9d, updated_at=2025-12-06T10:16:46Z, vlan_transparent=None, network_id=45d80fd5-0998-4ce5-b225-353084cecb5e, port_security_enabled=False, project_id=cee3e0c1575f4b46bd60ec5b2e858b9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1222, status=DOWN, tags=[], tenant_id=cee3e0c1575f4b46bd60ec5b2e858b9d, updated_at=2025-12-06T10:16:53Z on network 45d80fd5-0998-4ce5-b225-353084cecb5e
Dec 06 10:16:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e120 do_prune osdmap full prune enabled
Dec 06 10:16:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e121 e121: 6 total, 6 up, 6 in
Dec 06 10:16:58 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e121: 6 total, 6 up, 6 in
Dec 06 10:16:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:58.185 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:58 np0005548788.localdomain dnsmasq[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/addn_hosts - 1 addresses
Dec 06 10:16:58 np0005548788.localdomain dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/host
Dec 06 10:16:58 np0005548788.localdomain podman[315282]: 2025-12-06 10:16:58.29762589 +0000 UTC m=+0.062124610 container kill 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:16:58 np0005548788.localdomain dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/opts
Dec 06 10:16:58 np0005548788.localdomain ceph-mon[293643]: pgmap v186: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:58 np0005548788.localdomain ceph-mon[293643]: osdmap e121: 6 total, 6 up, 6 in
Dec 06 10:16:58 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:58.382 2 INFO neutron.agent.securitygroups_rpc [None req-97e21b62-44c1-4fc5-958c-bcc0268c52d3 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:58 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:16:58.563 262572 INFO neutron.agent.dhcp.agent [None req-0dd9dfe5-d865-4323-b62b-17a10e5af1d5 - - - - - -] DHCP configuration for ports {'0eb91bd8-ba3d-4eb5-99fc-557e7d8d8d53'} is completed
Dec 06 10:16:59 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:16:59.204 2 INFO neutron.agent.securitygroups_rpc [None req-a68c2924-ed4d-4682-a596-626401a139a3 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e121 do_prune osdmap full prune enabled
Dec 06 10:16:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e122 e122: 6 total, 6 up, 6 in
Dec 06 10:16:59 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e122: 6 total, 6 up, 6 in
Dec 06 10:16:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:16:59.862 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:00 np0005548788.localdomain ceph-mon[293643]: pgmap v188: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 109 KiB/s rd, 15 KiB/s wr, 153 op/s
Dec 06 10:17:00 np0005548788.localdomain ceph-mon[293643]: osdmap e122: 6 total, 6 up, 6 in
Dec 06 10:17:00 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:00.352 2 INFO neutron.agent.securitygroups_rpc [None req-1e316261-d213-40cc-b644-592e2d6242e7 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:01 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:01.032 2 INFO neutron.agent.securitygroups_rpc [None req-af1607ac-cf21-43e9-9dc2-41d6c40546b7 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:01 np0005548788.localdomain dnsmasq[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/addn_hosts - 0 addresses
Dec 06 10:17:01 np0005548788.localdomain dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/host
Dec 06 10:17:01 np0005548788.localdomain dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/45d80fd5-0998-4ce5-b225-353084cecb5e/opts
Dec 06 10:17:01 np0005548788.localdomain podman[315318]: 2025-12-06 10:17:01.066346794 +0000 UTC m=+0.066645789 container kill 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:17:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:01.283 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:01 np0005548788.localdomain kernel: device tapa7156b54-6e left promiscuous mode
Dec 06 10:17:01 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:01Z|00104|binding|INFO|Releasing lport a7156b54-6eeb-4afa-ad2a-9d4156b5f947 from this chassis (sb_readonly=0)
Dec 06 10:17:01 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:01Z|00105|binding|INFO|Setting lport a7156b54-6eeb-4afa-ad2a-9d4156b5f947 down in Southbound
Dec 06 10:17:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:01.306 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e122 do_prune osdmap full prune enabled
Dec 06 10:17:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e123 e123: 6 total, 6 up, 6 in
Dec 06 10:17:02 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e123: 6 total, 6 up, 6 in
Dec 06 10:17:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:17:02 np0005548788.localdomain podman[315340]: 2025-12-06 10:17:02.28163396 +0000 UTC m=+0.101991865 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 06 10:17:02 np0005548788.localdomain podman[315340]: 2025-12-06 10:17:02.353690885 +0000 UTC m=+0.174048710 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:17:02 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:17:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:02.930 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-45d80fd5-0998-4ce5-b225-353084cecb5e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45d80fd5-0998-4ce5-b225-353084cecb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cee3e0c1575f4b46bd60ec5b2e858b9d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c265b287-7d35-4e3f-977d-e3eb783c7748, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=a7156b54-6eeb-4afa-ad2a-9d4156b5f947) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:02.933 159620 INFO neutron.agent.ovn.metadata.agent [-] Port a7156b54-6eeb-4afa-ad2a-9d4156b5f947 in datapath 45d80fd5-0998-4ce5-b225-353084cecb5e unbound from our chassis
Dec 06 10:17:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:02.935 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45d80fd5-0998-4ce5-b225-353084cecb5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:17:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:02.937 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[308323bf-1b45-4274-a4f8-48e4c46d9a3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:03 np0005548788.localdomain ceph-mon[293643]: pgmap v190: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 12 KiB/s wr, 128 op/s
Dec 06 10:17:03 np0005548788.localdomain ceph-mon[293643]: osdmap e123: 6 total, 6 up, 6 in
Dec 06 10:17:03 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:03.086 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:02Z, description=, device_id=f3eed2e0-6009-48cb-b29a-fc71e49972a4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c689e820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c689e8b0>], id=da697aca-2f76-4aa0-80d7-c36b79bdf2dc, ip_allocation=immediate, mac_address=fa:16:3e:ba:89:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:49Z, description=, dns_domain=, id=513cd954-584d-4b79-a078-541aa9ebb6a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-1149359120, port_security_enabled=True, project_id=d6a02136413f4ad3ac51d2c4ffdad3d4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=24722, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1184, status=ACTIVE, subnets=['ee6f9bf4-d755-41b9-a986-580fbee011ca'], tags=[], tenant_id=d6a02136413f4ad3ac51d2c4ffdad3d4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=513cd954-584d-4b79-a078-541aa9ebb6a4, port_security_enabled=False, project_id=d6a02136413f4ad3ac51d2c4ffdad3d4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1248, status=DOWN, tags=[], tenant_id=d6a02136413f4ad3ac51d2c4ffdad3d4, updated_at=2025-12-06T10:17:02Z on network 513cd954-584d-4b79-a078-541aa9ebb6a4
Dec 06 10:17:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:03.187 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:03 np0005548788.localdomain podman[315382]: 2025-12-06 10:17:03.338084715 +0000 UTC m=+0.071120306 container kill 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:03 np0005548788.localdomain dnsmasq[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/addn_hosts - 1 addresses
Dec 06 10:17:03 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/host
Dec 06 10:17:03 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/opts
Dec 06 10:17:03 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:03.533 262572 INFO neutron.agent.dhcp.agent [None req-f2422c54-4b31-45b5-9d2d-f58d65853ccc - - - - - -] DHCP configuration for ports {'da697aca-2f76-4aa0-80d7-c36b79bdf2dc'} is completed
Dec 06 10:17:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e123 do_prune osdmap full prune enabled
Dec 06 10:17:04 np0005548788.localdomain ceph-mon[293643]: pgmap v192: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 141 KiB/s rd, 18 KiB/s wr, 198 op/s
Dec 06 10:17:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e124 e124: 6 total, 6 up, 6 in
Dec 06 10:17:04 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e124: 6 total, 6 up, 6 in
Dec 06 10:17:04 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:04.263 2 INFO neutron.agent.securitygroups_rpc [None req-585d973c-6716-4802-939d-774d36a541bf 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:04.894 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:05 np0005548788.localdomain ceph-mon[293643]: osdmap e124: 6 total, 6 up, 6 in
Dec 06 10:17:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:05.582 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:02Z, description=, device_id=f3eed2e0-6009-48cb-b29a-fc71e49972a4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68382e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68f8a60>], id=da697aca-2f76-4aa0-80d7-c36b79bdf2dc, ip_allocation=immediate, mac_address=fa:16:3e:ba:89:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:49Z, description=, dns_domain=, id=513cd954-584d-4b79-a078-541aa9ebb6a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-1149359120, port_security_enabled=True, project_id=d6a02136413f4ad3ac51d2c4ffdad3d4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=24722, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1184, status=ACTIVE, subnets=['ee6f9bf4-d755-41b9-a986-580fbee011ca'], tags=[], tenant_id=d6a02136413f4ad3ac51d2c4ffdad3d4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=513cd954-584d-4b79-a078-541aa9ebb6a4, port_security_enabled=False, project_id=d6a02136413f4ad3ac51d2c4ffdad3d4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1248, status=DOWN, tags=[], tenant_id=d6a02136413f4ad3ac51d2c4ffdad3d4, updated_at=2025-12-06T10:17:02Z on network 513cd954-584d-4b79-a078-541aa9ebb6a4
Dec 06 10:17:05 np0005548788.localdomain dnsmasq[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/addn_hosts - 1 addresses
Dec 06 10:17:05 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/host
Dec 06 10:17:05 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/opts
Dec 06 10:17:05 np0005548788.localdomain podman[315421]: 2025-12-06 10:17:05.799162486 +0000 UTC m=+0.060458349 container kill 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:17:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:17:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:17:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:17:05 np0005548788.localdomain podman[315436]: 2025-12-06 10:17:05.918332798 +0000 UTC m=+0.091326537 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:05 np0005548788.localdomain podman[315436]: 2025-12-06 10:17:05.929524572 +0000 UTC m=+0.102518271 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:05 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:17:05 np0005548788.localdomain podman[315438]: 2025-12-06 10:17:05.9818511 +0000 UTC m=+0.144796131 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:17:06 np0005548788.localdomain podman[315437]: 2025-12-06 10:17:06.017131285 +0000 UTC m=+0.183391288 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:17:06 np0005548788.localdomain podman[315437]: 2025-12-06 10:17:06.026455461 +0000 UTC m=+0.192715474 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:17:06 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:17:06 np0005548788.localdomain podman[315438]: 2025-12-06 10:17:06.049782367 +0000 UTC m=+0.212727348 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Dec 06 10:17:06 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:17:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e124 do_prune osdmap full prune enabled
Dec 06 10:17:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:06.108 262572 INFO neutron.agent.dhcp.agent [None req-3cf0652b-0b4a-4708-90ca-9a195ef97cbe - - - - - -] DHCP configuration for ports {'da697aca-2f76-4aa0-80d7-c36b79bdf2dc'} is completed
Dec 06 10:17:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e125 e125: 6 total, 6 up, 6 in
Dec 06 10:17:06 np0005548788.localdomain ceph-mon[293643]: pgmap v194: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 6.3 KiB/s wr, 74 op/s
Dec 06 10:17:06 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e125: 6 total, 6 up, 6 in
Dec 06 10:17:06 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:06.957 2 INFO neutron.agent.securitygroups_rpc [None req-0061dfd5-bb12-495d-9673-65d6ef0bbdf6 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:06 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:06.961 2 INFO neutron.agent.securitygroups_rpc [None req-113b5acf-416d-4ae9-b224-76f1b565f762 7dcd2b11aeb4499894c7ac7c29cb6997 d6a02136413f4ad3ac51d2c4ffdad3d4 - - default default] Security group member updated ['58296f43-3702-412f-8387-07510507ed41']
Dec 06 10:17:07 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:07.020 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:06Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c688b6d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c688bc40>], id=14a2949c-3304-44e0-8c66-e2067777cec1, ip_allocation=immediate, mac_address=fa:16:3e:5e:66:65, name=tempest-FloatingIPAdminTestJSON-1722887507, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:49Z, description=, dns_domain=, id=513cd954-584d-4b79-a078-541aa9ebb6a4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-1149359120, port_security_enabled=True, project_id=d6a02136413f4ad3ac51d2c4ffdad3d4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=24722, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1184, status=ACTIVE, subnets=['ee6f9bf4-d755-41b9-a986-580fbee011ca'], tags=[], tenant_id=d6a02136413f4ad3ac51d2c4ffdad3d4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=513cd954-584d-4b79-a078-541aa9ebb6a4, port_security_enabled=True, project_id=d6a02136413f4ad3ac51d2c4ffdad3d4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['58296f43-3702-412f-8387-07510507ed41'], standard_attr_id=1266, status=DOWN, tags=[], tenant_id=d6a02136413f4ad3ac51d2c4ffdad3d4, updated_at=2025-12-06T10:17:06Z on network 513cd954-584d-4b79-a078-541aa9ebb6a4
Dec 06 10:17:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e125 do_prune osdmap full prune enabled
Dec 06 10:17:07 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:07.066 262572 INFO neutron.agent.linux.ip_lib [None req-5518df9b-5363-434d-8520-19adb2d54429 - - - - - -] Device tap6111a240-ad cannot be used as it has no MAC address
Dec 06 10:17:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e126 e126: 6 total, 6 up, 6 in
Dec 06 10:17:07 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e126: 6 total, 6 up, 6 in
Dec 06 10:17:07 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:07.101 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:07 np0005548788.localdomain kernel: device tap6111a240-ad entered promiscuous mode
Dec 06 10:17:07 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:07Z|00106|binding|INFO|Claiming lport 6111a240-ada3-4083-b2fa-6157e4092850 for this chassis.
Dec 06 10:17:07 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:07Z|00107|binding|INFO|6111a240-ada3-4083-b2fa-6157e4092850: Claiming unknown
Dec 06 10:17:07 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016227.1142] manager: (tap6111a240-ad): new Generic device (/org/freedesktop/NetworkManager/Devices/27)
Dec 06 10:17:07 np0005548788.localdomain systemd-udevd[315524]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:07 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:07.127 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-4ecc6abe-ee03-49a8-a171-e9b65a6e9392', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ecc6abe-ee03-49a8-a171-e9b65a6e9392', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cee3e0c1575f4b46bd60ec5b2e858b9d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=101e0aab-e15b-4dde-9e78-abbe69813e58, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=6111a240-ada3-4083-b2fa-6157e4092850) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:07 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:07.132 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 6111a240-ada3-4083-b2fa-6157e4092850 in datapath 4ecc6abe-ee03-49a8-a171-e9b65a6e9392 bound to our chassis
Dec 06 10:17:07 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:07.139 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4ecc6abe-ee03-49a8-a171-e9b65a6e9392 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:07 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:07.141 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[36e67b3d-01b9-410e-90a2-8d7eceaf59b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:07 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap6111a240-ad: No such device
Dec 06 10:17:07 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:07Z|00108|binding|INFO|Setting lport 6111a240-ada3-4083-b2fa-6157e4092850 ovn-installed in OVS
Dec 06 10:17:07 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:07Z|00109|binding|INFO|Setting lport 6111a240-ada3-4083-b2fa-6157e4092850 up in Southbound
Dec 06 10:17:07 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:07.159 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:07 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap6111a240-ad: No such device
Dec 06 10:17:07 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap6111a240-ad: No such device
Dec 06 10:17:07 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap6111a240-ad: No such device
Dec 06 10:17:07 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap6111a240-ad: No such device
Dec 06 10:17:07 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap6111a240-ad: No such device
Dec 06 10:17:07 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap6111a240-ad: No such device
Dec 06 10:17:07 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap6111a240-ad: No such device
Dec 06 10:17:07 np0005548788.localdomain ceph-mon[293643]: osdmap e125: 6 total, 6 up, 6 in
Dec 06 10:17:07 np0005548788.localdomain ceph-mon[293643]: osdmap e126: 6 total, 6 up, 6 in
Dec 06 10:17:07 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:07.206 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:07 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:07.238 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:07 np0005548788.localdomain podman[315557]: 2025-12-06 10:17:07.292144177 +0000 UTC m=+0.067967860 container kill 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:17:07 np0005548788.localdomain dnsmasq[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/addn_hosts - 2 addresses
Dec 06 10:17:07 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/host
Dec 06 10:17:07 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/opts
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:17:07.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:08.190 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e126 do_prune osdmap full prune enabled
Dec 06 10:17:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e127 e127: 6 total, 6 up, 6 in
Dec 06 10:17:08 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:08.215 2 INFO neutron.agent.securitygroups_rpc [None req-7b711491-888b-4783-949a-3ad1e34a6987 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:08 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e127: 6 total, 6 up, 6 in
Dec 06 10:17:08 np0005548788.localdomain ceph-mon[293643]: pgmap v197: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 7.5 KiB/s wr, 88 op/s
Dec 06 10:17:08 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:08.256 262572 INFO neutron.agent.dhcp.agent [None req-f433337e-a69e-4efb-bd74-58da2ad28238 - - - - - -] DHCP configuration for ports {'14a2949c-3304-44e0-8c66-e2067777cec1'} is completed
Dec 06 10:17:08 np0005548788.localdomain podman[315626]: 
Dec 06 10:17:08 np0005548788.localdomain podman[315626]: 2025-12-06 10:17:08.754333121 +0000 UTC m=+0.110729984 container create d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ecc6abe-ee03-49a8-a171-e9b65a6e9392, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:17:08 np0005548788.localdomain systemd[1]: Started libpod-conmon-d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96.scope.
Dec 06 10:17:08 np0005548788.localdomain podman[315626]: 2025-12-06 10:17:08.701493647 +0000 UTC m=+0.057890560 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:08 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:08 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ab8f188603ea922fb92946afafd2c6b4d4c11b1c4e2b36d8649b6f3a50eb383/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:08 np0005548788.localdomain podman[315626]: 2025-12-06 10:17:08.827141198 +0000 UTC m=+0.183538081 container init d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ecc6abe-ee03-49a8-a171-e9b65a6e9392, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:17:08 np0005548788.localdomain podman[315626]: 2025-12-06 10:17:08.836428324 +0000 UTC m=+0.192825227 container start d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ecc6abe-ee03-49a8-a171-e9b65a6e9392, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:17:08 np0005548788.localdomain dnsmasq[315644]: started, version 2.85 cachesize 150
Dec 06 10:17:08 np0005548788.localdomain dnsmasq[315644]: DNS service limited to local subnets
Dec 06 10:17:08 np0005548788.localdomain dnsmasq[315644]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:08 np0005548788.localdomain dnsmasq[315644]: warning: no upstream servers configured
Dec 06 10:17:08 np0005548788.localdomain dnsmasq-dhcp[315644]: DHCP, static leases only on 10.101.0.0, lease time 1d
Dec 06 10:17:08 np0005548788.localdomain dnsmasq[315644]: read /var/lib/neutron/dhcp/4ecc6abe-ee03-49a8-a171-e9b65a6e9392/addn_hosts - 0 addresses
Dec 06 10:17:08 np0005548788.localdomain dnsmasq-dhcp[315644]: read /var/lib/neutron/dhcp/4ecc6abe-ee03-49a8-a171-e9b65a6e9392/host
Dec 06 10:17:08 np0005548788.localdomain dnsmasq-dhcp[315644]: read /var/lib/neutron/dhcp/4ecc6abe-ee03-49a8-a171-e9b65a6e9392/opts
Dec 06 10:17:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:17:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:17:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:17:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:17:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:17:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:17:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:17:08 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:08.862 2 INFO neutron.agent.securitygroups_rpc [None req-5531eff7-1536-47cb-87b6-fc07778b8cfc 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:09 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:09.016 262572 INFO neutron.agent.dhcp.agent [None req-dce522ca-ecbd-495b-b3eb-0344b132730d - - - - - -] DHCP configuration for ports {'815d85b2-02e8-4ef5-b5c8-dfedf1f79514'} is completed
Dec 06 10:17:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e127 do_prune osdmap full prune enabled
Dec 06 10:17:09 np0005548788.localdomain ceph-mon[293643]: osdmap e127: 6 total, 6 up, 6 in
Dec 06 10:17:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e128 e128: 6 total, 6 up, 6 in
Dec 06 10:17:09 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e128: 6 total, 6 up, 6 in
Dec 06 10:17:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:09.897 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e128 do_prune osdmap full prune enabled
Dec 06 10:17:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e129 e129: 6 total, 6 up, 6 in
Dec 06 10:17:10 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e129: 6 total, 6 up, 6 in
Dec 06 10:17:10 np0005548788.localdomain ceph-mon[293643]: pgmap v199: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 188 KiB/s rd, 16 KiB/s wr, 256 op/s
Dec 06 10:17:10 np0005548788.localdomain ceph-mon[293643]: osdmap e128: 6 total, 6 up, 6 in
Dec 06 10:17:10 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:10.467 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:10Z, description=, device_id=08567390-a1ad-4d3f-b970-4834bfb7fa3e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6960610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6960c70>], id=a368a98f-cbf5-4544-ac7a-22004927c804, ip_allocation=immediate, mac_address=fa:16:3e:57:8f:b4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:38Z, description=, dns_domain=, id=c419a468-c335-4923-8d17-63d564d8a340, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-1395142435, port_security_enabled=True, project_id=cee3e0c1575f4b46bd60ec5b2e858b9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42628, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1128, status=ACTIVE, subnets=['73893788-aebc-4ef1-ad43-de254503b77b'], tags=[], tenant_id=cee3e0c1575f4b46bd60ec5b2e858b9d, updated_at=2025-12-06T10:16:40Z, vlan_transparent=None, network_id=c419a468-c335-4923-8d17-63d564d8a340, port_security_enabled=False, project_id=cee3e0c1575f4b46bd60ec5b2e858b9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1274, status=DOWN, tags=[], tenant_id=cee3e0c1575f4b46bd60ec5b2e858b9d, updated_at=2025-12-06T10:17:10Z on network c419a468-c335-4923-8d17-63d564d8a340
Dec 06 10:17:10 np0005548788.localdomain dnsmasq[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/addn_hosts - 1 addresses
Dec 06 10:17:10 np0005548788.localdomain podman[315663]: 2025-12-06 10:17:10.692012807 +0000 UTC m=+0.065220525 container kill 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:17:10 np0005548788.localdomain systemd[1]: tmp-crun.tjmXKa.mount: Deactivated successfully.
Dec 06 10:17:10 np0005548788.localdomain dnsmasq-dhcp[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/host
Dec 06 10:17:10 np0005548788.localdomain dnsmasq-dhcp[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/opts
Dec 06 10:17:10 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:10.973 262572 INFO neutron.agent.dhcp.agent [None req-09b16cf9-69b0-4cae-9ea1-b0c3de16b5eb - - - - - -] DHCP configuration for ports {'a368a98f-cbf5-4544-ac7a-22004927c804'} is completed
Dec 06 10:17:11 np0005548788.localdomain ceph-mon[293643]: osdmap e129: 6 total, 6 up, 6 in
Dec 06 10:17:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:11.614 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:11.615 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:11.616 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:17:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e129 do_prune osdmap full prune enabled
Dec 06 10:17:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e130 e130: 6 total, 6 up, 6 in
Dec 06 10:17:12 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e130: 6 total, 6 up, 6 in
Dec 06 10:17:12 np0005548788.localdomain ceph-mon[293643]: pgmap v202: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 234 KiB/s rd, 20 KiB/s wr, 319 op/s
Dec 06 10:17:12 np0005548788.localdomain ceph-mon[293643]: osdmap e130: 6 total, 6 up, 6 in
Dec 06 10:17:12 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:12.611 2 INFO neutron.agent.securitygroups_rpc [req-0f5bcb52-14d6-4090-84d5-2a6fc264a912 req-f6b94b35-a5a9-45fc-80c3-03af12f9ebaa b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['1e2df8fe-9d93-4483-a509-0caee18c220e']
Dec 06 10:17:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:12.814 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:10Z, description=, device_id=08567390-a1ad-4d3f-b970-4834bfb7fa3e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c684ba60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c684beb0>], id=a368a98f-cbf5-4544-ac7a-22004927c804, ip_allocation=immediate, mac_address=fa:16:3e:57:8f:b4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:38Z, description=, dns_domain=, id=c419a468-c335-4923-8d17-63d564d8a340, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-1395142435, port_security_enabled=True, project_id=cee3e0c1575f4b46bd60ec5b2e858b9d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42628, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1128, status=ACTIVE, subnets=['73893788-aebc-4ef1-ad43-de254503b77b'], tags=[], tenant_id=cee3e0c1575f4b46bd60ec5b2e858b9d, updated_at=2025-12-06T10:16:40Z, vlan_transparent=None, network_id=c419a468-c335-4923-8d17-63d564d8a340, port_security_enabled=False, project_id=cee3e0c1575f4b46bd60ec5b2e858b9d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1274, status=DOWN, tags=[], tenant_id=cee3e0c1575f4b46bd60ec5b2e858b9d, updated_at=2025-12-06T10:17:10Z on network c419a468-c335-4923-8d17-63d564d8a340
Dec 06 10:17:12 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:12.965 2 INFO neutron.agent.securitygroups_rpc [None req-5b35cf9b-2c10-4acb-804d-e7f71d7bfae3 7dcd2b11aeb4499894c7ac7c29cb6997 d6a02136413f4ad3ac51d2c4ffdad3d4 - - default default] Security group member updated ['58296f43-3702-412f-8387-07510507ed41']
Dec 06 10:17:13 np0005548788.localdomain dnsmasq[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/addn_hosts - 1 addresses
Dec 06 10:17:13 np0005548788.localdomain dnsmasq-dhcp[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/host
Dec 06 10:17:13 np0005548788.localdomain podman[315699]: 2025-12-06 10:17:13.050226107 +0000 UTC m=+0.070562919 container kill 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:17:13 np0005548788.localdomain dnsmasq-dhcp[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/opts
Dec 06 10:17:13 np0005548788.localdomain systemd[1]: tmp-crun.k2nEZ2.mount: Deactivated successfully.
Dec 06 10:17:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:13.192 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:13 np0005548788.localdomain dnsmasq[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/addn_hosts - 1 addresses
Dec 06 10:17:13 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/host
Dec 06 10:17:13 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/opts
Dec 06 10:17:13 np0005548788.localdomain podman[315733]: 2025-12-06 10:17:13.235307925 +0000 UTC m=+0.065041520 container kill 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:17:13 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:13.471 262572 INFO neutron.agent.dhcp.agent [None req-fce1c00c-8e97-4249-bf3e-e092fe5e06f9 - - - - - -] DHCP configuration for ports {'a368a98f-cbf5-4544-ac7a-22004927c804'} is completed
Dec 06 10:17:13 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:13.898 2 INFO neutron.agent.securitygroups_rpc [req-01e69ff7-4c57-4a62-a8e2-72eac205e556 req-eb6ec33a-4a21-4246-bab7-a4fceda1903a b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['73772eb3-7feb-4994-9518-58f9e6d5a8ed']
Dec 06 10:17:14 np0005548788.localdomain ceph-mon[293643]: pgmap v204: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 137 KiB/s rd, 10 KiB/s wr, 188 op/s
Dec 06 10:17:14 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:14.677 2 INFO neutron.agent.securitygroups_rpc [None req-591e6c3a-21e9-4cb6-8654-ee5dfe5ee17d f89e0038548e41fa9a8202b7a7e9ade1 49bb78ce003e4bec87707ab7af03ae7e - - default default] Security group rule updated ['7d9717d3-d014-450e-9e8d-c62143b51d32']
Dec 06 10:17:14 np0005548788.localdomain dnsmasq[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/addn_hosts - 0 addresses
Dec 06 10:17:14 np0005548788.localdomain dnsmasq-dhcp[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/host
Dec 06 10:17:14 np0005548788.localdomain dnsmasq-dhcp[314880]: read /var/lib/neutron/dhcp/c419a468-c335-4923-8d17-63d564d8a340/opts
Dec 06 10:17:14 np0005548788.localdomain podman[315786]: 2025-12-06 10:17:14.739333895 +0000 UTC m=+0.063873944 container kill 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:14 np0005548788.localdomain systemd[1]: tmp-crun.27aBhw.mount: Deactivated successfully.
Dec 06 10:17:14 np0005548788.localdomain dnsmasq[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/addn_hosts - 0 addresses
Dec 06 10:17:14 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/host
Dec 06 10:17:14 np0005548788.localdomain podman[315798]: 2025-12-06 10:17:14.809420659 +0000 UTC m=+0.061110799 container kill 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:14 np0005548788.localdomain dnsmasq-dhcp[315197]: read /var/lib/neutron/dhcp/513cd954-584d-4b79-a078-541aa9ebb6a4/opts
Dec 06 10:17:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:14.899 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:14.959 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:14 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:14Z|00110|binding|INFO|Releasing lport c8de0746-01bd-4a54-bcab-62064dbd640a from this chassis (sb_readonly=0)
Dec 06 10:17:14 np0005548788.localdomain kernel: device tapc8de0746-01 left promiscuous mode
Dec 06 10:17:14 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:14Z|00111|binding|INFO|Setting lport c8de0746-01bd-4a54-bcab-62064dbd640a down in Southbound
Dec 06 10:17:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:14.970 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-c419a468-c335-4923-8d17-63d564d8a340', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c419a468-c335-4923-8d17-63d564d8a340', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cee3e0c1575f4b46bd60ec5b2e858b9d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f1034ba-84b2-40f7-8d01-6a7cd3ab432d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=c8de0746-01bd-4a54-bcab-62064dbd640a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:14.972 159620 INFO neutron.agent.ovn.metadata.agent [-] Port c8de0746-01bd-4a54-bcab-62064dbd640a in datapath c419a468-c335-4923-8d17-63d564d8a340 unbound from our chassis
Dec 06 10:17:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:14.974 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c419a468-c335-4923-8d17-63d564d8a340, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:17:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:14.975 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[110c754b-b34f-4837-bc4a-bd58d375c808]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:14.980 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:15.007 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:15Z|00112|binding|INFO|Releasing lport af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88 from this chassis (sb_readonly=0)
Dec 06 10:17:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:15Z|00113|binding|INFO|Setting lport af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88 down in Southbound
Dec 06 10:17:15 np0005548788.localdomain kernel: device tapaf61bab4-fb left promiscuous mode
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.020 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-513cd954-584d-4b79-a078-541aa9ebb6a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-513cd954-584d-4b79-a078-541aa9ebb6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd6a02136413f4ad3ac51d2c4ffdad3d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc8db9ff-0233-4f4c-859c-fa0093e57d19, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.022 159620 INFO neutron.agent.ovn.metadata.agent [-] Port af61bab4-fb90-4cb8-bd4f-bd0f2dc9be88 in datapath 513cd954-584d-4b79-a078-541aa9ebb6a4 unbound from our chassis
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.024 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 513cd954-584d-4b79-a078-541aa9ebb6a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.025 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[2fe364cf-a401-48cc-9d70-8b13460f91b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:15.036 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:15 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:15.530 2 INFO neutron.agent.securitygroups_rpc [req-802ef8ad-2f30-424a-8810-ccf196e89ec8 req-2c9ca4ac-9b05-42f0-9546-b86c6383ded6 b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['80cd7ff3-0b8b-4d61-9358-b2f28d5f4668']
Dec 06 10:17:15 np0005548788.localdomain podman[315846]: 2025-12-06 10:17:15.622526327 +0000 UTC m=+0.062152742 container kill d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ecc6abe-ee03-49a8-a171-e9b65a6e9392, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:15 np0005548788.localdomain dnsmasq[315644]: exiting on receipt of SIGTERM
Dec 06 10:17:15 np0005548788.localdomain systemd[1]: libpod-d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96.scope: Deactivated successfully.
Dec 06 10:17:15 np0005548788.localdomain podman[315859]: 2025-12-06 10:17:15.695433097 +0000 UTC m=+0.058331464 container died d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ecc6abe-ee03-49a8-a171-e9b65a6e9392, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:17:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:15Z|00114|binding|INFO|Removing iface tap6111a240-ad ovn-installed in OVS
Dec 06 10:17:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:15Z|00115|binding|INFO|Removing lport 6111a240-ada3-4083-b2fa-6157e4092850 ovn-installed in OVS
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.712 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port bab68bca-1858-404f-9fb1-c335b219dc39 with type ""
Dec 06 10:17:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:15.713 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.714 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-4ecc6abe-ee03-49a8-a171-e9b65a6e9392', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ecc6abe-ee03-49a8-a171-e9b65a6e9392', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cee3e0c1575f4b46bd60ec5b2e858b9d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=101e0aab-e15b-4dde-9e78-abbe69813e58, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=6111a240-ada3-4083-b2fa-6157e4092850) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.716 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 6111a240-ada3-4083-b2fa-6157e4092850 in datapath 4ecc6abe-ee03-49a8-a171-e9b65a6e9392 unbound from our chassis
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.719 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4ecc6abe-ee03-49a8-a171-e9b65a6e9392, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:17:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:15.722 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:15.721 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[0b74157d-1601-4bc6-b7c9-89a8aca4282b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-7ab8f188603ea922fb92946afafd2c6b4d4c11b1c4e2b36d8649b6f3a50eb383-merged.mount: Deactivated successfully.
Dec 06 10:17:15 np0005548788.localdomain podman[315859]: 2025-12-06 10:17:15.738765588 +0000 UTC m=+0.101663925 container cleanup d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ecc6abe-ee03-49a8-a171-e9b65a6e9392, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:15 np0005548788.localdomain systemd[1]: libpod-conmon-d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96.scope: Deactivated successfully.
Dec 06 10:17:15 np0005548788.localdomain podman[315861]: 2025-12-06 10:17:15.779358166 +0000 UTC m=+0.133108832 container remove d3e6e1d96c267ff3c811955435cb7c0a51ef1048dbb9466a346b44afa435cd96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ecc6abe-ee03-49a8-a171-e9b65a6e9392, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:17:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:15.793 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:15 np0005548788.localdomain kernel: device tap6111a240-ad left promiscuous mode
Dec 06 10:17:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:15.806 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:15 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d4ecc6abe\x2dee03\x2d49a8\x2da171\x2de9b65a6e9392.mount: Deactivated successfully.
Dec 06 10:17:15 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:15.824 262572 INFO neutron.agent.dhcp.agent [None req-aaa61bfc-dc13-4d02-bdd5-1ccd4b304244 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:15 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:15.922 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:16.263 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:16 np0005548788.localdomain ceph-mon[293643]: pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 111 KiB/s rd, 8.3 KiB/s wr, 153 op/s
Dec 06 10:17:16 np0005548788.localdomain dnsmasq[315197]: exiting on receipt of SIGTERM
Dec 06 10:17:16 np0005548788.localdomain podman[315905]: 2025-12-06 10:17:16.671226534 +0000 UTC m=+0.062217433 container kill 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:17:16 np0005548788.localdomain systemd[1]: libpod-1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495.scope: Deactivated successfully.
Dec 06 10:17:16 np0005548788.localdomain podman[315919]: 2025-12-06 10:17:16.740004847 +0000 UTC m=+0.052567996 container died 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:17:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:16 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-eada68c6d22aaba3e1d24a6c568d378ea4e48dfefbd5678bd95bb7752e4e4d23-merged.mount: Deactivated successfully.
Dec 06 10:17:16 np0005548788.localdomain podman[315919]: 2025-12-06 10:17:16.772294469 +0000 UTC m=+0.084857578 container cleanup 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:17:16 np0005548788.localdomain systemd[1]: libpod-conmon-1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495.scope: Deactivated successfully.
Dec 06 10:17:16 np0005548788.localdomain podman[315920]: 2025-12-06 10:17:16.816432146 +0000 UTC m=+0.123911440 container remove 1eee95a3654df13d3be195e9565461c9c566889e3fbe0ba6ab39af3254d80495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-513cd954-584d-4b79-a078-541aa9ebb6a4, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:17:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:16.845 262572 INFO neutron.agent.dhcp.agent [None req-8e7feb09-6e4a-425e-8435-fd05cba18447 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:17.042 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e130 do_prune osdmap full prune enabled
Dec 06 10:17:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e131 e131: 6 total, 6 up, 6 in
Dec 06 10:17:17 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e131: 6 total, 6 up, 6 in
Dec 06 10:17:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:17.225 2 INFO neutron.agent.securitygroups_rpc [req-d0c022c7-5c29-48d9-b4af-ef083b33fa00 req-5f51dca4-f136-4f7f-a521-ca766171afcb b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['48d24f9a-1de0-4ca7-bff4-bdd00474b49e']
Dec 06 10:17:17 np0005548788.localdomain dnsmasq[315032]: exiting on receipt of SIGTERM
Dec 06 10:17:17 np0005548788.localdomain podman[315963]: 2025-12-06 10:17:17.372141383 +0000 UTC m=+0.061148170 container kill 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:17 np0005548788.localdomain systemd[1]: libpod-781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f.scope: Deactivated successfully.
Dec 06 10:17:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:17:17 np0005548788.localdomain podman[315975]: 2025-12-06 10:17:17.454780273 +0000 UTC m=+0.069520327 container died 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:17:17 np0005548788.localdomain podman[315984]: 2025-12-06 10:17:17.527137436 +0000 UTC m=+0.123046772 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:17 np0005548788.localdomain podman[315984]: 2025-12-06 10:17:17.543698805 +0000 UTC m=+0.139608171 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:17:17 np0005548788.localdomain podman[315975]: 2025-12-06 10:17:17.55523996 +0000 UTC m=+0.169979984 container cleanup 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:17:17 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:17:17 np0005548788.localdomain systemd[1]: libpod-conmon-781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f.scope: Deactivated successfully.
Dec 06 10:17:17 np0005548788.localdomain podman[315982]: 2025-12-06 10:17:17.58845569 +0000 UTC m=+0.191096633 container remove 781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-45d80fd5-0998-4ce5-b225-353084cecb5e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:17:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:17.618 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:17:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:17.659 2 INFO neutron.agent.securitygroups_rpc [None req-2df808f7-3669-4bdd-a1f6-a6327b63c196 406e5cc53df749808a8770da68d7033d 64b9b91747c648148f6dd23ce81ceb80 - - default default] Security group member updated ['592d46d3-9a35-48d4-b6ca-ba1068626b4d']
Dec 06 10:17:17 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d513cd954\x2d584d\x2d4b79\x2da078\x2d541aa9ebb6a4.mount: Deactivated successfully.
Dec 06 10:17:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-796020a53917f7ae11644e03288af3994d36ef58fa672eac1ecd34f09d4171d8-merged.mount: Deactivated successfully.
Dec 06 10:17:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-781a4842012961a3e13ecf934e6b5c2c872175c983c26c1f8ba06958f3894a4f-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:17 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d45d80fd5\x2d0998\x2d4ce5\x2db225\x2d353084cecb5e.mount: Deactivated successfully.
Dec 06 10:17:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:17.977 262572 INFO neutron.agent.dhcp.agent [None req-5dfcfa81-69bf-45ca-a896-e162d068ec06 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:18 np0005548788.localdomain ceph-mon[293643]: osdmap e131: 6 total, 6 up, 6 in
Dec 06 10:17:18 np0005548788.localdomain ceph-mon[293643]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 98 KiB/s rd, 7.3 KiB/s wr, 135 op/s
Dec 06 10:17:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:18.194 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:18.534 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:18 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:18.890 2 INFO neutron.agent.securitygroups_rpc [None req-a4d21316-b177-48b7-92ec-319ed42d1b0b 406e5cc53df749808a8770da68d7033d 64b9b91747c648148f6dd23ce81ceb80 - - default default] Security group member updated ['592d46d3-9a35-48d4-b6ca-ba1068626b4d']
Dec 06 10:17:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:17:19 np0005548788.localdomain podman[316023]: 2025-12-06 10:17:19.25190654 +0000 UTC m=+0.076745039 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:17:19 np0005548788.localdomain podman[316023]: 2025-12-06 10:17:19.290553647 +0000 UTC m=+0.115392096 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:17:19 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:17:19 np0005548788.localdomain podman[316063]: 2025-12-06 10:17:19.424462493 +0000 UTC m=+0.057049265 container kill 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:17:19 np0005548788.localdomain dnsmasq[314880]: exiting on receipt of SIGTERM
Dec 06 10:17:19 np0005548788.localdomain systemd[1]: libpod-14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986.scope: Deactivated successfully.
Dec 06 10:17:19 np0005548788.localdomain podman[316075]: 2025-12-06 10:17:19.498446216 +0000 UTC m=+0.057757456 container died 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:17:19 np0005548788.localdomain podman[316075]: 2025-12-06 10:17:19.524527137 +0000 UTC m=+0.083838307 container cleanup 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:17:19 np0005548788.localdomain systemd[1]: libpod-conmon-14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986.scope: Deactivated successfully.
Dec 06 10:17:19 np0005548788.localdomain podman[316077]: 2025-12-06 10:17:19.568110627 +0000 UTC m=+0.120830204 container remove 14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c419a468-c335-4923-8d17-63d564d8a340, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:17:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:17:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:17:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:17:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:17:19 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:19.633 2 INFO neutron.agent.securitygroups_rpc [None req-de533a6a-08ae-42c2-b158-11c15e64ecbf 406e5cc53df749808a8770da68d7033d 64b9b91747c648148f6dd23ce81ceb80 - - default default] Security group member updated ['592d46d3-9a35-48d4-b6ca-ba1068626b4d']
Dec 06 10:17:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:17:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18731 "" "Go-http-client/1.1"
Dec 06 10:17:19 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:19.700 2 INFO neutron.agent.securitygroups_rpc [req-5de42e7e-0662-4156-9401-22106a567059 req-ed4ee54f-d494-4326-9cfe-66d7201bb9f8 b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['9b6ed766-684e-4de1-9195-49dc13639cf2']
Dec 06 10:17:19 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:19.815 262572 INFO neutron.agent.dhcp.agent [None req-2316f8fc-6169-4718-9029-ebf38c79c80a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:19 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:19.816 262572 INFO neutron.agent.dhcp.agent [None req-2316f8fc-6169-4718-9029-ebf38c79c80a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:19.901 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:17:20 np0005548788.localdomain ceph-mon[293643]: pgmap v208: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 6.2 KiB/s wr, 114 op/s
Dec 06 10:17:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-59ddd2c4d495b372ae7c944c6d86eea0d62ca3467e8e1b4e0f711c9029bd93c4-merged.mount: Deactivated successfully.
Dec 06 10:17:20 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-14d02e2f50fe1e61b99a4f455994aa825c638ad69bda453a6cae88ab18372986-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:20 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2dc419a468\x2dc335\x2d4923\x2d8d17\x2d63d564d8a340.mount: Deactivated successfully.
Dec 06 10:17:20 np0005548788.localdomain podman[316102]: 2025-12-06 10:17:20.257280956 +0000 UTC m=+0.082512916 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:20 np0005548788.localdomain podman[316102]: 2025-12-06 10:17:20.261131384 +0000 UTC m=+0.086363354 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 10:17:20 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:17:20 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:20.329 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:20 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:20.531 2 INFO neutron.agent.securitygroups_rpc [req-0deed905-7e01-4df3-9b96-c6dd2bc740af req-aa574686-cd75-40ee-9098-a2781b4cfdf3 b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['9b6ed766-684e-4de1-9195-49dc13639cf2']
Dec 06 10:17:21 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:21.274 2 INFO neutron.agent.securitygroups_rpc [None req-a16f2b30-088a-4292-a104-7f6939a88353 406e5cc53df749808a8770da68d7033d 64b9b91747c648148f6dd23ce81ceb80 - - default default] Security group member updated ['592d46d3-9a35-48d4-b6ca-ba1068626b4d']
Dec 06 10:17:21 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:21.332 2 INFO neutron.agent.securitygroups_rpc [req-eb1d2fcf-8073-401d-9c1d-cc925d78bfca req-148b7f3e-4a54-4ceb-8b18-a63fa0926a26 b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['9b6ed766-684e-4de1-9195-49dc13639cf2']
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.093990) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242094036, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2313, "num_deletes": 264, "total_data_size": 2379960, "memory_usage": 2435568, "flush_reason": "Manual Compaction"}
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242108258, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2275089, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25476, "largest_seqno": 27788, "table_properties": {"data_size": 2265916, "index_size": 5678, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20684, "raw_average_key_size": 21, "raw_value_size": 2246918, "raw_average_value_size": 2321, "num_data_blocks": 247, "num_entries": 968, "num_filter_entries": 968, "num_deletions": 264, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016079, "oldest_key_time": 1765016079, "file_creation_time": 1765016242, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 14316 microseconds, and 5702 cpu microseconds.
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.108305) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2275089 bytes OK
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.108327) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.109906) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.109926) EVENT_LOG_v1 {"time_micros": 1765016242109919, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.109948) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2370249, prev total WAL file size 2370249, number of live WAL files 2.
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.110753) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2221KB)], [45(18MB)]
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242110796, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 21558391, "oldest_snapshot_seqno": -1}
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12583 keys, 17569288 bytes, temperature: kUnknown
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242201618, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 17569288, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17497821, "index_size": 38918, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 336683, "raw_average_key_size": 26, "raw_value_size": 17283884, "raw_average_value_size": 1373, "num_data_blocks": 1480, "num_entries": 12583, "num_filter_entries": 12583, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016242, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.201961) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 17569288 bytes
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.203675) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 237.1 rd, 193.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 18.4 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(17.2) write-amplify(7.7) OK, records in: 13121, records dropped: 538 output_compression: NoCompression
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.203709) EVENT_LOG_v1 {"time_micros": 1765016242203694, "job": 26, "event": "compaction_finished", "compaction_time_micros": 90935, "compaction_time_cpu_micros": 47047, "output_level": 6, "num_output_files": 1, "total_output_size": 17569288, "num_input_records": 13121, "num_output_records": 12583, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242204441, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242207701, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.110661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.207831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.207837) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.207840) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.207843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:17:22.207846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548788.localdomain ceph-mon[293643]: pgmap v209: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 5.5 KiB/s wr, 101 op/s
Dec 06 10:17:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:23.197 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:24 np0005548788.localdomain ceph-mon[293643]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:24.904 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:26 np0005548788.localdomain ceph-mon[293643]: pgmap v211: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:27 np0005548788.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 06 10:17:28 np0005548788.localdomain ceph-mon[293643]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:28.222 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:29.500 2 INFO neutron.agent.securitygroups_rpc [None req-b4a3dd75-3886-433c-a68a-5b82ba491223 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:29.927 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:30 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:30.065 2 INFO neutron.agent.securitygroups_rpc [None req-77263169-ab43-473e-a592-07200b19e18c a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:30 np0005548788.localdomain ceph-mon[293643]: pgmap v213: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:30 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:30.256 2 INFO neutron.agent.securitygroups_rpc [None req-b0fdf288-4ef8-4212-8aee-98bfee473c24 8eeb1ce8ea6f4981a55c23fbea57f4cb f9595f0635f14c2196533c0f5ee5dc3b - - default default] Security group member updated ['cab1d39e-aba5-4938-880e-87b80fed90d0']
Dec 06 10:17:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:32 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:32.193 2 INFO neutron.agent.securitygroups_rpc [None req-a2daea0b-127d-4cb1-8d58-679cf0ec3092 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:32 np0005548788.localdomain ceph-mon[293643]: pgmap v214: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:32 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:32.684 2 INFO neutron.agent.securitygroups_rpc [None req-a950d9cb-4b90-43c7-9619-4f314921acec 8eeb1ce8ea6f4981a55c23fbea57f4cb f9595f0635f14c2196533c0f5ee5dc3b - - default default] Security group member updated ['cab1d39e-aba5-4938-880e-87b80fed90d0']
Dec 06 10:17:33 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:33.079 2 INFO neutron.agent.securitygroups_rpc [None req-675c08cc-007c-4dc9-986b-f4514913c9a2 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:33 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:17:33 np0005548788.localdomain podman[316122]: 2025-12-06 10:17:33.19886747 +0000 UTC m=+0.086874361 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:17:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:33.224 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:33 np0005548788.localdomain podman[316122]: 2025-12-06 10:17:33.243702728 +0000 UTC m=+0.131709619 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:17:33 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:17:33 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:33.674 2 INFO neutron.agent.securitygroups_rpc [None req-2bf571b9-2f59-4b7c-8546-bb481f9be7b1 3ea76362796945abb0389f60eab07566 23fdd860878442e1b8fc77e4ae3ef271 - - default default] Security group member updated ['dd9785c1-eb5d-4293-ac78-0fc1ce108f20']
Dec 06 10:17:34 np0005548788.localdomain ceph-mon[293643]: pgmap v215: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:34 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1151500263' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:34 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:34.249 2 INFO neutron.agent.securitygroups_rpc [None req-f3a7982c-6432-4aaa-a51f-6f45752d4aa1 440e57a58b9f4b64af7435927930ce6a 37eea2b31d9543b793c928d777810de4 - - default default] Security group member updated ['5bf6ab1c-c80a-456c-9ce8-d446d055d129']
Dec 06 10:17:34 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:34.798 2 INFO neutron.agent.securitygroups_rpc [None req-b1db9883-f5c1-471b-9a07-cebf6b7ffba6 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:34.967 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:35.199 2 INFO neutron.agent.securitygroups_rpc [None req-5b990af0-9142-4008-b949-8f1c6c9fa9d7 440e57a58b9f4b64af7435927930ce6a 37eea2b31d9543b793c928d777810de4 - - default default] Security group member updated ['5bf6ab1c-c80a-456c-9ce8-d446d055d129']
Dec 06 10:17:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2930003635' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:35.857 2 INFO neutron.agent.securitygroups_rpc [None req-a5058513-5128-4405-b292-62b6045d3f2a a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:36.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:36.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:17:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:17:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:17:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:17:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e131 do_prune osdmap full prune enabled
Dec 06 10:17:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e132 e132: 6 total, 6 up, 6 in
Dec 06 10:17:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e132: 6 total, 6 up, 6 in
Dec 06 10:17:36 np0005548788.localdomain podman[316148]: 2025-12-06 10:17:36.273065172 +0000 UTC m=+0.096349751 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:17:36 np0005548788.localdomain ceph-mon[293643]: pgmap v216: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:36 np0005548788.localdomain podman[316148]: 2025-12-06 10:17:36.286745483 +0000 UTC m=+0.110030102 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:36 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:17:36 np0005548788.localdomain podman[316149]: 2025-12-06 10:17:36.368637299 +0000 UTC m=+0.186606275 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:17:36 np0005548788.localdomain podman[316149]: 2025-12-06 10:17:36.381938298 +0000 UTC m=+0.199907234 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:17:36 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:17:36 np0005548788.localdomain podman[316150]: 2025-12-06 10:17:36.42790182 +0000 UTC m=+0.244190035 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Dec 06 10:17:36 np0005548788.localdomain podman[316150]: 2025-12-06 10:17:36.445656967 +0000 UTC m=+0.261945232 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41)
Dec 06 10:17:36 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:17:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e132 do_prune osdmap full prune enabled
Dec 06 10:17:37 np0005548788.localdomain ceph-mon[293643]: osdmap e132: 6 total, 6 up, 6 in
Dec 06 10:17:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e133 e133: 6 total, 6 up, 6 in
Dec 06 10:17:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e133: 6 total, 6 up, 6 in
Dec 06 10:17:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:38.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:38.227 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:38 np0005548788.localdomain ceph-mon[293643]: pgmap v218: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:38 np0005548788.localdomain ceph-mon[293643]: osdmap e133: 6 total, 6 up, 6 in
Dec 06 10:17:38 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:38.466 2 INFO neutron.agent.securitygroups_rpc [None req-ab57ea17-3add-445e-9d4b-332ca72ce0af a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:17:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:17:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:17:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:17:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:17:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:17:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:17:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:17:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:39.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:39.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/80922449' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/80922449' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e133 do_prune osdmap full prune enabled
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e134 e134: 6 total, 6 up, 6 in
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e134: 6 total, 6 up, 6 in
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/80922449' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/80922449' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:39 np0005548788.localdomain ceph-mon[293643]: osdmap e134: 6 total, 6 up, 6 in
Dec 06 10:17:39 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:39.758 2 INFO neutron.agent.securitygroups_rpc [None req-24de80cf-8a07-42c3-8966-675d0403c3d2 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:40.014 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:40 np0005548788.localdomain ceph-mon[293643]: pgmap v220: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 767 B/s wr, 1 op/s
Dec 06 10:17:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:41.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:41.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:41.004 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:17:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:41.004 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:17:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:41.019 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:17:41 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:41.107 2 INFO neutron.agent.securitygroups_rpc [None req-4acfb63b-6c96-4af3-b5fa-66e73a2e25c0 cf2cadf875da4c9b86fb2902b9ee90bb 2b975a1e6b7941c09260aeb20365b968 - - default default] Security group member updated ['f9be6b32-ff8a-467f-8358-ff505a55042e']
Dec 06 10:17:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e134 do_prune osdmap full prune enabled
Dec 06 10:17:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e135 e135: 6 total, 6 up, 6 in
Dec 06 10:17:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e135: 6 total, 6 up, 6 in
Dec 06 10:17:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:42.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:42 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:42.013 2 INFO neutron.agent.securitygroups_rpc [None req-f035cee5-5c71-4777-a408-c824903df12b 3ea76362796945abb0389f60eab07566 23fdd860878442e1b8fc77e4ae3ef271 - - default default] Security group member updated ['dd9785c1-eb5d-4293-ac78-0fc1ce108f20']
Dec 06 10:17:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e135 do_prune osdmap full prune enabled
Dec 06 10:17:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e136 e136: 6 total, 6 up, 6 in
Dec 06 10:17:42 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e136: 6 total, 6 up, 6 in
Dec 06 10:17:42 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:42.183 2 INFO neutron.agent.securitygroups_rpc [None req-2d1fe085-81b9-49e2-b303-f7feeabc4137 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:42 np0005548788.localdomain ceph-mon[293643]: pgmap v222: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 1023 B/s wr, 1 op/s
Dec 06 10:17:42 np0005548788.localdomain ceph-mon[293643]: osdmap e135: 6 total, 6 up, 6 in
Dec 06 10:17:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/619121492' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:42 np0005548788.localdomain ceph-mon[293643]: osdmap e136: 6 total, 6 up, 6 in
Dec 06 10:17:42 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:42.570 2 INFO neutron.agent.securitygroups_rpc [None req-463c5a9c-1342-4628-be66-c954070435e6 cf2cadf875da4c9b86fb2902b9ee90bb 2b975a1e6b7941c09260aeb20365b968 - - default default] Security group member updated ['f9be6b32-ff8a-467f-8358-ff505a55042e']
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.028 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.028 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.029 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.029 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.029 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.230 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:43 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:43.231 2 INFO neutron.agent.securitygroups_rpc [None req-74f6711f-47e9-487d-bd32-5a2f1bba6efe a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1627444727' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1439659764' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1439659764' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:17:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3506080910' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.479 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.727 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.729 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11571MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.730 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.730 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:43.789 262572 INFO neutron.agent.linux.ip_lib [None req-64cb62f9-5de0-4488-a91a-2776a3243e40 - - - - - -] Device tapaf6a5ec1-24 cannot be used as it has no MAC address
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.808 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.809 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.813 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:43 np0005548788.localdomain kernel: device tapaf6a5ec1-24 entered promiscuous mode
Dec 06 10:17:43 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016263.8188] manager: (tapaf6a5ec1-24): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Dec 06 10:17:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:43Z|00116|binding|INFO|Claiming lport af6a5ec1-243d-4374-a6bb-28aef20ea93f for this chassis.
Dec 06 10:17:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:43Z|00117|binding|INFO|af6a5ec1-243d-4374-a6bb-28aef20ea93f: Claiming unknown
Dec 06 10:17:43 np0005548788.localdomain systemd-udevd[316243]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:43.836 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-e7134938-cb89-4050-bde5-c9273441c423', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7134938-cb89-4050-bde5-c9273441c423', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82e66d987af642e79e2539d816511c7b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c31cd2fe-5b9a-4a3d-99fd-81d43928afea, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=af6a5ec1-243d-4374-a6bb-28aef20ea93f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.839 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:17:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:43.838 159620 INFO neutron.agent.ovn.metadata.agent [-] Port af6a5ec1-243d-4374-a6bb-28aef20ea93f in datapath e7134938-cb89-4050-bde5-c9273441c423 bound to our chassis
Dec 06 10:17:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:43.843 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e7134938-cb89-4050-bde5-c9273441c423 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:43.844 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb2b109-4023-4586-87a7-79285f5bd685]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf6a5ec1-24: No such device
Dec 06 10:17:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf6a5ec1-24: No such device
Dec 06 10:17:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:43Z|00118|binding|INFO|Setting lport af6a5ec1-243d-4374-a6bb-28aef20ea93f ovn-installed in OVS
Dec 06 10:17:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:43Z|00119|binding|INFO|Setting lport af6a5ec1-243d-4374-a6bb-28aef20ea93f up in Southbound
Dec 06 10:17:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf6a5ec1-24: No such device
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.867 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf6a5ec1-24: No such device
Dec 06 10:17:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf6a5ec1-24: No such device
Dec 06 10:17:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf6a5ec1-24: No such device
Dec 06 10:17:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf6a5ec1-24: No such device
Dec 06 10:17:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapaf6a5ec1-24: No such device
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.907 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:43.935 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:17:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4032121207' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:44.306 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:17:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:44.312 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:17:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:44.335 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:17:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:44.338 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:17:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:44.338 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:17:44 np0005548788.localdomain ceph-mon[293643]: pgmap v225: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 3.3 KiB/s wr, 59 op/s
Dec 06 10:17:44 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3506080910' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:44 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4032121207' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:44 np0005548788.localdomain podman[316337]: 
Dec 06 10:17:44 np0005548788.localdomain podman[316337]: 2025-12-06 10:17:44.750150359 +0000 UTC m=+0.086871641 container create d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:17:44 np0005548788.localdomain systemd[1]: Started libpod-conmon-d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5.scope.
Dec 06 10:17:44 np0005548788.localdomain systemd[1]: tmp-crun.lO7MEp.mount: Deactivated successfully.
Dec 06 10:17:44 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:44 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38b7c834555bbfef7f0b6517ca49182f0d5d2d6942a1519a49e8534a4e0eecbf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:44 np0005548788.localdomain podman[316337]: 2025-12-06 10:17:44.708129408 +0000 UTC m=+0.044850720 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:44 np0005548788.localdomain podman[316337]: 2025-12-06 10:17:44.813757854 +0000 UTC m=+0.150479126 container init d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:17:44 np0005548788.localdomain podman[316337]: 2025-12-06 10:17:44.821925015 +0000 UTC m=+0.158646277 container start d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:44 np0005548788.localdomain dnsmasq[316355]: started, version 2.85 cachesize 150
Dec 06 10:17:44 np0005548788.localdomain dnsmasq[316355]: DNS service limited to local subnets
Dec 06 10:17:44 np0005548788.localdomain dnsmasq[316355]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:44 np0005548788.localdomain dnsmasq[316355]: warning: no upstream servers configured
Dec 06 10:17:44 np0005548788.localdomain dnsmasq-dhcp[316355]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:17:44 np0005548788.localdomain dnsmasq[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/addn_hosts - 0 addresses
Dec 06 10:17:44 np0005548788.localdomain dnsmasq-dhcp[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/host
Dec 06 10:17:44 np0005548788.localdomain dnsmasq-dhcp[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/opts
Dec 06 10:17:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:45.023 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:45.196 262572 INFO neutron.agent.dhcp.agent [None req-67a9b473-f9a2-4b4f-96d1-00083d3a7a28 - - - - - -] DHCP configuration for ports {'e61fba60-96b6-46a6-a626-298afeb9f278'} is completed
Dec 06 10:17:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:45.338 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:45 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:45.569 2 INFO neutron.agent.securitygroups_rpc [None req-a9308ef0-170e-430a-9f5f-6439b979faf7 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:46 np0005548788.localdomain ceph-mon[293643]: pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.2 KiB/s wr, 55 op/s
Dec 06 10:17:46 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:46.575 2 INFO neutron.agent.securitygroups_rpc [None req-38541453-b414-4a96-8c97-455c5ffb96a0 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e136 do_prune osdmap full prune enabled
Dec 06 10:17:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e137 e137: 6 total, 6 up, 6 in
Dec 06 10:17:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e137: 6 total, 6 up, 6 in
Dec 06 10:17:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:47.441 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:47.441 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:17:47.442 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:17:48 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:48.022 2 INFO neutron.agent.securitygroups_rpc [None req-77939ad8-3a8c-44db-b1d8-896917e1a291 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:48 np0005548788.localdomain ceph-mon[293643]: osdmap e137: 6 total, 6 up, 6 in
Dec 06 10:17:48 np0005548788.localdomain ceph-mon[293643]: pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.2 KiB/s wr, 55 op/s
Dec 06 10:17:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:17:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:48.235 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:48 np0005548788.localdomain podman[316356]: 2025-12-06 10:17:48.239410137 +0000 UTC m=+0.068303891 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:17:48 np0005548788.localdomain podman[316356]: 2025-12-06 10:17:48.254548821 +0000 UTC m=+0.083442585 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:17:48 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:17:49 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:49.063 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:47Z, description=, device_id=ef42a7b7-856f-4d93-83fd-eafb16254770, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c70b4ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6856dc0>], id=b0c22246-534b-45a1-a1b8-bb9d6190b28e, ip_allocation=immediate, mac_address=fa:16:3e:cf:91:a2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:41Z, description=, dns_domain=, id=e7134938-cb89-4050-bde5-c9273441c423, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1466997883-network, port_security_enabled=True, project_id=82e66d987af642e79e2539d816511c7b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51204, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1516, status=ACTIVE, subnets=['660602d1-0571-4050-ade4-939d33674cd3'], tags=[], tenant_id=82e66d987af642e79e2539d816511c7b, updated_at=2025-12-06T10:17:42Z, vlan_transparent=None, network_id=e7134938-cb89-4050-bde5-c9273441c423, port_security_enabled=False, project_id=82e66d987af642e79e2539d816511c7b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1555, status=DOWN, tags=[], tenant_id=82e66d987af642e79e2539d816511c7b, updated_at=2025-12-06T10:17:48Z on network e7134938-cb89-4050-bde5-c9273441c423
Dec 06 10:17:49 np0005548788.localdomain dnsmasq[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/addn_hosts - 1 addresses
Dec 06 10:17:49 np0005548788.localdomain dnsmasq-dhcp[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/host
Dec 06 10:17:49 np0005548788.localdomain dnsmasq-dhcp[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/opts
Dec 06 10:17:49 np0005548788.localdomain podman[316390]: 2025-12-06 10:17:49.282513242 +0000 UTC m=+0.068742754 container kill d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:49 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:49.579 262572 INFO neutron.agent.dhcp.agent [None req-3e02b575-ae6c-4994-ac7e-089bfe3a3dd4 - - - - - -] DHCP configuration for ports {'b0c22246-534b-45a1-a1b8-bb9d6190b28e'} is completed
Dec 06 10:17:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:17:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:17:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:17:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:17:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:17:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19210 "" "Go-http-client/1.1"
Dec 06 10:17:49 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:17:49.736 2 INFO neutron.agent.securitygroups_rpc [None req-028fe2d3-a2af-4154-9a69-d7d602ad3ddf a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:50.065 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:17:50 np0005548788.localdomain ceph-mon[293643]: pgmap v229: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 1.7 KiB/s wr, 45 op/s
Dec 06 10:17:50 np0005548788.localdomain podman[316411]: 2025-12-06 10:17:50.255152572 +0000 UTC m=+0.080991320 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:17:50 np0005548788.localdomain podman[316411]: 2025-12-06 10:17:50.267658867 +0000 UTC m=+0.093497625 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:17:50 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:17:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:17:50 np0005548788.localdomain podman[316434]: 2025-12-06 10:17:50.401164419 +0000 UTC m=+0.089477621 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:17:50 np0005548788.localdomain podman[316434]: 2025-12-06 10:17:50.406489243 +0000 UTC m=+0.094802415 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:17:50 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:17:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:51.866 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:47Z, description=, device_id=ef42a7b7-856f-4d93-83fd-eafb16254770, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68c0610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68c0250>], id=b0c22246-534b-45a1-a1b8-bb9d6190b28e, ip_allocation=immediate, mac_address=fa:16:3e:cf:91:a2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:41Z, description=, dns_domain=, id=e7134938-cb89-4050-bde5-c9273441c423, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1466997883-network, port_security_enabled=True, project_id=82e66d987af642e79e2539d816511c7b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51204, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1516, status=ACTIVE, subnets=['660602d1-0571-4050-ade4-939d33674cd3'], tags=[], tenant_id=82e66d987af642e79e2539d816511c7b, updated_at=2025-12-06T10:17:42Z, vlan_transparent=None, network_id=e7134938-cb89-4050-bde5-c9273441c423, port_security_enabled=False, project_id=82e66d987af642e79e2539d816511c7b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1555, status=DOWN, tags=[], tenant_id=82e66d987af642e79e2539d816511c7b, updated_at=2025-12-06T10:17:48Z on network e7134938-cb89-4050-bde5-c9273441c423
Dec 06 10:17:52 np0005548788.localdomain podman[316467]: 2025-12-06 10:17:52.076019059 +0000 UTC m=+0.055900299 container kill d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:17:52 np0005548788.localdomain dnsmasq[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/addn_hosts - 1 addresses
Dec 06 10:17:52 np0005548788.localdomain dnsmasq-dhcp[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/host
Dec 06 10:17:52 np0005548788.localdomain dnsmasq-dhcp[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/opts
Dec 06 10:17:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:52 np0005548788.localdomain ceph-mon[293643]: pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 1.4 KiB/s wr, 38 op/s
Dec 06 10:17:52 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:17:52.446 262572 INFO neutron.agent.dhcp.agent [None req-fc87e247-8ac6-48f6-a3bd-4f9b90c6340c - - - - - -] DHCP configuration for ports {'b0c22246-534b-45a1-a1b8-bb9d6190b28e'} is completed
Dec 06 10:17:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e137 do_prune osdmap full prune enabled
Dec 06 10:17:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e138 e138: 6 total, 6 up, 6 in
Dec 06 10:17:53 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e138: 6 total, 6 up, 6 in
Dec 06 10:17:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:53.238 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:54Z|00120|ovn_bfd|INFO|Enabled BFD on interface ovn-afa07b-0
Dec 06 10:17:54 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:54Z|00121|ovn_bfd|INFO|Enabled BFD on interface ovn-bd2a75-0
Dec 06 10:17:54 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:17:54Z|00122|ovn_bfd|INFO|Enabled BFD on interface ovn-ca3c1f-0
Dec 06 10:17:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:54.087 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:54.102 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:54.108 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:54.113 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:54.151 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:54.196 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548788.localdomain ceph-mon[293643]: pgmap v231: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 409 B/s wr, 2 op/s
Dec 06 10:17:54 np0005548788.localdomain ceph-mon[293643]: osdmap e138: 6 total, 6 up, 6 in
Dec 06 10:17:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:55.068 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:55.093 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:55.152 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:55.981 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:56 np0005548788.localdomain ceph-mon[293643]: pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 1.5 KiB/s rd, 511 B/s wr, 3 op/s
Dec 06 10:17:56 np0005548788.localdomain sudo[316490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:17:56 np0005548788.localdomain sudo[316490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:17:56 np0005548788.localdomain sudo[316490]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:56 np0005548788.localdomain sudo[316508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:17:56 np0005548788.localdomain sudo[316508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:17:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:57 np0005548788.localdomain sudo[316508]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1671277929' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1671277929' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:17:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:17:57 np0005548788.localdomain sudo[316557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:17:57 np0005548788.localdomain sudo[316557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:17:57 np0005548788.localdomain sudo[316557]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:17:58.242 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:58 np0005548788.localdomain ceph-mon[293643]: pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 409 B/s wr, 2 op/s
Dec 06 10:17:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:17:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:17:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:17:58 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:17:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e138 do_prune osdmap full prune enabled
Dec 06 10:17:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e139 e139: 6 total, 6 up, 6 in
Dec 06 10:17:58 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e139: 6 total, 6 up, 6 in
Dec 06 10:17:59 np0005548788.localdomain ceph-mon[293643]: osdmap e139: 6 total, 6 up, 6 in
Dec 06 10:18:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:00.071 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:00 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:00.237 262572 INFO neutron.agent.linux.ip_lib [None req-9acd361c-b060-499b-973f-176a2c761add - - - - - -] Device tapee0bfbaa-e1 cannot be used as it has no MAC address
Dec 06 10:18:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:00.259 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:00 np0005548788.localdomain kernel: device tapee0bfbaa-e1 entered promiscuous mode
Dec 06 10:18:00 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016280.2693] manager: (tapee0bfbaa-e1): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Dec 06 10:18:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:00.270 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:00 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:00Z|00123|binding|INFO|Claiming lport ee0bfbaa-e117-417c-8f91-c71c171d694f for this chassis.
Dec 06 10:18:00 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:00Z|00124|binding|INFO|ee0bfbaa-e117-417c-8f91-c71c171d694f: Claiming unknown
Dec 06 10:18:00 np0005548788.localdomain systemd-udevd[316585]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:00.282 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-d747da61-f366-4c62-b8b8-85e410a4e587', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d747da61-f366-4c62-b8b8-85e410a4e587', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2aaeadee6f14b78a73f8886be99b671', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44dbf22d-4ec3-4a84-9020-b5eba9dacdbc, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=ee0bfbaa-e117-417c-8f91-c71c171d694f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:00.284 159620 INFO neutron.agent.ovn.metadata.agent [-] Port ee0bfbaa-e117-417c-8f91-c71c171d694f in datapath d747da61-f366-4c62-b8b8-85e410a4e587 bound to our chassis
Dec 06 10:18:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:00.286 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d747da61-f366-4c62-b8b8-85e410a4e587 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:00.287 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[b880de12-3e31-419f-a19d-36767088af64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapee0bfbaa-e1: No such device
Dec 06 10:18:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapee0bfbaa-e1: No such device
Dec 06 10:18:00 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:00Z|00125|binding|INFO|Setting lport ee0bfbaa-e117-417c-8f91-c71c171d694f ovn-installed in OVS
Dec 06 10:18:00 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:00Z|00126|binding|INFO|Setting lport ee0bfbaa-e117-417c-8f91-c71c171d694f up in Southbound
Dec 06 10:18:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:00.316 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapee0bfbaa-e1: No such device
Dec 06 10:18:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapee0bfbaa-e1: No such device
Dec 06 10:18:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapee0bfbaa-e1: No such device
Dec 06 10:18:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapee0bfbaa-e1: No such device
Dec 06 10:18:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapee0bfbaa-e1: No such device
Dec 06 10:18:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapee0bfbaa-e1: No such device
Dec 06 10:18:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:00.407 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:00 np0005548788.localdomain ceph-mon[293643]: pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 2.7 KiB/s wr, 70 op/s
Dec 06 10:18:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1856326074' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1856326074' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:01 np0005548788.localdomain podman[316656]: 
Dec 06 10:18:01 np0005548788.localdomain podman[316656]: 2025-12-06 10:18:01.297808391 +0000 UTC m=+0.092756551 container create a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d747da61-f366-4c62-b8b8-85e410a4e587, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:01 np0005548788.localdomain systemd[1]: Started libpod-conmon-a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9.scope.
Dec 06 10:18:01 np0005548788.localdomain podman[316656]: 2025-12-06 10:18:01.253375545 +0000 UTC m=+0.048323745 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:01 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:01 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0f98497d1a8feb5ba1758d63b1509fead819e9ffa51a9d76606099b377b2b4e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:01 np0005548788.localdomain podman[316656]: 2025-12-06 10:18:01.37815428 +0000 UTC m=+0.173102450 container init a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d747da61-f366-4c62-b8b8-85e410a4e587, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:18:01 np0005548788.localdomain podman[316656]: 2025-12-06 10:18:01.389713436 +0000 UTC m=+0.184661606 container start a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d747da61-f366-4c62-b8b8-85e410a4e587, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:18:01 np0005548788.localdomain dnsmasq[316675]: started, version 2.85 cachesize 150
Dec 06 10:18:01 np0005548788.localdomain dnsmasq[316675]: DNS service limited to local subnets
Dec 06 10:18:01 np0005548788.localdomain dnsmasq[316675]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:01 np0005548788.localdomain dnsmasq[316675]: warning: no upstream servers configured
Dec 06 10:18:01 np0005548788.localdomain dnsmasq-dhcp[316675]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Dec 06 10:18:01 np0005548788.localdomain dnsmasq[316675]: read /var/lib/neutron/dhcp/d747da61-f366-4c62-b8b8-85e410a4e587/addn_hosts - 0 addresses
Dec 06 10:18:01 np0005548788.localdomain dnsmasq-dhcp[316675]: read /var/lib/neutron/dhcp/d747da61-f366-4c62-b8b8-85e410a4e587/host
Dec 06 10:18:01 np0005548788.localdomain dnsmasq-dhcp[316675]: read /var/lib/neutron/dhcp/d747da61-f366-4c62-b8b8-85e410a4e587/opts
Dec 06 10:18:01 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:01.554 262572 INFO neutron.agent.dhcp.agent [None req-bb7aa9b5-f69b-4d09-aa80-46dba1f2fc71 - - - - - -] DHCP configuration for ports {'f0468471-b728-4405-9876-bb755fb9fea7'} is completed
Dec 06 10:18:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2389509646' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2389509646' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:18:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:18:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:02 np0005548788.localdomain ceph-mon[293643]: pgmap v237: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 2.2 KiB/s wr, 68 op/s
Dec 06 10:18:02 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2062782985' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:02 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2062782985' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:18:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:03.270 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:04 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:18:04 np0005548788.localdomain podman[316676]: 2025-12-06 10:18:04.260495417 +0000 UTC m=+0.079194645 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Dec 06 10:18:04 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:04.288 2 INFO neutron.agent.securitygroups_rpc [None req-a84ff9e7-4dda-4f24-9c52-73179c1374d1 4c6008178bdc445aa99fb1b726f87b45 a2aaeadee6f14b78a73f8886be99b671 - - default default] Security group member updated ['13551db8-e8e0-43b0-89a9-b0d8423e74c9']
Dec 06 10:18:04 np0005548788.localdomain podman[316676]: 2025-12-06 10:18:04.329151307 +0000 UTC m=+0.147850615 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:18:04 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:18:04 np0005548788.localdomain ceph-mon[293643]: pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 3.3 KiB/s wr, 91 op/s
Dec 06 10:18:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:05.079 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:06 np0005548788.localdomain ceph-mon[293643]: pgmap v239: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 68 KiB/s rd, 3.3 KiB/s wr, 90 op/s
Dec 06 10:18:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e139 do_prune osdmap full prune enabled
Dec 06 10:18:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e140 e140: 6 total, 6 up, 6 in
Dec 06 10:18:07 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e140: 6 total, 6 up, 6 in
Dec 06 10:18:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:18:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:18:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:18:07 np0005548788.localdomain systemd[1]: tmp-crun.vw74dO.mount: Deactivated successfully.
Dec 06 10:18:07 np0005548788.localdomain podman[316702]: 2025-12-06 10:18:07.282550007 +0000 UTC m=+0.096910219 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:18:07 np0005548788.localdomain podman[316702]: 2025-12-06 10:18:07.291606586 +0000 UTC m=+0.105966798 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:18:07 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:18:07 np0005548788.localdomain podman[316701]: 2025-12-06 10:18:07.374267466 +0000 UTC m=+0.194165588 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2)
Dec 06 10:18:07 np0005548788.localdomain podman[316707]: 2025-12-06 10:18:07.447991271 +0000 UTC m=+0.254226083 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:18:07 np0005548788.localdomain podman[316707]: 2025-12-06 10:18:07.460847636 +0000 UTC m=+0.267082448 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Dec 06 10:18:07 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:18:07 np0005548788.localdomain podman[316701]: 2025-12-06 10:18:07.518838229 +0000 UTC m=+0.338736311 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:18:07 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:18:08 np0005548788.localdomain ceph-mon[293643]: pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 68 KiB/s rd, 3.3 KiB/s wr, 90 op/s
Dec 06 10:18:08 np0005548788.localdomain ceph-mon[293643]: osdmap e140: 6 total, 6 up, 6 in
Dec 06 10:18:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:08.272 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:08 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:08.687 2 INFO neutron.agent.securitygroups_rpc [None req-2cd445e7-be6d-4272-b78a-eedc8c1ca774 4c6008178bdc445aa99fb1b726f87b45 a2aaeadee6f14b78a73f8886be99b671 - - default default] Security group member updated ['13551db8-e8e0-43b0-89a9-b0d8423e74c9']
Dec 06 10:18:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:18:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:18:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:18:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:18:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:18:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:18:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:18:09 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:09.226 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:09 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:09.228 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:09 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:09.231 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:09 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:09.233 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[f075beb6-a2c8-430e-93eb-d77ff989a10e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:10 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:10.061 2 INFO neutron.agent.securitygroups_rpc [None req-36813505-8d2e-42b4-bcdd-400a4500589a a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:10.118 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:10 np0005548788.localdomain ceph-mon[293643]: pgmap v242: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 1.9 KiB/s wr, 36 op/s
Dec 06 10:18:10 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:10.939 2 INFO neutron.agent.securitygroups_rpc [None req-809d6155-5d31-4aee-97b1-907b0d1ee5ee a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e140 do_prune osdmap full prune enabled
Dec 06 10:18:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e141 e141: 6 total, 6 up, 6 in
Dec 06 10:18:11 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e141: 6 total, 6 up, 6 in
Dec 06 10:18:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:11.928 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:11.930 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:18:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:11.929 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e141 do_prune osdmap full prune enabled
Dec 06 10:18:12 np0005548788.localdomain ceph-mon[293643]: pgmap v243: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 1.9 KiB/s wr, 36 op/s
Dec 06 10:18:12 np0005548788.localdomain ceph-mon[293643]: osdmap e141: 6 total, 6 up, 6 in
Dec 06 10:18:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e142 e142: 6 total, 6 up, 6 in
Dec 06 10:18:12 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e142: 6 total, 6 up, 6 in
Dec 06 10:18:13 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:13.075 262572 INFO neutron.agent.linux.ip_lib [None req-2adcddf7-66bc-45dc-aaf6-44227e4ebaa5 - - - - - -] Device tap789b40b3-af cannot be used as it has no MAC address
Dec 06 10:18:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:13.099 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548788.localdomain kernel: device tap789b40b3-af entered promiscuous mode
Dec 06 10:18:13 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016293.1076] manager: (tap789b40b3-af): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Dec 06 10:18:13 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:13Z|00127|binding|INFO|Claiming lport 789b40b3-afd7-4a88-9017-1b2660dc1d27 for this chassis.
Dec 06 10:18:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:13.107 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:13Z|00128|binding|INFO|789b40b3-afd7-4a88-9017-1b2660dc1d27: Claiming unknown
Dec 06 10:18:13 np0005548788.localdomain systemd-udevd[316770]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:13.117 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe87:a62d/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-935ef55e-b28d-4ced-ba10-0d7f1006f139', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-935ef55e-b28d-4ced-ba10-0d7f1006f139', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2aaeadee6f14b78a73f8886be99b671', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a775a086-0f2b-45df-9f2e-9573d26ea8f8, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=789b40b3-afd7-4a88-9017-1b2660dc1d27) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:13.118 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 789b40b3-afd7-4a88-9017-1b2660dc1d27 in datapath 935ef55e-b28d-4ced-ba10-0d7f1006f139 bound to our chassis
Dec 06 10:18:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:13.119 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 935ef55e-b28d-4ced-ba10-0d7f1006f139 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:13.120 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[f8655ec2-3843-432d-9d57-6069a8f595f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:13 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:13Z|00129|binding|INFO|Setting lport 789b40b3-afd7-4a88-9017-1b2660dc1d27 ovn-installed in OVS
Dec 06 10:18:13 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:13Z|00130|binding|INFO|Setting lport 789b40b3-afd7-4a88-9017-1b2660dc1d27 up in Southbound
Dec 06 10:18:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:13.122 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:13.123 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap789b40b3-af: No such device
Dec 06 10:18:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:13.149 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap789b40b3-af: No such device
Dec 06 10:18:13 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap789b40b3-af: No such device
Dec 06 10:18:13 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap789b40b3-af: No such device
Dec 06 10:18:13 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap789b40b3-af: No such device
Dec 06 10:18:13 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap789b40b3-af: No such device
Dec 06 10:18:13 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap789b40b3-af: No such device
Dec 06 10:18:13 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap789b40b3-af: No such device
Dec 06 10:18:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:13.190 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:13.219 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548788.localdomain ceph-mon[293643]: osdmap e142: 6 total, 6 up, 6 in
Dec 06 10:18:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2958126727' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2958126727' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:13.275 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:13.748 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:13.749 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:13.750 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:13.751 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[40039968-5da8-4018-a6f1-8807fb1ca82b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:14 np0005548788.localdomain podman[316839]: 
Dec 06 10:18:14 np0005548788.localdomain podman[316839]: 2025-12-06 10:18:14.05731237 +0000 UTC m=+0.095555027 container create 878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-935ef55e-b28d-4ced-ba10-0d7f1006f139, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:18:14 np0005548788.localdomain systemd[1]: Started libpod-conmon-878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372.scope.
Dec 06 10:18:14 np0005548788.localdomain podman[316839]: 2025-12-06 10:18:14.011470921 +0000 UTC m=+0.049713608 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:14 np0005548788.localdomain systemd[1]: tmp-crun.SeTgoP.mount: Deactivated successfully.
Dec 06 10:18:14 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:14 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba1862b425058bd3d845160fbceda27c8612fc7e822f4630bfbccd31cebb533e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:14 np0005548788.localdomain podman[316839]: 2025-12-06 10:18:14.162159662 +0000 UTC m=+0.200402319 container init 878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-935ef55e-b28d-4ced-ba10-0d7f1006f139, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:18:14 np0005548788.localdomain podman[316839]: 2025-12-06 10:18:14.168478066 +0000 UTC m=+0.206720723 container start 878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-935ef55e-b28d-4ced-ba10-0d7f1006f139, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.171 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port cb561ed0-1ffa-4c26-a43d-38ab282d0440 with type ""
Dec 06 10:18:14 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:14Z|00131|binding|INFO|Removing iface tap789b40b3-af ovn-installed in OVS
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.172 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe87:a62d/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-935ef55e-b28d-4ced-ba10-0d7f1006f139', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-935ef55e-b28d-4ced-ba10-0d7f1006f139', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2aaeadee6f14b78a73f8886be99b671', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a775a086-0f2b-45df-9f2e-9573d26ea8f8, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=789b40b3-afd7-4a88-9017-1b2660dc1d27) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.173 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 789b40b3-afd7-4a88-9017-1b2660dc1d27 in datapath 935ef55e-b28d-4ced-ba10-0d7f1006f139 unbound from our chassis
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.173 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 935ef55e-b28d-4ced-ba10-0d7f1006f139 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.174 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c1c7e2-49ec-4e03-88dc-93605b038c4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:14 np0005548788.localdomain dnsmasq[316858]: started, version 2.85 cachesize 150
Dec 06 10:18:14 np0005548788.localdomain dnsmasq[316858]: DNS service limited to local subnets
Dec 06 10:18:14 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:14Z|00132|binding|INFO|Removing lport 789b40b3-afd7-4a88-9017-1b2660dc1d27 ovn-installed in OVS
Dec 06 10:18:14 np0005548788.localdomain dnsmasq[316858]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:14 np0005548788.localdomain dnsmasq[316858]: warning: no upstream servers configured
Dec 06 10:18:14 np0005548788.localdomain dnsmasq[316858]: read /var/lib/neutron/dhcp/935ef55e-b28d-4ced-ba10-0d7f1006f139/addn_hosts - 0 addresses
Dec 06 10:18:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:14.178 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:14.180 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e142 do_prune osdmap full prune enabled
Dec 06 10:18:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:14.265 262572 INFO neutron.agent.dhcp.agent [None req-5f22db0f-205b-4e0c-850b-4c478445f057 - - - - - -] DHCP configuration for ports {'e3fa50eb-216a-41ab-be99-4371b7a0f20e'} is completed
Dec 06 10:18:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e143 e143: 6 total, 6 up, 6 in
Dec 06 10:18:14 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e143: 6 total, 6 up, 6 in
Dec 06 10:18:14 np0005548788.localdomain ceph-mon[293643]: pgmap v246: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 3.2 KiB/s wr, 57 op/s
Dec 06 10:18:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:14.435 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548788.localdomain dnsmasq[316858]: exiting on receipt of SIGTERM
Dec 06 10:18:14 np0005548788.localdomain podman[316874]: 2025-12-06 10:18:14.507577517 +0000 UTC m=+0.068567398 container kill 878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-935ef55e-b28d-4ced-ba10-0d7f1006f139, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:14 np0005548788.localdomain systemd[1]: libpod-878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372.scope: Deactivated successfully.
Dec 06 10:18:14 np0005548788.localdomain podman[316886]: 2025-12-06 10:18:14.59164706 +0000 UTC m=+0.064988838 container died 878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-935ef55e-b28d-4ced-ba10-0d7f1006f139, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:18:14 np0005548788.localdomain podman[316886]: 2025-12-06 10:18:14.622282512 +0000 UTC m=+0.095624250 container cleanup 878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-935ef55e-b28d-4ced-ba10-0d7f1006f139, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:18:14 np0005548788.localdomain systemd[1]: libpod-conmon-878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372.scope: Deactivated successfully.
Dec 06 10:18:14 np0005548788.localdomain podman[316888]: 2025-12-06 10:18:14.673165796 +0000 UTC m=+0.138044984 container remove 878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-935ef55e-b28d-4ced-ba10-0d7f1006f139, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:18:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:14.687 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548788.localdomain kernel: device tap789b40b3-af left promiscuous mode
Dec 06 10:18:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:14.699 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:14.714 262572 INFO neutron.agent.dhcp.agent [None req-c3cfc58f-34ee-4437-b77a-cf74715d17f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:14.715 262572 INFO neutron.agent.dhcp.agent [None req-c3cfc58f-34ee-4437-b77a-cf74715d17f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.987 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.989 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.992 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:14 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:14.993 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[903c7203-330a-4e04-8e2d-6572e1929441]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ba1862b425058bd3d845160fbceda27c8612fc7e822f4630bfbccd31cebb533e-merged.mount: Deactivated successfully.
Dec 06 10:18:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-878828fb5a2b08f71a7ecb455658c797f659d50952fc8bb68ae2a4817b202372-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:15 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d935ef55e\x2db28d\x2d4ced\x2dba10\x2d0d7f1006f139.mount: Deactivated successfully.
Dec 06 10:18:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:15.144 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e143 do_prune osdmap full prune enabled
Dec 06 10:18:15 np0005548788.localdomain ceph-mon[293643]: osdmap e143: 6 total, 6 up, 6 in
Dec 06 10:18:15 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4158350361' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:15 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4158350361' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e144 e144: 6 total, 6 up, 6 in
Dec 06 10:18:15 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e144: 6 total, 6 up, 6 in
Dec 06 10:18:16 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:16.112 2 INFO neutron.agent.securitygroups_rpc [None req-6fa383fb-a4a1-4db9-8964-14f7246d83c2 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e144 do_prune osdmap full prune enabled
Dec 06 10:18:16 np0005548788.localdomain ceph-mon[293643]: pgmap v248: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 2.5 KiB/s wr, 55 op/s
Dec 06 10:18:16 np0005548788.localdomain ceph-mon[293643]: osdmap e144: 6 total, 6 up, 6 in
Dec 06 10:18:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e145 e145: 6 total, 6 up, 6 in
Dec 06 10:18:16 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e145: 6 total, 6 up, 6 in
Dec 06 10:18:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:17.333 2 INFO neutron.agent.securitygroups_rpc [None req-034cc1e4-4fb9-4793-8ac5-168cd3b3cb7e a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:17 np0005548788.localdomain ceph-mon[293643]: osdmap e145: 6 total, 6 up, 6 in
Dec 06 10:18:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1565843773' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1565843773' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:18.276 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:18 np0005548788.localdomain ceph-mon[293643]: pgmap v251: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 3.1 KiB/s wr, 67 op/s
Dec 06 10:18:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3956098805' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3956098805' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1565843773' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1565843773' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:18.932 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:18:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:18:19 np0005548788.localdomain systemd[1]: tmp-crun.0EsmWd.mount: Deactivated successfully.
Dec 06 10:18:19 np0005548788.localdomain podman[316918]: 2025-12-06 10:18:19.279521902 +0000 UTC m=+0.098029654 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:19 np0005548788.localdomain podman[316918]: 2025-12-06 10:18:19.325903496 +0000 UTC m=+0.144411328 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:19 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:18:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:18:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:18:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:18:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158570 "" "Go-http-client/1.1"
Dec 06 10:18:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:18:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19680 "" "Go-http-client/1.1"
Dec 06 10:18:19 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:19.737 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:19 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:19.739 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:19 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:19.743 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:19 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:19.746 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[e05bc6e2-c96b-4237-bd35-f28d0c7f7f20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:20.148 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3253677710' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3253677710' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: pgmap v252: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 127 KiB/s rd, 8.0 KiB/s wr, 172 op/s
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3253677710' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3253677710' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4142291706' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4142291706' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:21 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 06 10:18:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:18:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:18:21 np0005548788.localdomain podman[316938]: 2025-12-06 10:18:21.28137694 +0000 UTC m=+0.096475115 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:18:21 np0005548788.localdomain podman[316938]: 2025-12-06 10:18:21.29177052 +0000 UTC m=+0.106868685 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:18:21 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:18:21 np0005548788.localdomain podman[316939]: 2025-12-06 10:18:21.386243233 +0000 UTC m=+0.196718556 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:18:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e145 do_prune osdmap full prune enabled
Dec 06 10:18:21 np0005548788.localdomain podman[316939]: 2025-12-06 10:18:21.421770675 +0000 UTC m=+0.232245978 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:18:21 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:18:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e146 e146: 6 total, 6 up, 6 in
Dec 06 10:18:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4142291706' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4142291706' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:21 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e146: 6 total, 6 up, 6 in
Dec 06 10:18:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e146 do_prune osdmap full prune enabled
Dec 06 10:18:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e147 e147: 6 total, 6 up, 6 in
Dec 06 10:18:22 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e147: 6 total, 6 up, 6 in
Dec 06 10:18:22 np0005548788.localdomain ceph-mon[293643]: pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 111 KiB/s rd, 7.0 KiB/s wr, 151 op/s
Dec 06 10:18:22 np0005548788.localdomain ceph-mon[293643]: osdmap e146: 6 total, 6 up, 6 in
Dec 06 10:18:22 np0005548788.localdomain ceph-mon[293643]: osdmap e147: 6 total, 6 up, 6 in
Dec 06 10:18:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:22.475 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:22.477 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:22.480 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:22.481 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[016a81d1-13d3-4968-85d8-598dd37977a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:23.279 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e147 do_prune osdmap full prune enabled
Dec 06 10:18:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e148 e148: 6 total, 6 up, 6 in
Dec 06 10:18:23 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e148: 6 total, 6 up, 6 in
Dec 06 10:18:23 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/298180858' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:23 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/298180858' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:23 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:23.642 2 INFO neutron.agent.securitygroups_rpc [None req-b79a01a3-8e64-4889-8420-e298cffcfc58 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:23 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:23.725 2 INFO neutron.agent.securitygroups_rpc [None req-9f63fce7-8a34-4731-bfa7-9d45ada3f54e 4c6008178bdc445aa99fb1b726f87b45 a2aaeadee6f14b78a73f8886be99b671 - - default default] Security group member updated ['13551db8-e8e0-43b0-89a9-b0d8423e74c9']
Dec 06 10:18:24 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:24Z|00133|ovn_bfd|INFO|Disabled BFD on interface ovn-afa07b-0
Dec 06 10:18:24 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:24Z|00134|ovn_bfd|INFO|Disabled BFD on interface ovn-bd2a75-0
Dec 06 10:18:24 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:24Z|00135|ovn_bfd|INFO|Disabled BFD on interface ovn-ca3c1f-0
Dec 06 10:18:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:24.304 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:24.324 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:24.327 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:24 np0005548788.localdomain podman[316996]: 2025-12-06 10:18:24.473630381 +0000 UTC m=+0.063933916 container kill d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:18:24 np0005548788.localdomain dnsmasq[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/addn_hosts - 0 addresses
Dec 06 10:18:24 np0005548788.localdomain dnsmasq-dhcp[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/host
Dec 06 10:18:24 np0005548788.localdomain dnsmasq-dhcp[316355]: read /var/lib/neutron/dhcp/e7134938-cb89-4050-bde5-c9273441c423/opts
Dec 06 10:18:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e148 do_prune osdmap full prune enabled
Dec 06 10:18:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e149 e149: 6 total, 6 up, 6 in
Dec 06 10:18:24 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e149: 6 total, 6 up, 6 in
Dec 06 10:18:24 np0005548788.localdomain ceph-mon[293643]: pgmap v256: 177 pgs: 177 active+clean; 249 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 204 KiB/s rd, 15 MiB/s wr, 282 op/s
Dec 06 10:18:24 np0005548788.localdomain ceph-mon[293643]: osdmap e148: 6 total, 6 up, 6 in
Dec 06 10:18:24 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:24.636 2 INFO neutron.agent.securitygroups_rpc [None req-366a0057-fc3f-46e6-9a84-ba466e35126f a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:24.698 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:24 np0005548788.localdomain kernel: device tapaf6a5ec1-24 left promiscuous mode
Dec 06 10:18:24 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:24Z|00136|binding|INFO|Releasing lport af6a5ec1-243d-4374-a6bb-28aef20ea93f from this chassis (sb_readonly=0)
Dec 06 10:18:24 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:24Z|00137|binding|INFO|Setting lport af6a5ec1-243d-4374-a6bb-28aef20ea93f down in Southbound
Dec 06 10:18:24 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:24.708 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-e7134938-cb89-4050-bde5-c9273441c423', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7134938-cb89-4050-bde5-c9273441c423', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82e66d987af642e79e2539d816511c7b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c31cd2fe-5b9a-4a3d-99fd-81d43928afea, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=af6a5ec1-243d-4374-a6bb-28aef20ea93f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:24 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:24.710 159620 INFO neutron.agent.ovn.metadata.agent [-] Port af6a5ec1-243d-4374-a6bb-28aef20ea93f in datapath e7134938-cb89-4050-bde5-c9273441c423 unbound from our chassis
Dec 06 10:18:24 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:24.713 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e7134938-cb89-4050-bde5-c9273441c423, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:24 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:24.714 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc5614d-0539-41a1-b2db-9b5f7b57bda8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:24.725 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:25.181 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:25 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:25.374 2 INFO neutron.agent.securitygroups_rpc [None req-9f7062a2-5eeb-4deb-87a1-858e2e900cdd 4c6008178bdc445aa99fb1b726f87b45 a2aaeadee6f14b78a73f8886be99b671 - - default default] Security group member updated ['13551db8-e8e0-43b0-89a9-b0d8423e74c9']
Dec 06 10:18:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e149 do_prune osdmap full prune enabled
Dec 06 10:18:25 np0005548788.localdomain ceph-mon[293643]: osdmap e149: 6 total, 6 up, 6 in
Dec 06 10:18:25 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1581079548' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:25 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1581079548' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e150 e150: 6 total, 6 up, 6 in
Dec 06 10:18:25 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e150: 6 total, 6 up, 6 in
Dec 06 10:18:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e150 do_prune osdmap full prune enabled
Dec 06 10:18:26 np0005548788.localdomain ceph-mon[293643]: pgmap v259: 177 pgs: 177 active+clean; 249 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 155 KiB/s rd, 26 MiB/s wr, 220 op/s
Dec 06 10:18:26 np0005548788.localdomain ceph-mon[293643]: osdmap e150: 6 total, 6 up, 6 in
Dec 06 10:18:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e151 e151: 6 total, 6 up, 6 in
Dec 06 10:18:26 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e151: 6 total, 6 up, 6 in
Dec 06 10:18:26 np0005548788.localdomain dnsmasq[316675]: exiting on receipt of SIGTERM
Dec 06 10:18:26 np0005548788.localdomain podman[317037]: 2025-12-06 10:18:26.771912189 +0000 UTC m=+0.068543256 container kill a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d747da61-f366-4c62-b8b8-85e410a4e587, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:18:26 np0005548788.localdomain systemd[1]: libpod-a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9.scope: Deactivated successfully.
Dec 06 10:18:26 np0005548788.localdomain podman[317050]: 2025-12-06 10:18:26.835870485 +0000 UTC m=+0.054213277 container died a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d747da61-f366-4c62-b8b8-85e410a4e587, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:18:26 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:26 np0005548788.localdomain podman[317050]: 2025-12-06 10:18:26.867304701 +0000 UTC m=+0.085647453 container cleanup a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d747da61-f366-4c62-b8b8-85e410a4e587, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:26 np0005548788.localdomain systemd[1]: libpod-conmon-a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9.scope: Deactivated successfully.
Dec 06 10:18:26 np0005548788.localdomain podman[317057]: 2025-12-06 10:18:26.925304874 +0000 UTC m=+0.128976265 container remove a30cea174155a5ad4ac16bb01096c3ee86b75efef804363dd6d30790a10e50e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d747da61-f366-4c62-b8b8-85e410a4e587, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:18:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:26.937 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:26 np0005548788.localdomain kernel: device tapee0bfbaa-e1 left promiscuous mode
Dec 06 10:18:26 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:26Z|00138|binding|INFO|Releasing lport ee0bfbaa-e117-417c-8f91-c71c171d694f from this chassis (sb_readonly=0)
Dec 06 10:18:26 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:26Z|00139|binding|INFO|Setting lport ee0bfbaa-e117-417c-8f91-c71c171d694f down in Southbound
Dec 06 10:18:26 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:26.945 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-d747da61-f366-4c62-b8b8-85e410a4e587', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d747da61-f366-4c62-b8b8-85e410a4e587', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2aaeadee6f14b78a73f8886be99b671', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44dbf22d-4ec3-4a84-9020-b5eba9dacdbc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=ee0bfbaa-e117-417c-8f91-c71c171d694f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:26 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:26.947 159620 INFO neutron.agent.ovn.metadata.agent [-] Port ee0bfbaa-e117-417c-8f91-c71c171d694f in datapath d747da61-f366-4c62-b8b8-85e410a4e587 unbound from our chassis
Dec 06 10:18:26 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:26.948 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d747da61-f366-4c62-b8b8-85e410a4e587 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:26 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:26.949 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb0eb6a-1228-42aa-a682-1b038cce504b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:26.958 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:26 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:26.976 262572 INFO neutron.agent.dhcp.agent [None req-0c5d5d8a-e02e-41e9-a0f1-cb28b9df7321 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:27.168 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:27.170 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:27.173 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:27.174 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[79175fac-7b51-407f-a7a6-465a21875d42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:27 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:27.289 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e151 do_prune osdmap full prune enabled
Dec 06 10:18:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e152 e152: 6 total, 6 up, 6 in
Dec 06 10:18:27 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e152: 6 total, 6 up, 6 in
Dec 06 10:18:27 np0005548788.localdomain ceph-mon[293643]: osdmap e151: 6 total, 6 up, 6 in
Dec 06 10:18:27 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f0f98497d1a8feb5ba1758d63b1509fead819e9ffa51a9d76606099b377b2b4e-merged.mount: Deactivated successfully.
Dec 06 10:18:27 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2dd747da61\x2df366\x2d4c62\x2db8b8\x2d85e410a4e587.mount: Deactivated successfully.
Dec 06 10:18:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:27.873 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:28 np0005548788.localdomain dnsmasq[316355]: exiting on receipt of SIGTERM
Dec 06 10:18:28 np0005548788.localdomain podman[317098]: 2025-12-06 10:18:28.092540484 +0000 UTC m=+0.062416310 container kill d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:18:28 np0005548788.localdomain systemd[1]: libpod-d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5.scope: Deactivated successfully.
Dec 06 10:18:28 np0005548788.localdomain podman[317112]: 2025-12-06 10:18:28.172348556 +0000 UTC m=+0.060921682 container died d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:18:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:28 np0005548788.localdomain podman[317112]: 2025-12-06 10:18:28.20890769 +0000 UTC m=+0.097480766 container cleanup d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 10:18:28 np0005548788.localdomain systemd[1]: libpod-conmon-d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5.scope: Deactivated successfully.
Dec 06 10:18:28 np0005548788.localdomain podman[317113]: 2025-12-06 10:18:28.254182221 +0000 UTC m=+0.137673492 container remove d56b38224cef2efa58b95f9aa4276502bfcc1b3571d9f2acbbae723de7c8d9a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e7134938-cb89-4050-bde5-c9273441c423, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:18:28 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:28.276 262572 INFO neutron.agent.dhcp.agent [None req-b4981fb4-ffe0-4e33-8f5c-5149cad5dc3d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:28 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:28.278 262572 INFO neutron.agent.dhcp.agent [None req-b4981fb4-ffe0-4e33-8f5c-5149cad5dc3d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:28.281 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:28 np0005548788.localdomain ceph-mon[293643]: pgmap v262: 177 pgs: 177 active+clean; 249 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:18:28 np0005548788.localdomain ceph-mon[293643]: osdmap e152: 6 total, 6 up, 6 in
Dec 06 10:18:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:28.657 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:28.660 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:28.662 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:28.663 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[770b97ef-8ff2-44bc-a647-f2a731688a44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3964149539' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3964149539' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:28 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-38b7c834555bbfef7f0b6517ca49182f0d5d2d6942a1519a49e8534a4e0eecbf-merged.mount: Deactivated successfully.
Dec 06 10:18:28 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2de7134938\x2dcb89\x2d4050\x2dbde5\x2dc9273441c423.mount: Deactivated successfully.
Dec 06 10:18:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3964149539' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3964149539' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:29.906 2 INFO neutron.agent.securitygroups_rpc [None req-cc7e06ae-2215-4c85-8ca6-e56c13503fc8 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:30.234 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:30 np0005548788.localdomain ceph-mon[293643]: pgmap v264: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 161 KiB/s rd, 1.7 MiB/s wr, 226 op/s
Dec 06 10:18:30 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:30.759 2 INFO neutron.agent.securitygroups_rpc [None req-e8307117-28c2-4262-9c6e-dc24bf4a796c a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e152 do_prune osdmap full prune enabled
Dec 06 10:18:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e153 e153: 6 total, 6 up, 6 in
Dec 06 10:18:32 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e153: 6 total, 6 up, 6 in
Dec 06 10:18:32 np0005548788.localdomain ceph-mon[293643]: pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 1.3 MiB/s wr, 174 op/s
Dec 06 10:18:32 np0005548788.localdomain ceph-mon[293643]: osdmap e153: 6 total, 6 up, 6 in
Dec 06 10:18:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:33.284 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:33 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:33.526 2 INFO neutron.agent.securitygroups_rpc [None req-dd900d05-ceed-4a76-8792-94f73f7d9bdc b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:33.807 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:33.809 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:33.811 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:33.812 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[32b4d6ec-3ebd-4817-a3f6-b4f5e3bba78a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:34 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:34.363 2 INFO neutron.agent.securitygroups_rpc [None req-155fc4a8-22cb-4d06-82dd-8cfd5b79a9e9 8705da02a69e4c3281916dd7bc9ac6d1 851f2bb5c4164322946aa41fe266eb66 - - default default] Security group member updated ['6607cea2-9b0f-45af-9864-1af2923eb94b']
Dec 06 10:18:34 np0005548788.localdomain ceph-mon[293643]: pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 133 KiB/s rd, 1.2 MiB/s wr, 184 op/s
Dec 06 10:18:34 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2082246135' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:34 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:34.833 2 INFO neutron.agent.securitygroups_rpc [None req-d5c62043-5321-4cf5-baec-1c1605bc1cd9 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:35.048 2 INFO neutron.agent.securitygroups_rpc [None req-f9e16e57-d76f-4e49-8c54-adacc8516f8a b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:35.147 2 INFO neutron.agent.securitygroups_rpc [None req-fe503d20-8e49-4871-94e0-efabb011ed42 8705da02a69e4c3281916dd7bc9ac6d1 851f2bb5c4164322946aa41fe266eb66 - - default default] Security group member updated ['6607cea2-9b0f-45af-9864-1af2923eb94b']
Dec 06 10:18:35 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:18:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:35.270 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:35 np0005548788.localdomain systemd[1]: tmp-crun.7GuxRq.mount: Deactivated successfully.
Dec 06 10:18:35 np0005548788.localdomain podman[317139]: 2025-12-06 10:18:35.284958123 +0000 UTC m=+0.108793505 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:35 np0005548788.localdomain podman[317139]: 2025-12-06 10:18:35.359734251 +0000 UTC m=+0.183569633 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:18:35 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:18:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:35.423 2 INFO neutron.agent.securitygroups_rpc [None req-f9e16e57-d76f-4e49-8c54-adacc8516f8a b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:35.688 2 INFO neutron.agent.securitygroups_rpc [None req-4e477950-abaa-4886-9df8-9dd5bb5175a4 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1644367828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:36.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:36.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:18:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:36.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:36.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:18:36 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:36.117 2 INFO neutron.agent.securitygroups_rpc [None req-14f85306-cb54-46c0-a6f6-e09d3e175b2a b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:36 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:36.652 2 INFO neutron.agent.securitygroups_rpc [None req-af2f65bf-97b7-4bb0-b9fe-3c28224c3c96 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:36 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:36.692 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:36 np0005548788.localdomain ceph-mon[293643]: pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 109 KiB/s rd, 1.0 MiB/s wr, 151 op/s
Dec 06 10:18:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:38.025 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:38 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:38.028 2 INFO neutron.agent.securitygroups_rpc [None req-1f216153-8df9-4f5f-9520-bf151df27051 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e153 do_prune osdmap full prune enabled
Dec 06 10:18:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:18:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:18:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e154 e154: 6 total, 6 up, 6 in
Dec 06 10:18:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:18:38 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e154: 6 total, 6 up, 6 in
Dec 06 10:18:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:38.287 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548788.localdomain podman[317164]: 2025-12-06 10:18:38.288667829 +0000 UTC m=+0.107583657 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:18:38 np0005548788.localdomain podman[317165]: 2025-12-06 10:18:38.330234027 +0000 UTC m=+0.142950004 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:18:38 np0005548788.localdomain podman[317164]: 2025-12-06 10:18:38.33391886 +0000 UTC m=+0.152834648 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true)
Dec 06 10:18:38 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:18:38 np0005548788.localdomain podman[317165]: 2025-12-06 10:18:38.367681968 +0000 UTC m=+0.180397945 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:18:38 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:18:38 np0005548788.localdomain podman[317166]: 2025-12-06 10:18:38.424864765 +0000 UTC m=+0.235576240 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Dec 06 10:18:38 np0005548788.localdomain podman[317166]: 2025-12-06 10:18:38.442635131 +0000 UTC m=+0.253346536 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Dec 06 10:18:38 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:18:38 np0005548788.localdomain ceph-mon[293643]: pgmap v269: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 863 KiB/s wr, 126 op/s
Dec 06 10:18:38 np0005548788.localdomain ceph-mon[293643]: osdmap e154: 6 total, 6 up, 6 in
Dec 06 10:18:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:18:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:18:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:18:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:18:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:18:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:18:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:18:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:18:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:39.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:39 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:39.030 2 INFO neutron.agent.securitygroups_rpc [None req-df5d0c63-3dbc-41ec-8a7e-d627e1beca42 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1177396961' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1177396961' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:39 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:39.882 2 INFO neutron.agent.securitygroups_rpc [None req-57bddb58-8e7b-4200-a14c-4d9431ae075f b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:40 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:40.247 262572 INFO neutron.agent.linux.ip_lib [None req-0600c422-846f-4451-9371-daf6405d80f6 - - - - - -] Device tapc036f5bf-a9 cannot be used as it has no MAC address
Dec 06 10:18:40 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:40.272 2 INFO neutron.agent.securitygroups_rpc [None req-dae126c9-280d-4a4d-ad9b-17df376d8729 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:40.312 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:40.320 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548788.localdomain kernel: device tapc036f5bf-a9 entered promiscuous mode
Dec 06 10:18:40 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016320.3286] manager: (tapc036f5bf-a9): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Dec 06 10:18:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:40.331 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:40Z|00140|binding|INFO|Claiming lport c036f5bf-a988-48f3-b8ce-bd0fe1ab6099 for this chassis.
Dec 06 10:18:40 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:40Z|00141|binding|INFO|c036f5bf-a988-48f3-b8ce-bd0fe1ab6099: Claiming unknown
Dec 06 10:18:40 np0005548788.localdomain systemd-udevd[317238]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:40 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:40.340 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-67f408d1-ef97-4090-9fb1-d5e305e5ffa5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67f408d1-ef97-4090-9fb1-d5e305e5ffa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50977625-0add-42be-b555-d8a885f97cb7, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=c036f5bf-a988-48f3-b8ce-bd0fe1ab6099) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:40 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:40.341 159620 INFO neutron.agent.ovn.metadata.agent [-] Port c036f5bf-a988-48f3-b8ce-bd0fe1ab6099 in datapath 67f408d1-ef97-4090-9fb1-d5e305e5ffa5 bound to our chassis
Dec 06 10:18:40 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:40.343 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9975dad3-6602-4c85-bc8a-e46ea3bf58b1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:40 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:40.343 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 67f408d1-ef97-4090-9fb1-d5e305e5ffa5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:40 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:40.344 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a7eca3a5-42df-4883-8243-50b923484ce9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:40 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:40.352 2 INFO neutron.agent.securitygroups_rpc [None req-e4bf1752-f0a3-4484-84a7-e670337a989c b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:40 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc036f5bf-a9: No such device
Dec 06 10:18:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:40.364 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:40Z|00142|binding|INFO|Setting lport c036f5bf-a988-48f3-b8ce-bd0fe1ab6099 ovn-installed in OVS
Dec 06 10:18:40 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:40Z|00143|binding|INFO|Setting lport c036f5bf-a988-48f3-b8ce-bd0fe1ab6099 up in Southbound
Dec 06 10:18:40 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc036f5bf-a9: No such device
Dec 06 10:18:40 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc036f5bf-a9: No such device
Dec 06 10:18:40 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc036f5bf-a9: No such device
Dec 06 10:18:40 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc036f5bf-a9: No such device
Dec 06 10:18:40 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc036f5bf-a9: No such device
Dec 06 10:18:40 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc036f5bf-a9: No such device
Dec 06 10:18:40 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc036f5bf-a9: No such device
Dec 06 10:18:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:40.401 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:40.430 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548788.localdomain ceph-mon[293643]: pgmap v271: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 895 B/s wr, 25 op/s
Dec 06 10:18:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e154 do_prune osdmap full prune enabled
Dec 06 10:18:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e155 e155: 6 total, 6 up, 6 in
Dec 06 10:18:40 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e155: 6 total, 6 up, 6 in
Dec 06 10:18:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:41.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:41.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:41 np0005548788.localdomain podman[317309]: 
Dec 06 10:18:41 np0005548788.localdomain podman[317309]: 2025-12-06 10:18:41.429336916 +0000 UTC m=+0.089453590 container create aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f408d1-ef97-4090-9fb1-d5e305e5ffa5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:18:41 np0005548788.localdomain systemd[1]: Started libpod-conmon-aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40.scope.
Dec 06 10:18:41 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:41 np0005548788.localdomain podman[317309]: 2025-12-06 10:18:41.388653915 +0000 UTC m=+0.048770659 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:41 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73c623faf063fa6cfb7e402194b7be00497968d49def09753f068a74ea8f8ff1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:41 np0005548788.localdomain podman[317309]: 2025-12-06 10:18:41.503627278 +0000 UTC m=+0.163743962 container init aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f408d1-ef97-4090-9fb1-d5e305e5ffa5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:41 np0005548788.localdomain podman[317309]: 2025-12-06 10:18:41.516652239 +0000 UTC m=+0.176768923 container start aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f408d1-ef97-4090-9fb1-d5e305e5ffa5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:18:41 np0005548788.localdomain dnsmasq[317328]: started, version 2.85 cachesize 150
Dec 06 10:18:41 np0005548788.localdomain dnsmasq[317328]: DNS service limited to local subnets
Dec 06 10:18:41 np0005548788.localdomain dnsmasq[317328]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:41 np0005548788.localdomain dnsmasq[317328]: warning: no upstream servers configured
Dec 06 10:18:41 np0005548788.localdomain dnsmasq-dhcp[317328]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:18:41 np0005548788.localdomain dnsmasq[317328]: read /var/lib/neutron/dhcp/67f408d1-ef97-4090-9fb1-d5e305e5ffa5/addn_hosts - 0 addresses
Dec 06 10:18:41 np0005548788.localdomain dnsmasq-dhcp[317328]: read /var/lib/neutron/dhcp/67f408d1-ef97-4090-9fb1-d5e305e5ffa5/host
Dec 06 10:18:41 np0005548788.localdomain dnsmasq-dhcp[317328]: read /var/lib/neutron/dhcp/67f408d1-ef97-4090-9fb1-d5e305e5ffa5/opts
Dec 06 10:18:41 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:41.534 2 INFO neutron.agent.securitygroups_rpc [None req-652b9b12-ec43-4a29-b268-053f1f58f2a3 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:41 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:41.717 262572 INFO neutron.agent.dhcp.agent [None req-dcdef551-712e-4a49-bf68-0c8133ba0a50 - - - - - -] DHCP configuration for ports {'229f8f2e-0e8d-4274-a951-ef6a44d1c4bc'} is completed
Dec 06 10:18:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e155 do_prune osdmap full prune enabled
Dec 06 10:18:41 np0005548788.localdomain ceph-mon[293643]: osdmap e155: 6 total, 6 up, 6 in
Dec 06 10:18:41 np0005548788.localdomain ceph-mon[293643]: pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 3.6 KiB/s rd, 511 B/s wr, 5 op/s
Dec 06 10:18:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e156 e156: 6 total, 6 up, 6 in
Dec 06 10:18:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e156: 6 total, 6 up, 6 in
Dec 06 10:18:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:41Z|00144|binding|INFO|Removing iface tapc036f5bf-a9 ovn-installed in OVS
Dec 06 10:18:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:41.807 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 9975dad3-6602-4c85-bc8a-e46ea3bf58b1 with type ""
Dec 06 10:18:41 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:18:41Z|00145|binding|INFO|Removing lport c036f5bf-a988-48f3-b8ce-bd0fe1ab6099 ovn-installed in OVS
Dec 06 10:18:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:41.809 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-67f408d1-ef97-4090-9fb1-d5e305e5ffa5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67f408d1-ef97-4090-9fb1-d5e305e5ffa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50977625-0add-42be-b555-d8a885f97cb7, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=c036f5bf-a988-48f3-b8ce-bd0fe1ab6099) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:41.810 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:41.812 159620 INFO neutron.agent.ovn.metadata.agent [-] Port c036f5bf-a988-48f3-b8ce-bd0fe1ab6099 in datapath 67f408d1-ef97-4090-9fb1-d5e305e5ffa5 unbound from our chassis
Dec 06 10:18:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:41.814 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 67f408d1-ef97-4090-9fb1-d5e305e5ffa5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:41 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:41.815 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6333c6-026f-41e4-9200-754b3b4c754e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:41.817 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:41 np0005548788.localdomain dnsmasq[317328]: exiting on receipt of SIGTERM
Dec 06 10:18:41 np0005548788.localdomain podman[317346]: 2025-12-06 10:18:41.910870033 +0000 UTC m=+0.061837322 container kill aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f408d1-ef97-4090-9fb1-d5e305e5ffa5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:18:41 np0005548788.localdomain systemd[1]: libpod-aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40.scope: Deactivated successfully.
Dec 06 10:18:41 np0005548788.localdomain podman[317359]: 2025-12-06 10:18:41.980686488 +0000 UTC m=+0.057672032 container died aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f408d1-ef97-4090-9fb1-d5e305e5ffa5, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:42.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:42 np0005548788.localdomain podman[317359]: 2025-12-06 10:18:42.020052108 +0000 UTC m=+0.097037602 container cleanup aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f408d1-ef97-4090-9fb1-d5e305e5ffa5, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:18:42 np0005548788.localdomain systemd[1]: libpod-conmon-aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40.scope: Deactivated successfully.
Dec 06 10:18:42 np0005548788.localdomain podman[317366]: 2025-12-06 10:18:42.045873832 +0000 UTC m=+0.108393542 container remove aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f408d1-ef97-4090-9fb1-d5e305e5ffa5, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:42.063 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:42 np0005548788.localdomain kernel: device tapc036f5bf-a9 left promiscuous mode
Dec 06 10:18:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:42.078 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:42 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:42.101 262572 INFO neutron.agent.dhcp.agent [None req-01006125-b7b5-4dde-814d-b259233fb674 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:42 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:42.102 262572 INFO neutron.agent.dhcp.agent [None req-01006125-b7b5-4dde-814d-b259233fb674 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:42 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:42.103 262572 INFO neutron.agent.dhcp.agent [None req-01006125-b7b5-4dde-814d-b259233fb674 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:42 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:42.321 2 INFO neutron.agent.securitygroups_rpc [None req-b75aa7a3-5af1-4cd3-b1b1-cfd423d7e2ab 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:42 np0005548788.localdomain systemd[1]: tmp-crun.MaKGyD.mount: Deactivated successfully.
Dec 06 10:18:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-73c623faf063fa6cfb7e402194b7be00497968d49def09753f068a74ea8f8ff1-merged.mount: Deactivated successfully.
Dec 06 10:18:42 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aaaf0491ead7db99176dfce88147228e407f9619fd88c43376c4bb5e59af4b40-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:42 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d67f408d1\x2def97\x2d4090\x2d9fb1\x2dd5e305e5ffa5.mount: Deactivated successfully.
Dec 06 10:18:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:42.479 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:42 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:42.586 2 INFO neutron.agent.securitygroups_rpc [None req-f4287026-ab49-4456-bd2b-fbcf52c0630e a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1970788286' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1970788286' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:42 np0005548788.localdomain ceph-mon[293643]: osdmap e156: 6 total, 6 up, 6 in
Dec 06 10:18:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1970788286' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1970788286' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.023 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.024 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:43 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:43.024 2 INFO neutron.agent.securitygroups_rpc [None req-55643603-9572-45b2-ae71-f445e3294506 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.046 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.047 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.047 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.047 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.048 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.290 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:18:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/409284351' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.575 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.735 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.736 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11559MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.736 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:43.737 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:43 np0005548788.localdomain ceph-mon[293643]: pgmap v275: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 5.2 KiB/s wr, 111 op/s
Dec 06 10:18:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4278276739' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4278276739' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/409284351' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:44.010 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:18:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:44.011 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:18:44 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:44.058 2 INFO neutron.agent.securitygroups_rpc [None req-255af1be-4d8f-48ce-b409-88074fe3f28a a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:44.242 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:18:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:18:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3870109751' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:44.712 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:18:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:44.719 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:18:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:44.738 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:18:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:44.741 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:18:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:44.742 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:44 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:44.760 2 INFO neutron.agent.securitygroups_rpc [None req-f9323177-d4e4-4dce-bd8f-2cc985b7b1dc 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:44 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2594254949' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:44 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3870109751' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:45 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:45.343 2 INFO neutron.agent.securitygroups_rpc [None req-ec83bf10-c909-4c2a-a1a4-0827521eece8 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:45.362 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:45.724 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:45.724 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:45 np0005548788.localdomain ceph-mon[293643]: pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 67 KiB/s rd, 3.9 KiB/s wr, 89 op/s
Dec 06 10:18:45 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2023953488' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:46 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:46.531 2 INFO neutron.agent.securitygroups_rpc [None req-28a20db1-a3bb-47de-ad39-5c494614c36d 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:47.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:47.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:18:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:47.027 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:18:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e156 do_prune osdmap full prune enabled
Dec 06 10:18:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e157 e157: 6 total, 6 up, 6 in
Dec 06 10:18:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e157: 6 total, 6 up, 6 in
Dec 06 10:18:47 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:47.313 2 INFO neutron.agent.securitygroups_rpc [None req-e6ce568b-b382-4ede-9132-8960f2608a77 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:47.442 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:47.442 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:18:47.442 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:47 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:47.683 2 INFO neutron.agent.securitygroups_rpc [None req-7750bef0-0e9a-45d8-b031-72812daa7ba7 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:48 np0005548788.localdomain ceph-mon[293643]: pgmap v277: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 3.4 KiB/s wr, 77 op/s
Dec 06 10:18:48 np0005548788.localdomain ceph-mon[293643]: osdmap e157: 6 total, 6 up, 6 in
Dec 06 10:18:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:48.293 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:48 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:48.428 2 INFO neutron.agent.securitygroups_rpc [None req-ca0f0e3b-d5e2-4383-9ed8-27fd62359bae 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:48 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:48.811 2 INFO neutron.agent.securitygroups_rpc [None req-18dbc826-4a96-46e3-9a3f-9599a1372c95 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:18:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:18:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:18:49 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e47: np0005548790.kvkfyr(active, since 7m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:18:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:18:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:18:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:18:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:18:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/352148416' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/352148416' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:18:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18733 "" "Go-http-client/1.1"
Dec 06 10:18:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:18:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9cfe70fc-1bee-4bef-bb9d-910ccc1c896e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:18:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9cfe70fc-1bee-4bef-bb9d-910ccc1c896e", "format": "json"}]: dispatch
Dec 06 10:18:50 np0005548788.localdomain ceph-mon[293643]: pgmap v279: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 4.2 KiB/s wr, 86 op/s
Dec 06 10:18:50 np0005548788.localdomain ceph-mon[293643]: mgrmap e47: np0005548790.kvkfyr(active, since 7m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:18:50 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/352148416' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:50 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/352148416' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:50 np0005548788.localdomain podman[317431]: 2025-12-06 10:18:50.258266044 +0000 UTC m=+0.083447346 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:18:50 np0005548788.localdomain podman[317431]: 2025-12-06 10:18:50.272690007 +0000 UTC m=+0.097871299 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:18:50 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:18:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:50.362 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:50 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:50.838 2 INFO neutron.agent.securitygroups_rpc [None req-a73968c5-9b01-4d0f-920e-d4625a021612 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:51.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:51 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:51.393 2 INFO neutron.agent.securitygroups_rpc [None req-3e522469-6174-4bd8-8fa4-46c3281a8670 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:51 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:51.423 2 INFO neutron.agent.securitygroups_rpc [None req-c981ff7c-a5c4-4968-95e1-73b35c2abc32 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:51 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:51.786 2 INFO neutron.agent.securitygroups_rpc [None req-57ba5b5d-0488-433d-83ba-25f85616d546 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:18:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:18:52 np0005548788.localdomain ceph-mon[293643]: pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 3.6 KiB/s wr, 73 op/s
Dec 06 10:18:52 np0005548788.localdomain podman[317449]: 2025-12-06 10:18:52.254322465 +0000 UTC m=+0.081884468 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:18:52 np0005548788.localdomain podman[317449]: 2025-12-06 10:18:52.263979491 +0000 UTC m=+0.091541484 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:18:52 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:18:52 np0005548788.localdomain systemd[1]: tmp-crun.expbV0.mount: Deactivated successfully.
Dec 06 10:18:52 np0005548788.localdomain podman[317450]: 2025-12-06 10:18:52.330448234 +0000 UTC m=+0.151721563 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Dec 06 10:18:52 np0005548788.localdomain podman[317450]: 2025-12-06 10:18:52.335908692 +0000 UTC m=+0.157182061 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:52 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:18:52 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:52.375 2 INFO neutron.agent.securitygroups_rpc [None req-db878ba9-86fb-4d9b-8254-a77b4f9b264f a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:52 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:52.443 2 INFO neutron.agent.securitygroups_rpc [None req-d89000d1-9304-4659-90ea-3fec18561423 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:52 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:52.487 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:53.295 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:53 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:53.634 2 INFO neutron.agent.securitygroups_rpc [None req-7967e84a-5cad-44b5-8ea9-0783854ccdc0 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:54 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:54.135 2 INFO neutron.agent.securitygroups_rpc [None req-47656a87-41ce-4f30-afbc-1720c247f9e1 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:54 np0005548788.localdomain ceph-mon[293643]: pgmap v281: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 4.1 KiB/s wr, 23 op/s
Dec 06 10:18:54 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:54.903 2 INFO neutron.agent.securitygroups_rpc [None req-fc107b59-6198-4194-ae82-133aadd3bf55 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:54.923 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:55 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e48: np0005548790.kvkfyr(active, since 7m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:18:55 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9cfe70fc-1bee-4bef-bb9d-910ccc1c896e", "format": "json"}]: dispatch
Dec 06 10:18:55 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9cfe70fc-1bee-4bef-bb9d-910ccc1c896e", "force": true, "format": "json"}]: dispatch
Dec 06 10:18:55 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:55.353 2 INFO neutron.agent.securitygroups_rpc [None req-6d09d5a9-00b8-48ae-971d-442c4fe5ddd4 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:55.400 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:56 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:56.152 2 INFO neutron.agent.securitygroups_rpc [None req-49caf4e8-1cf3-4ac4-8ed0-7c14787e9a49 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:56 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:18:56.177 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:56 np0005548788.localdomain ceph-mon[293643]: pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 4.1 KiB/s wr, 23 op/s
Dec 06 10:18:56 np0005548788.localdomain ceph-mon[293643]: mgrmap e48: np0005548790.kvkfyr(active, since 7m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:18:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:57 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:57.376 2 INFO neutron.agent.securitygroups_rpc [None req-371fcf7f-6858-4f55-8627-7d44578a7f6c a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:57 np0005548788.localdomain sudo[317491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:18:57 np0005548788.localdomain sudo[317491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:18:57 np0005548788.localdomain sudo[317491]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:57 np0005548788.localdomain sudo[317509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:18:57 np0005548788.localdomain sudo[317509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:18:58 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:58.224 2 INFO neutron.agent.securitygroups_rpc [None req-dcd65c97-9e9b-434b-8250-c459c3b8a42b a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:18:58.298 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:58 np0005548788.localdomain ceph-mon[293643]: pgmap v283: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 4.1 KiB/s wr, 23 op/s
Dec 06 10:18:58 np0005548788.localdomain sudo[317509]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:18:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:18:58 np0005548788.localdomain sudo[317559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:18:58 np0005548788.localdomain sudo[317559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:18:58 np0005548788.localdomain sudo[317559]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:18:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:18:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:18:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:18:59 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:18:59.741 2 INFO neutron.agent.securitygroups_rpc [None req-39208e2c-f7c0-489d-b264-ceaa95c43793 1333c58cfc75447fad1b488a958549ce a269d8afc49848fbb8ce5cdb49ef37dc - - default default] Security group member updated ['a71b7ce5-d152-4b06-83b8-76d380ec29b6']
Dec 06 10:19:00 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:00.368 262572 INFO neutron.agent.linux.ip_lib [None req-a2826a2b-e2a2-4648-b92c-5bf38b00092b - - - - - -] Device tapfe2e8335-1a cannot be used as it has no MAC address
Dec 06 10:19:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:00.429 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:00 np0005548788.localdomain kernel: device tapfe2e8335-1a entered promiscuous mode
Dec 06 10:19:00 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016340.4391] manager: (tapfe2e8335-1a): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Dec 06 10:19:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:00.443 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:00 np0005548788.localdomain systemd-udevd[317587]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:00 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:00Z|00146|binding|INFO|Claiming lport fe2e8335-1a62-40a5-8cfd-a60e344d38eb for this chassis.
Dec 06 10:19:00 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:00Z|00147|binding|INFO|fe2e8335-1a62-40a5-8cfd-a60e344d38eb: Claiming unknown
Dec 06 10:19:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:00.466 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd8b2850-e3e7-477f-8017-199231500400, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=fe2e8335-1a62-40a5-8cfd-a60e344d38eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:00.469 159620 INFO neutron.agent.ovn.metadata.agent [-] Port fe2e8335-1a62-40a5-8cfd-a60e344d38eb in datapath 9beccfed-6ce7-4343-a09a-a10df412729f bound to our chassis
Dec 06 10:19:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:00.472 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 48eab5dd-4ec3-475a-b86b-8bd746b32fcb IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:19:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfe2e8335-1a: No such device
Dec 06 10:19:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:00.473 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9beccfed-6ce7-4343-a09a-a10df412729f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:00 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:00.476 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[62160b8f-075a-4133-b8cf-907d78862b35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfe2e8335-1a: No such device
Dec 06 10:19:00 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:00Z|00148|binding|INFO|Setting lport fe2e8335-1a62-40a5-8cfd-a60e344d38eb ovn-installed in OVS
Dec 06 10:19:00 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:00Z|00149|binding|INFO|Setting lport fe2e8335-1a62-40a5-8cfd-a60e344d38eb up in Southbound
Dec 06 10:19:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:00.481 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfe2e8335-1a: No such device
Dec 06 10:19:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfe2e8335-1a: No such device
Dec 06 10:19:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfe2e8335-1a: No such device
Dec 06 10:19:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfe2e8335-1a: No such device
Dec 06 10:19:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfe2e8335-1a: No such device
Dec 06 10:19:00 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfe2e8335-1a: No such device
Dec 06 10:19:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:00.517 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:00.545 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:00 np0005548788.localdomain ceph-mon[293643]: pgmap v284: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 6.7 KiB/s wr, 35 op/s
Dec 06 10:19:01 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:01.286 2 INFO neutron.agent.securitygroups_rpc [None req-7d4e4684-fcf3-4893-8a23-2e4dea6b64ed 1333c58cfc75447fad1b488a958549ce a269d8afc49848fbb8ce5cdb49ef37dc - - default default] Security group member updated ['a71b7ce5-d152-4b06-83b8-76d380ec29b6']
Dec 06 10:19:01 np0005548788.localdomain podman[317657]: 
Dec 06 10:19:01 np0005548788.localdomain podman[317657]: 2025-12-06 10:19:01.430180905 +0000 UTC m=+0.091105170 container create 35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:19:01 np0005548788.localdomain podman[317657]: 2025-12-06 10:19:01.387276867 +0000 UTC m=+0.048201152 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:01 np0005548788.localdomain systemd[1]: Started libpod-conmon-35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e.scope.
Dec 06 10:19:01 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:01 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e87814781b0951e9b8270f579ba309ff1e08abed53f2afb2703a1705afc45f50/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:01 np0005548788.localdomain podman[317657]: 2025-12-06 10:19:01.525599728 +0000 UTC m=+0.186523993 container init 35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:19:01 np0005548788.localdomain podman[317657]: 2025-12-06 10:19:01.534507601 +0000 UTC m=+0.195431856 container start 35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:01 np0005548788.localdomain dnsmasq[317675]: started, version 2.85 cachesize 150
Dec 06 10:19:01 np0005548788.localdomain dnsmasq[317675]: DNS service limited to local subnets
Dec 06 10:19:01 np0005548788.localdomain dnsmasq[317675]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:01 np0005548788.localdomain dnsmasq[317675]: warning: no upstream servers configured
Dec 06 10:19:01 np0005548788.localdomain dnsmasq-dhcp[317675]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:01 np0005548788.localdomain dnsmasq[317675]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/addn_hosts - 0 addresses
Dec 06 10:19:01 np0005548788.localdomain dnsmasq-dhcp[317675]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/host
Dec 06 10:19:01 np0005548788.localdomain dnsmasq-dhcp[317675]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/opts
Dec 06 10:19:01 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:01.863 262572 INFO neutron.agent.dhcp.agent [None req-e4ae5225-8566-4b0c-b703-c54863808483 - - - - - -] DHCP configuration for ports {'5e418b23-64fb-4cc3-b4f5-351454b6f675'} is completed
Dec 06 10:19:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:19:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:19:02 np0005548788.localdomain dnsmasq[317675]: exiting on receipt of SIGTERM
Dec 06 10:19:02 np0005548788.localdomain podman[317693]: 2025-12-06 10:19:02.082673997 +0000 UTC m=+0.070422945 container kill 35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:19:02 np0005548788.localdomain systemd[1]: libpod-35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e.scope: Deactivated successfully.
Dec 06 10:19:02 np0005548788.localdomain podman[317706]: 2025-12-06 10:19:02.153789462 +0000 UTC m=+0.058865660 container died 35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:19:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:02 np0005548788.localdomain podman[317706]: 2025-12-06 10:19:02.194284177 +0000 UTC m=+0.099360325 container cleanup 35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:19:02 np0005548788.localdomain systemd[1]: libpod-conmon-35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e.scope: Deactivated successfully.
Dec 06 10:19:02 np0005548788.localdomain podman[317713]: 2025-12-06 10:19:02.239549128 +0000 UTC m=+0.127763808 container remove 35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:19:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:02.317 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:3e:86 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd8b2850-e3e7-477f-8017-199231500400, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5e418b23-64fb-4cc3-b4f5-351454b6f675) old=Port_Binding(mac=['fa:16:3e:fe:3e:86 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:02.319 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5e418b23-64fb-4cc3-b4f5-351454b6f675 in datapath 9beccfed-6ce7-4343-a09a-a10df412729f updated
Dec 06 10:19:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:02.322 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 48eab5dd-4ec3-475a-b86b-8bd746b32fcb IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:19:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:02.322 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9beccfed-6ce7-4343-a09a-a10df412729f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:02.323 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a2144ad5-21c9-4ffa-b6f8-e9ae2ac83685]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e87814781b0951e9b8270f579ba309ff1e08abed53f2afb2703a1705afc45f50-merged.mount: Deactivated successfully.
Dec 06 10:19:02 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35a79605b0cb607f3148d86744a4574ac0fa888f9454900b3e2e069b98ca7f6e-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:02 np0005548788.localdomain ceph-mon[293643]: pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 6.1 KiB/s wr, 29 op/s
Dec 06 10:19:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:19:02 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:02.690 2 INFO neutron.agent.securitygroups_rpc [None req-84b84473-0a2d-4cb4-b946-8bdffecc1ba7 1333c58cfc75447fad1b488a958549ce a269d8afc49848fbb8ce5cdb49ef37dc - - default default] Security group member updated ['a71b7ce5-d152-4b06-83b8-76d380ec29b6']
Dec 06 10:19:02 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:02.872 2 INFO neutron.agent.securitygroups_rpc [None req-9f09c577-620d-43b3-bb16-a6c9b188fc98 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:03.299 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:03.575 2 INFO neutron.agent.securitygroups_rpc [None req-dcc65382-3d9e-4656-a85e-ef8650caf3cc b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:04 np0005548788.localdomain podman[317785]: 
Dec 06 10:19:04 np0005548788.localdomain podman[317785]: 2025-12-06 10:19:04.300309438 +0000 UTC m=+0.070962683 container create 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:19:04 np0005548788.localdomain systemd[1]: Started libpod-conmon-6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398.scope.
Dec 06 10:19:04 np0005548788.localdomain systemd[1]: tmp-crun.zcS7NF.mount: Deactivated successfully.
Dec 06 10:19:04 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:04 np0005548788.localdomain podman[317785]: 2025-12-06 10:19:04.263029291 +0000 UTC m=+0.033682576 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:04 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc872f3f147d38144c7dcaac0a173463b5f53c68c17bed34dee19b67321bcb92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:04 np0005548788.localdomain podman[317785]: 2025-12-06 10:19:04.375516769 +0000 UTC m=+0.146169994 container init 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:19:04 np0005548788.localdomain podman[317785]: 2025-12-06 10:19:04.386490276 +0000 UTC m=+0.157143501 container start 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:04 np0005548788.localdomain dnsmasq[317803]: started, version 2.85 cachesize 150
Dec 06 10:19:04 np0005548788.localdomain dnsmasq[317803]: DNS service limited to local subnets
Dec 06 10:19:04 np0005548788.localdomain dnsmasq[317803]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:04 np0005548788.localdomain dnsmasq[317803]: warning: no upstream servers configured
Dec 06 10:19:04 np0005548788.localdomain dnsmasq-dhcp[317803]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 06 10:19:04 np0005548788.localdomain dnsmasq-dhcp[317803]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:04 np0005548788.localdomain dnsmasq[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/addn_hosts - 0 addresses
Dec 06 10:19:04 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/host
Dec 06 10:19:04 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/opts
Dec 06 10:19:04 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:04.434 2 INFO neutron.agent.securitygroups_rpc [None req-debf1305-ba6e-49c2-9083-8908dd68e972 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:04 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:04.461 262572 INFO neutron.agent.dhcp.agent [None req-9883938c-638c-4508-baf5-87b44f6206f1 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:02Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c69a97f0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f18c69a9c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c69da6d0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f18c69da220>], id=438b1b90-b74f-47e2-9a35-87a65117a2d2, ip_allocation=immediate, mac_address=fa:16:3e:ba:e6:31, name=tempest-PortsTestJSON-495983017, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:56Z, description=, dns_domain=, id=9beccfed-6ce7-4343-a09a-a10df412729f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1638694477, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27775, qos_policy_id=None, revision_number=3, router:external=False, shared=False, standard_attr_id=1972, status=ACTIVE, subnets=['05477320-3800-4e70-b141-6b5621089ccf', 'e43f490f-6938-4d6f-bfc9-927c7b2a3f59'], tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:00Z, vlan_transparent=None, network_id=9beccfed-6ce7-4343-a09a-a10df412729f, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cd56abe4-204c-4363-ad64-0a6840260727'], standard_attr_id=2023, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:02Z on network 9beccfed-6ce7-4343-a09a-a10df412729f
Dec 06 10:19:04 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:04.586 262572 INFO neutron.agent.dhcp.agent [None req-9cce7135-b15b-422f-b19f-02ab807d947c - - - - - -] DHCP configuration for ports {'5e418b23-64fb-4cc3-b4f5-351454b6f675', 'fe2e8335-1a62-40a5-8cfd-a60e344d38eb'} is completed
Dec 06 10:19:04 np0005548788.localdomain ceph-mon[293643]: pgmap v286: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 7.3 KiB/s wr, 39 op/s
Dec 06 10:19:04 np0005548788.localdomain dnsmasq[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/addn_hosts - 2 addresses
Dec 06 10:19:04 np0005548788.localdomain podman[317821]: 2025-12-06 10:19:04.783160696 +0000 UTC m=+0.066424223 container kill 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:19:04 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/host
Dec 06 10:19:04 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/opts
Dec 06 10:19:04 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:04.971 262572 INFO neutron.agent.dhcp.agent [None req-9f0f92ee-4649-4ca1-bf8b-0129cdeb5118 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:02Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6856dc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6856820>], id=438b1b90-b74f-47e2-9a35-87a65117a2d2, ip_allocation=immediate, mac_address=fa:16:3e:ba:e6:31, name=tempest-PortsTestJSON-495983017, network_id=9beccfed-6ce7-4343-a09a-a10df412729f, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['cd56abe4-204c-4363-ad64-0a6840260727'], standard_attr_id=2023, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:03Z on network 9beccfed-6ce7-4343-a09a-a10df412729f
Dec 06 10:19:05 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:05.017 2 INFO neutron.agent.securitygroups_rpc [None req-91092cc5-6d84-4cc3-b0ed-55c483b81857 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:05.086 262572 INFO neutron.agent.dhcp.agent [None req-5d4dfa8b-150a-408c-95ca-ef58626235cb - - - - - -] DHCP configuration for ports {'438b1b90-b74f-47e2-9a35-87a65117a2d2'} is completed
Dec 06 10:19:05 np0005548788.localdomain podman[317861]: 2025-12-06 10:19:05.261999611 +0000 UTC m=+0.052852345 container kill 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:05 np0005548788.localdomain dnsmasq[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/addn_hosts - 1 addresses
Dec 06 10:19:05 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/host
Dec 06 10:19:05 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/opts
Dec 06 10:19:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:05.460 262572 INFO neutron.agent.dhcp.agent [None req-767c2dcf-fd1d-408d-a32d-4f21ac4a4722 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:02Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67a1b80>, <neutron.agent.linux.dhcp.DictModel object at 0x7f18c69a96d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67a1fd0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f18c67a1e50>], id=438b1b90-b74f-47e2-9a35-87a65117a2d2, ip_allocation=immediate, mac_address=fa:16:3e:ba:e6:31, name=tempest-PortsTestJSON-495983017, network_id=9beccfed-6ce7-4343-a09a-a10df412729f, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['cd56abe4-204c-4363-ad64-0a6840260727'], standard_attr_id=2023, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:04Z on network 9beccfed-6ce7-4343-a09a-a10df412729f
Dec 06 10:19:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:05.461 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:05.580 262572 INFO neutron.agent.dhcp.agent [None req-387f9df1-30d7-4b9f-ab96-cc7f2bfbaffb - - - - - -] DHCP configuration for ports {'438b1b90-b74f-47e2-9a35-87a65117a2d2'} is completed
Dec 06 10:19:05 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:05.581 2 INFO neutron.agent.securitygroups_rpc [None req-03766a42-54e7-4e6a-a01a-d12c463a6613 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e157 do_prune osdmap full prune enabled
Dec 06 10:19:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e158 e158: 6 total, 6 up, 6 in
Dec 06 10:19:05 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e158: 6 total, 6 up, 6 in
Dec 06 10:19:05 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:05.723 2 INFO neutron.agent.securitygroups_rpc [None req-9d59c4b8-d3a8-40d9-8d73-2b90f45c1e12 1333c58cfc75447fad1b488a958549ce a269d8afc49848fbb8ce5cdb49ef37dc - - default default] Security group member updated ['a71b7ce5-d152-4b06-83b8-76d380ec29b6']
Dec 06 10:19:05 np0005548788.localdomain dnsmasq[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/addn_hosts - 2 addresses
Dec 06 10:19:05 np0005548788.localdomain podman[317899]: 2025-12-06 10:19:05.805764261 +0000 UTC m=+0.066904907 container kill 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:19:05 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/host
Dec 06 10:19:05 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/opts
Dec 06 10:19:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:19:05 np0005548788.localdomain podman[317914]: 2025-12-06 10:19:05.933352352 +0000 UTC m=+0.097429935 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:19:06 np0005548788.localdomain podman[317914]: 2025-12-06 10:19:06.004674164 +0000 UTC m=+0.168751697 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller)
Dec 06 10:19:06 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:19:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:06.153 262572 INFO neutron.agent.dhcp.agent [None req-beb35c36-fb12-4a85-9132-d3a6844e90de - - - - - -] DHCP configuration for ports {'438b1b90-b74f-47e2-9a35-87a65117a2d2'} is completed
Dec 06 10:19:06 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:06.286 2 INFO neutron.agent.securitygroups_rpc [None req-13d35407-bef4-4c5e-baaa-9390a0fcd613 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:06 np0005548788.localdomain dnsmasq[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/addn_hosts - 0 addresses
Dec 06 10:19:06 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/host
Dec 06 10:19:06 np0005548788.localdomain dnsmasq-dhcp[317803]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/opts
Dec 06 10:19:06 np0005548788.localdomain podman[317962]: 2025-12-06 10:19:06.615933798 +0000 UTC m=+0.068755383 container kill 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 10:19:06 np0005548788.localdomain ceph-mon[293643]: pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 4.5 KiB/s wr, 25 op/s
Dec 06 10:19:06 np0005548788.localdomain ceph-mon[293643]: osdmap e158: 6 total, 6 up, 6 in
Dec 06 10:19:06 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3936622661' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:06 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3936622661' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:19:07.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:08.302 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:08 np0005548788.localdomain ceph-mon[293643]: pgmap v289: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 4.1 MiB/s rd, 5.4 KiB/s wr, 30 op/s
Dec 06 10:19:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:19:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:19:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:19:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:19:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:19:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:19:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: tmp-crun.lNzKXr.mount: Deactivated successfully.
Dec 06 10:19:09 np0005548788.localdomain podman[317985]: 2025-12-06 10:19:09.283289357 +0000 UTC m=+0.101709456 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:19:09 np0005548788.localdomain podman[317985]: 2025-12-06 10:19:09.323652408 +0000 UTC m=+0.142072487 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:19:09 np0005548788.localdomain podman[317984]: 2025-12-06 10:19:09.420412171 +0000 UTC m=+0.239712027 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 06 10:19:09 np0005548788.localdomain podman[317986]: 2025-12-06 10:19:09.338610218 +0000 UTC m=+0.151181987 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=)
Dec 06 10:19:09 np0005548788.localdomain podman[317984]: 2025-12-06 10:19:09.460667799 +0000 UTC m=+0.279967615 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:19:09 np0005548788.localdomain podman[317986]: 2025-12-06 10:19:09.470974785 +0000 UTC m=+0.283546604 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:19:09 np0005548788.localdomain dnsmasq[317803]: exiting on receipt of SIGTERM
Dec 06 10:19:09 np0005548788.localdomain podman[318051]: 2025-12-06 10:19:09.536128738 +0000 UTC m=+0.168944053 container kill 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: libpod-6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398.scope: Deactivated successfully.
Dec 06 10:19:09 np0005548788.localdomain podman[318076]: 2025-12-06 10:19:09.612967819 +0000 UTC m=+0.060051966 container died 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:19:09 np0005548788.localdomain podman[318076]: 2025-12-06 10:19:09.643775246 +0000 UTC m=+0.090859343 container cleanup 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:19:09 np0005548788.localdomain systemd[1]: libpod-conmon-6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398.scope: Deactivated successfully.
Dec 06 10:19:09 np0005548788.localdomain podman[318078]: 2025-12-06 10:19:09.692887205 +0000 UTC m=+0.132496583 container remove 6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:19:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e158 do_prune osdmap full prune enabled
Dec 06 10:19:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e159 e159: 6 total, 6 up, 6 in
Dec 06 10:19:09 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e159: 6 total, 6 up, 6 in
Dec 06 10:19:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:10.193 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 48eab5dd-4ec3-475a-b86b-8bd746b32fcb with type ""
Dec 06 10:19:10 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:10Z|00150|binding|INFO|Removing iface tapfe2e8335-1a ovn-installed in OVS
Dec 06 10:19:10 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:10Z|00151|binding|INFO|Removing lport fe2e8335-1a62-40a5-8cfd-a60e344d38eb ovn-installed in OVS
Dec 06 10:19:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:10.197 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd8b2850-e3e7-477f-8017-199231500400, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=fe2e8335-1a62-40a5-8cfd-a60e344d38eb) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:10.199 159620 INFO neutron.agent.ovn.metadata.agent [-] Port fe2e8335-1a62-40a5-8cfd-a60e344d38eb in datapath 9beccfed-6ce7-4343-a09a-a10df412729f unbound from our chassis
Dec 06 10:19:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:10.203 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9beccfed-6ce7-4343-a09a-a10df412729f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:10.204 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b41e3d8-be58-46be-be07-9b8cff0d496c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:10.233 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548788.localdomain systemd[1]: tmp-crun.C4h4i3.mount: Deactivated successfully.
Dec 06 10:19:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cc872f3f147d38144c7dcaac0a173463b5f53c68c17bed34dee19b67321bcb92-merged.mount: Deactivated successfully.
Dec 06 10:19:10 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6adff66f03a067113e07ad08aa7938c16381c9c9e2a87c992df7e398bf6bb398-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:10.464 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:10.635 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548788.localdomain podman[318155]: 
Dec 06 10:19:10 np0005548788.localdomain ceph-mon[293643]: pgmap v290: 177 pgs: 177 active+clean; 192 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 87 KiB/s rd, 4.3 MiB/s wr, 130 op/s
Dec 06 10:19:10 np0005548788.localdomain ceph-mon[293643]: osdmap e159: 6 total, 6 up, 6 in
Dec 06 10:19:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/961294606' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/961294606' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:10 np0005548788.localdomain podman[318155]: 2025-12-06 10:19:10.730154951 +0000 UTC m=+0.101868631 container create 11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:19:10 np0005548788.localdomain systemd[1]: Started libpod-conmon-11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286.scope.
Dec 06 10:19:10 np0005548788.localdomain podman[318155]: 2025-12-06 10:19:10.683675532 +0000 UTC m=+0.055389272 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:10 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:10 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05edca9a4382ba162c9f24d37981aed435d91af2c814c6209398df316f3f617d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:10 np0005548788.localdomain podman[318155]: 2025-12-06 10:19:10.812165881 +0000 UTC m=+0.183879561 container init 11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:19:10 np0005548788.localdomain podman[318155]: 2025-12-06 10:19:10.819531478 +0000 UTC m=+0.191245188 container start 11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:10 np0005548788.localdomain dnsmasq[318173]: started, version 2.85 cachesize 150
Dec 06 10:19:10 np0005548788.localdomain dnsmasq[318173]: DNS service limited to local subnets
Dec 06 10:19:10 np0005548788.localdomain dnsmasq[318173]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:10 np0005548788.localdomain dnsmasq[318173]: warning: no upstream servers configured
Dec 06 10:19:10 np0005548788.localdomain dnsmasq-dhcp[318173]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:10 np0005548788.localdomain dnsmasq[318173]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/addn_hosts - 0 addresses
Dec 06 10:19:10 np0005548788.localdomain dnsmasq-dhcp[318173]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/host
Dec 06 10:19:10 np0005548788.localdomain dnsmasq-dhcp[318173]: read /var/lib/neutron/dhcp/9beccfed-6ce7-4343-a09a-a10df412729f/opts
Dec 06 10:19:10 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:10.905 262572 INFO neutron.agent.dhcp.agent [None req-138c3971-ce6b-42e4-b1cc-088c85064bf6 - - - - - -] DHCP configuration for ports {'5e418b23-64fb-4cc3-b4f5-351454b6f675', 'fe2e8335-1a62-40a5-8cfd-a60e344d38eb'} is completed
Dec 06 10:19:11 np0005548788.localdomain dnsmasq[318173]: exiting on receipt of SIGTERM
Dec 06 10:19:11 np0005548788.localdomain podman[318191]: 2025-12-06 10:19:11.035484884 +0000 UTC m=+0.049106760 container kill 11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:11 np0005548788.localdomain systemd[1]: libpod-11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286.scope: Deactivated successfully.
Dec 06 10:19:11 np0005548788.localdomain podman[318204]: 2025-12-06 10:19:11.099423019 +0000 UTC m=+0.053336870 container died 11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:19:11 np0005548788.localdomain podman[318204]: 2025-12-06 10:19:11.133112445 +0000 UTC m=+0.087026256 container cleanup 11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:19:11 np0005548788.localdomain systemd[1]: libpod-conmon-11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286.scope: Deactivated successfully.
Dec 06 10:19:11 np0005548788.localdomain podman[318211]: 2025-12-06 10:19:11.157120402 +0000 UTC m=+0.095696032 container remove 11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9beccfed-6ce7-4343-a09a-a10df412729f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:19:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:11.170 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:11 np0005548788.localdomain kernel: device tapfe2e8335-1a left promiscuous mode
Dec 06 10:19:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:11.183 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:11.205 262572 INFO neutron.agent.dhcp.agent [None req-5e98af68-33ba-4e05-9598-e09c5eb9ef35 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:11.206 262572 INFO neutron.agent.dhcp.agent [None req-5e98af68-33ba-4e05-9598-e09c5eb9ef35 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-05edca9a4382ba162c9f24d37981aed435d91af2c814c6209398df316f3f617d-merged.mount: Deactivated successfully.
Dec 06 10:19:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11a42cb0ada060ec8e576d9ecb86b5f7b243f107d864b10ee675e18d556e1286-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:11 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d9beccfed\x2d6ce7\x2d4343\x2da09a\x2da10df412729f.mount: Deactivated successfully.
Dec 06 10:19:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:11.584 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8:0:1:f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:11.586 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:19:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:11.588 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:11.589 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[27b464e6-46cf-43f7-abea-e29378fad0ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1079829945' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1079829945' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:11 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:11.988 2 INFO neutron.agent.securitygroups_rpc [None req-cdcfd119-194b-4de8-98ff-a5e7eedce5b7 7365839d5bca455283c571ca0abd33bb 12673f85bb004c3c946338dc70e565e7 - - default default] Security group member updated ['5a014cda-2333-483a-bcd0-2243e387c412']
Dec 06 10:19:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:12 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:12.186 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:12 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:12.188 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:19:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:12.188 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:12.560 262572 INFO neutron.agent.linux.ip_lib [None req-b094321a-8858-4dd6-92af-c1893c24c733 - - - - - -] Device tap4fe1b1eb-28 cannot be used as it has no MAC address
Dec 06 10:19:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:12.588 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548788.localdomain kernel: device tap4fe1b1eb-28 entered promiscuous mode
Dec 06 10:19:12 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016352.6001] manager: (tap4fe1b1eb-28): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Dec 06 10:19:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:12.601 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548788.localdomain systemd-udevd[318244]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:12 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:12Z|00152|binding|INFO|Claiming lport 4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a for this chassis.
Dec 06 10:19:12 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:12Z|00153|binding|INFO|4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a: Claiming unknown
Dec 06 10:19:12 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:12.615 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-1e856066-fb99-475a-a3e9-160e3cd8f615', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e856066-fb99-475a-a3e9-160e3cd8f615', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=98cffbb9-f96a-4e76-960e-7a4e6036a6d1, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:12 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:12.622 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a in datapath 1e856066-fb99-475a-a3e9-160e3cd8f615 bound to our chassis
Dec 06 10:19:12 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:12.624 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1e856066-fb99-475a-a3e9-160e3cd8f615 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:12 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:12.625 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a18a5c85-6ce2-473c-9c52-a5c3a34aa66b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:12 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4fe1b1eb-28: No such device
Dec 06 10:19:12 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4fe1b1eb-28: No such device
Dec 06 10:19:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:12.636 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:12Z|00154|binding|INFO|Setting lport 4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a ovn-installed in OVS
Dec 06 10:19:12 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:12Z|00155|binding|INFO|Setting lport 4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a up in Southbound
Dec 06 10:19:12 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4fe1b1eb-28: No such device
Dec 06 10:19:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:12.640 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:12.641 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4fe1b1eb-28: No such device
Dec 06 10:19:12 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4fe1b1eb-28: No such device
Dec 06 10:19:12 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4fe1b1eb-28: No such device
Dec 06 10:19:12 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4fe1b1eb-28: No such device
Dec 06 10:19:12 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4fe1b1eb-28: No such device
Dec 06 10:19:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:12.678 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:12.713 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548788.localdomain ceph-mon[293643]: pgmap v292: 177 pgs: 177 active+clean; 192 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 5.3 MiB/s wr, 147 op/s
Dec 06 10:19:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e159 do_prune osdmap full prune enabled
Dec 06 10:19:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e160 e160: 6 total, 6 up, 6 in
Dec 06 10:19:12 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e160: 6 total, 6 up, 6 in
Dec 06 10:19:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:13.303 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:13 np0005548788.localdomain podman[318314]: 
Dec 06 10:19:13 np0005548788.localdomain podman[318314]: 2025-12-06 10:19:13.720445125 +0000 UTC m=+0.096748054 container create 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e160 do_prune osdmap full prune enabled
Dec 06 10:19:13 np0005548788.localdomain ceph-mon[293643]: osdmap e160: 6 total, 6 up, 6 in
Dec 06 10:19:13 np0005548788.localdomain ceph-mon[293643]: pgmap v294: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 157 KiB/s rd, 5.7 MiB/s wr, 230 op/s
Dec 06 10:19:13 np0005548788.localdomain systemd[1]: Started libpod-conmon-8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d.scope.
Dec 06 10:19:13 np0005548788.localdomain podman[318314]: 2025-12-06 10:19:13.674539124 +0000 UTC m=+0.050842093 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e161 e161: 6 total, 6 up, 6 in
Dec 06 10:19:13 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e161: 6 total, 6 up, 6 in
Dec 06 10:19:13 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee03225b913fd4bd7ace6faef5fb3ceb2fc816942a3d4576a772a10b88bde492/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:13 np0005548788.localdomain podman[318314]: 2025-12-06 10:19:13.807314634 +0000 UTC m=+0.183617563 container init 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:19:13 np0005548788.localdomain podman[318314]: 2025-12-06 10:19:13.825126562 +0000 UTC m=+0.201429501 container start 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:13 np0005548788.localdomain dnsmasq[318332]: started, version 2.85 cachesize 150
Dec 06 10:19:13 np0005548788.localdomain dnsmasq[318332]: DNS service limited to local subnets
Dec 06 10:19:13 np0005548788.localdomain dnsmasq[318332]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:13 np0005548788.localdomain dnsmasq[318332]: warning: no upstream servers configured
Dec 06 10:19:13 np0005548788.localdomain dnsmasq-dhcp[318332]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:13 np0005548788.localdomain dnsmasq[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/addn_hosts - 0 addresses
Dec 06 10:19:13 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/host
Dec 06 10:19:13 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/opts
Dec 06 10:19:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:14.012 262572 INFO neutron.agent.dhcp.agent [None req-082a38d5-4601-43e2-aea3-cb613868aca2 - - - - - -] DHCP configuration for ports {'25049fc3-5efc-4204-91dc-e230bcfd0f96'} is completed
Dec 06 10:19:14 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:14.137 2 INFO neutron.agent.securitygroups_rpc [None req-a19ad948-85b8-4074-80f7-d1d223959ce7 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:14 np0005548788.localdomain systemd[1]: tmp-crun.ktkAAM.mount: Deactivated successfully.
Dec 06 10:19:14 np0005548788.localdomain ceph-mon[293643]: osdmap e161: 6 total, 6 up, 6 in
Dec 06 10:19:14 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/932893422' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:14 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/932893422' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:14 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:14.867 2 INFO neutron.agent.securitygroups_rpc [None req-83d5638e-2f4a-455b-b2d3-487dd6af4b6c a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:15.497 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:15 np0005548788.localdomain ceph-mon[293643]: pgmap v296: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 3.5 KiB/s wr, 90 op/s
Dec 06 10:19:15 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:15.913 2 INFO neutron.agent.securitygroups_rpc [None req-ca0b48dc-f1ce-4207-b602-f1515b9dc7e0 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:15 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:15.998 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:15Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c70c45b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c70c4a00>], id=efc7c2b0-7f5f-42a3-abe7-b20f4a75dd08, ip_allocation=immediate, mac_address=fa:16:3e:c8:47:b5, name=tempest-PortsTestJSON-68903737, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:11Z, description=, dns_domain=, id=1e856066-fb99-475a-a3e9-160e3cd8f615, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-374991417, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62185, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2040, status=ACTIVE, subnets=['655c2e12-8544-45a3-ba05-18864b877ad4'], tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:11Z, vlan_transparent=None, network_id=1e856066-fb99-475a-a3e9-160e3cd8f615, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cd56abe4-204c-4363-ad64-0a6840260727'], standard_attr_id=2066, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:15Z on network 1e856066-fb99-475a-a3e9-160e3cd8f615
Dec 06 10:19:16 np0005548788.localdomain dnsmasq[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/addn_hosts - 1 addresses
Dec 06 10:19:16 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/host
Dec 06 10:19:16 np0005548788.localdomain podman[318349]: 2025-12-06 10:19:16.234132363 +0000 UTC m=+0.064669798 container kill 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:16 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/opts
Dec 06 10:19:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:16.475 262572 INFO neutron.agent.dhcp.agent [None req-5d8a2a4a-5cf3-4167-b91a-0a276ff3e223 - - - - - -] DHCP configuration for ports {'efc7c2b0-7f5f-42a3-abe7-b20f4a75dd08'} is completed
Dec 06 10:19:16 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1061312478' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:16 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1061312478' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:16 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3547940131' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:16 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3547940131' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:16.992 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:17.662 2 INFO neutron.agent.securitygroups_rpc [None req-09283ffa-3b28-4158-b02d-0d7572bb2b32 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:17 np0005548788.localdomain sshd[318370]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:19:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:17.782 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:16Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6881e20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68816d0>], id=e1986c14-e09e-4164-a04e-5fcd4f910221, ip_allocation=immediate, mac_address=fa:16:3e:90:f1:bd, name=tempest-PortsTestJSON-1397064562, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:11Z, description=, dns_domain=, id=1e856066-fb99-475a-a3e9-160e3cd8f615, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-374991417, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62185, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2040, status=ACTIVE, subnets=['655c2e12-8544-45a3-ba05-18864b877ad4'], tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:11Z, vlan_transparent=None, network_id=1e856066-fb99-475a-a3e9-160e3cd8f615, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cd56abe4-204c-4363-ad64-0a6840260727'], standard_attr_id=2080, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:17Z on network 1e856066-fb99-475a-a3e9-160e3cd8f615
Dec 06 10:19:18 np0005548788.localdomain dnsmasq[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/addn_hosts - 2 addresses
Dec 06 10:19:18 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/host
Dec 06 10:19:18 np0005548788.localdomain podman[318389]: 2025-12-06 10:19:18.042517285 +0000 UTC m=+0.088769839 container kill 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:18 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/opts
Dec 06 10:19:18 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:18.068 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:18 np0005548788.localdomain ceph-mon[293643]: pgmap v297: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 2.8 KiB/s wr, 72 op/s
Dec 06 10:19:18 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:18.248 2 INFO neutron.agent.securitygroups_rpc [None req-c5dcea7d-aa1b-4625-8fc4-dc86e9ad2a1a 7365839d5bca455283c571ca0abd33bb 12673f85bb004c3c946338dc70e565e7 - - default default] Security group member updated ['5a014cda-2333-483a-bcd0-2243e387c412']
Dec 06 10:19:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:18.304 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:18 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:18.335 262572 INFO neutron.agent.dhcp.agent [None req-26f75149-fdce-4947-ba81-6e595cadfa58 - - - - - -] DHCP configuration for ports {'e1986c14-e09e-4164-a04e-5fcd4f910221'} is completed
Dec 06 10:19:19 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:19.279 2 INFO neutron.agent.securitygroups_rpc [None req-7362e19c-e595-471d-b005-9585c5cb5a42 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:19:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:19:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:19:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:19:19 np0005548788.localdomain podman[318424]: 2025-12-06 10:19:19.686857567 +0000 UTC m=+0.115253503 container kill 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:19:19 np0005548788.localdomain dnsmasq[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/addn_hosts - 1 addresses
Dec 06 10:19:19 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/host
Dec 06 10:19:19 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/opts
Dec 06 10:19:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:19:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19196 "" "Go-http-client/1.1"
Dec 06 10:19:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:20.190 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:19:20 np0005548788.localdomain ceph-mon[293643]: pgmap v298: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 122 KiB/s rd, 5.1 KiB/s wr, 161 op/s
Dec 06 10:19:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:20.499 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:20 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:20.871 2 INFO neutron.agent.securitygroups_rpc [None req-8a7361c8-fe6c-42e8-b9eb-90548f1065a0 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:19:21 np0005548788.localdomain dnsmasq[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/addn_hosts - 0 addresses
Dec 06 10:19:21 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/host
Dec 06 10:19:21 np0005548788.localdomain dnsmasq-dhcp[318332]: read /var/lib/neutron/dhcp/1e856066-fb99-475a-a3e9-160e3cd8f615/opts
Dec 06 10:19:21 np0005548788.localdomain podman[318462]: 2025-12-06 10:19:21.205747467 +0000 UTC m=+0.069167763 container kill 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:19:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1538185790' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1538185790' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:21 np0005548788.localdomain podman[318471]: 2025-12-06 10:19:21.286741146 +0000 UTC m=+0.107362946 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:19:21 np0005548788.localdomain podman[318471]: 2025-12-06 10:19:21.304318671 +0000 UTC m=+0.124940471 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:19:21 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:19:21 np0005548788.localdomain sshd[318370]: Received disconnect from 45.78.194.186 port 43886:11: Bye Bye [preauth]
Dec 06 10:19:21 np0005548788.localdomain sshd[318370]: Disconnected from authenticating user root 45.78.194.186 port 43886 [preauth]
Dec 06 10:19:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e161 do_prune osdmap full prune enabled
Dec 06 10:19:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e162 e162: 6 total, 6 up, 6 in
Dec 06 10:19:22 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e162: 6 total, 6 up, 6 in
Dec 06 10:19:22 np0005548788.localdomain ceph-mon[293643]: pgmap v299: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 116 KiB/s rd, 4.9 KiB/s wr, 154 op/s
Dec 06 10:19:22 np0005548788.localdomain ceph-mon[293643]: osdmap e162: 6 total, 6 up, 6 in
Dec 06 10:19:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:19:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:19:23 np0005548788.localdomain systemd[1]: tmp-crun.MiYH8e.mount: Deactivated successfully.
Dec 06 10:19:23 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3020127696' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:23 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3020127696' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:23 np0005548788.localdomain podman[318502]: 2025-12-06 10:19:23.286145759 +0000 UTC m=+0.107135700 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:23.307 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:23 np0005548788.localdomain podman[318502]: 2025-12-06 10:19:23.32168452 +0000 UTC m=+0.142674411 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:19:23 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:19:23 np0005548788.localdomain podman[318501]: 2025-12-06 10:19:23.32559694 +0000 UTC m=+0.147634034 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:19:23 np0005548788.localdomain podman[318501]: 2025-12-06 10:19:23.408736176 +0000 UTC m=+0.230773280 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:19:23 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:19:23 np0005548788.localdomain dnsmasq[318332]: exiting on receipt of SIGTERM
Dec 06 10:19:23 np0005548788.localdomain podman[318554]: 2025-12-06 10:19:23.460978034 +0000 UTC m=+0.075062256 container kill 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:23 np0005548788.localdomain systemd[1]: libpod-8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d.scope: Deactivated successfully.
Dec 06 10:19:23 np0005548788.localdomain podman[318569]: 2025-12-06 10:19:23.549829896 +0000 UTC m=+0.064492389 container died 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:19:23 np0005548788.localdomain podman[318569]: 2025-12-06 10:19:23.644674804 +0000 UTC m=+0.159337267 container remove 8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e856066-fb99-475a-a3e9-160e3cd8f615, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:19:23 np0005548788.localdomain systemd[1]: libpod-conmon-8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d.scope: Deactivated successfully.
Dec 06 10:19:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:23.660 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:23 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:23Z|00156|binding|INFO|Releasing lport 4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a from this chassis (sb_readonly=0)
Dec 06 10:19:23 np0005548788.localdomain kernel: device tap4fe1b1eb-28 left promiscuous mode
Dec 06 10:19:23 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:23Z|00157|binding|INFO|Setting lport 4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a down in Southbound
Dec 06 10:19:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:23.691 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:23 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:23.750 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-1e856066-fb99-475a-a3e9-160e3cd8f615', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e856066-fb99-475a-a3e9-160e3cd8f615', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=98cffbb9-f96a-4e76-960e-7a4e6036a6d1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:23 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:23.752 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 4fe1b1eb-28c7-4e1e-b45e-4465737b9b6a in datapath 1e856066-fb99-475a-a3e9-160e3cd8f615 unbound from our chassis
Dec 06 10:19:23 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:23.754 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1e856066-fb99-475a-a3e9-160e3cd8f615, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:23 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:23.759 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[10cd5b16-72e7-4f75-9399-fefc9f297a03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:23 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:23.771 2 INFO neutron.agent.securitygroups_rpc [None req-7c634670-f27b-4241-a6ed-35c65bde0f68 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:23 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:23.784 262572 INFO neutron.agent.dhcp.agent [None req-ba30d380-1b2a-4b3a-a09f-99c9319c65d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-ee03225b913fd4bd7ace6faef5fb3ceb2fc816942a3d4576a772a10b88bde492-merged.mount: Deactivated successfully.
Dec 06 10:19:24 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8906022e5102a044af8b95f38fb81abfa984901662c1bbd06307a462d08cf91d-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:24 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d1e856066\x2dfb99\x2d475a\x2da3e9\x2d160e3cd8f615.mount: Deactivated successfully.
Dec 06 10:19:24 np0005548788.localdomain ceph-mon[293643]: pgmap v301: 177 pgs: 177 active+clean; 192 MiB data, 939 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 4.1 KiB/s wr, 129 op/s
Dec 06 10:19:24 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3025114459' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:24 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3025114459' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:24 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:24.626 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:25 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:25.296 2 INFO neutron.agent.securitygroups_rpc [None req-e1143dbb-8340-4dac-af2c-b301e23bde0e a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:25.295 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e162 do_prune osdmap full prune enabled
Dec 06 10:19:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 e163: 6 total, 6 up, 6 in
Dec 06 10:19:25 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e163: 6 total, 6 up, 6 in
Dec 06 10:19:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:25.502 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:25 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:25.517 2 INFO neutron.agent.securitygroups_rpc [None req-16b19944-c36d-4221-9d13-f63b2c9f61ac b30ee2b2ade74f9e80de3f1afc291bda 29c573bcf157448abe548893ad01e3d2 - - default default] Security group member updated ['96d14d8a-5f78-4831-ba37-3f88bccdbe58']
Dec 06 10:19:25 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:25.678 2 INFO neutron.agent.securitygroups_rpc [None req-16b19944-c36d-4221-9d13-f63b2c9f61ac b30ee2b2ade74f9e80de3f1afc291bda 29c573bcf157448abe548893ad01e3d2 - - default default] Security group member updated ['96d14d8a-5f78-4831-ba37-3f88bccdbe58']
Dec 06 10:19:26 np0005548788.localdomain ceph-mon[293643]: pgmap v302: 177 pgs: 177 active+clean; 192 MiB data, 939 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 3.8 KiB/s wr, 121 op/s
Dec 06 10:19:26 np0005548788.localdomain ceph-mon[293643]: osdmap e163: 6 total, 6 up, 6 in
Dec 06 10:19:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:26.798 2 INFO neutron.agent.securitygroups_rpc [None req-6fcbbc2a-54c0-4eb0-a7e2-cb02681a4453 b30ee2b2ade74f9e80de3f1afc291bda 29c573bcf157448abe548893ad01e3d2 - - default default] Security group member updated ['96d14d8a-5f78-4831-ba37-3f88bccdbe58']
Dec 06 10:19:27 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:27.179 2 INFO neutron.agent.securitygroups_rpc [None req-a0866618-9e73-4e70-a70b-4bf19bcc43ec b30ee2b2ade74f9e80de3f1afc291bda 29c573bcf157448abe548893ad01e3d2 - - default default] Security group member updated ['96d14d8a-5f78-4831-ba37-3f88bccdbe58']
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.205445) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367205539, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2029, "num_deletes": 268, "total_data_size": 2831956, "memory_usage": 3011168, "flush_reason": "Manual Compaction"}
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367233931, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 2755035, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27789, "largest_seqno": 29817, "table_properties": {"data_size": 2746461, "index_size": 5207, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 19127, "raw_average_key_size": 20, "raw_value_size": 2728650, "raw_average_value_size": 2991, "num_data_blocks": 225, "num_entries": 912, "num_filter_entries": 912, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016242, "oldest_key_time": 1765016242, "file_creation_time": 1765016367, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 28549 microseconds, and 15245 cpu microseconds.
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.234000) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2755035 bytes OK
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.234035) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.235978) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.236000) EVENT_LOG_v1 {"time_micros": 1765016367235993, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.236033) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2823113, prev total WAL file size 2823603, number of live WAL files 2.
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.239936) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303137' seq:72057594037927935, type:22 .. '6C6F676D0034323731' seq:0, type:0; will stop at (end)
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2690KB)], [48(16MB)]
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367240143, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 20324323, "oldest_snapshot_seqno": -1}
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 12944 keys, 19836118 bytes, temperature: kUnknown
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367358471, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 19836118, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19760243, "index_size": 42430, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32389, "raw_key_size": 345814, "raw_average_key_size": 26, "raw_value_size": 19537958, "raw_average_value_size": 1509, "num_data_blocks": 1616, "num_entries": 12944, "num_filter_entries": 12944, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016367, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.358912) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 19836118 bytes
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.361421) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.4 rd, 167.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 16.8 +0.0 blob) out(18.9 +0.0 blob), read-write-amplify(14.6) write-amplify(7.2) OK, records in: 13495, records dropped: 551 output_compression: NoCompression
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.361454) EVENT_LOG_v1 {"time_micros": 1765016367361431, "job": 28, "event": "compaction_finished", "compaction_time_micros": 118558, "compaction_time_cpu_micros": 59436, "output_level": 6, "num_output_files": 1, "total_output_size": 19836118, "num_input_records": 13495, "num_output_records": 12944, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367362284, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367364963, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.236983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365097) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365104) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365107) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365113) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:28.309 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548788.localdomain ceph-mon[293643]: pgmap v304: 177 pgs: 177 active+clean; 192 MiB data, 939 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.2 KiB/s wr, 57 op/s
Dec 06 10:19:28 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3594078665' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:28 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3594078665' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:28 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:28.659 262572 INFO neutron.agent.linux.ip_lib [None req-893dabb2-7d9b-4d90-8690-1e38f0b8dbef - - - - - -] Device tap4e4774e5-e5 cannot be used as it has no MAC address
Dec 06 10:19:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:28.690 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548788.localdomain kernel: device tap4e4774e5-e5 entered promiscuous mode
Dec 06 10:19:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:28.702 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:28Z|00158|binding|INFO|Claiming lport 4e4774e5-e5ef-4074-a357-64ed1f61b8c3 for this chassis.
Dec 06 10:19:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:28Z|00159|binding|INFO|4e4774e5-e5ef-4074-a357-64ed1f61b8c3: Claiming unknown
Dec 06 10:19:28 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016368.7086] manager: (tap4e4774e5-e5): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Dec 06 10:19:28 np0005548788.localdomain systemd-udevd[318609]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:28.719 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-ed9aec3e-b3c1-45d0-92de-301463a3e557', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed9aec3e-b3c1-45d0-92de-301463a3e557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b183e5cb-034c-4fb5-a7d5-b388b4f64278, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=4e4774e5-e5ef-4074-a357-64ed1f61b8c3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:28.721 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 4e4774e5-e5ef-4074-a357-64ed1f61b8c3 in datapath ed9aec3e-b3c1-45d0-92de-301463a3e557 bound to our chassis
Dec 06 10:19:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:28.722 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ed9aec3e-b3c1-45d0-92de-301463a3e557 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:28.723 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[77274ae6-bf3a-42f8-bb0f-78ccba024c5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4e4774e5-e5: No such device
Dec 06 10:19:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:28.738 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:28Z|00160|binding|INFO|Setting lport 4e4774e5-e5ef-4074-a357-64ed1f61b8c3 ovn-installed in OVS
Dec 06 10:19:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:28Z|00161|binding|INFO|Setting lport 4e4774e5-e5ef-4074-a357-64ed1f61b8c3 up in Southbound
Dec 06 10:19:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4e4774e5-e5: No such device
Dec 06 10:19:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:28.744 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:28.745 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4e4774e5-e5: No such device
Dec 06 10:19:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4e4774e5-e5: No such device
Dec 06 10:19:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4e4774e5-e5: No such device
Dec 06 10:19:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4e4774e5-e5: No such device
Dec 06 10:19:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4e4774e5-e5: No such device
Dec 06 10:19:28 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap4e4774e5-e5: No such device
Dec 06 10:19:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:28.783 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:28.817 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3743484833' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3743484833' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3743484833' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3743484833' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:29 np0005548788.localdomain podman[318680]: 2025-12-06 10:19:29.800431131 +0000 UTC m=+0.104483827 container create 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:19:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:19:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:29 np0005548788.localdomain podman[318680]: 2025-12-06 10:19:29.74746433 +0000 UTC m=+0.051517056 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:29 np0005548788.localdomain systemd[1]: Started libpod-conmon-9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea.scope.
Dec 06 10:19:29 np0005548788.localdomain systemd[1]: tmp-crun.kofWh4.mount: Deactivated successfully.
Dec 06 10:19:29 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:29 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5667c710c45fb238b4fba5cfc47364c9da24c0c6072816a1b53cd35e4f862b6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:29 np0005548788.localdomain podman[318680]: 2025-12-06 10:19:29.885017752 +0000 UTC m=+0.189070428 container init 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:19:29 np0005548788.localdomain podman[318680]: 2025-12-06 10:19:29.89498224 +0000 UTC m=+0.199034906 container start 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:19:29 np0005548788.localdomain dnsmasq[318699]: started, version 2.85 cachesize 150
Dec 06 10:19:29 np0005548788.localdomain dnsmasq[318699]: DNS service limited to local subnets
Dec 06 10:19:29 np0005548788.localdomain dnsmasq[318699]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:29 np0005548788.localdomain dnsmasq[318699]: warning: no upstream servers configured
Dec 06 10:19:29 np0005548788.localdomain dnsmasq-dhcp[318699]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:29 np0005548788.localdomain dnsmasq[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/addn_hosts - 0 addresses
Dec 06 10:19:29 np0005548788.localdomain dnsmasq-dhcp[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/host
Dec 06 10:19:29 np0005548788.localdomain dnsmasq-dhcp[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/opts
Dec 06 10:19:30 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:30.051 262572 INFO neutron.agent.dhcp.agent [None req-ed33673c-e4fe-43e4-901a-227e8329f184 - - - - - -] DHCP configuration for ports {'3a100ece-f7fd-42bd-b75e-a20b58ba80ad'} is completed
Dec 06 10:19:30 np0005548788.localdomain ceph-mon[293643]: pgmap v305: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 5.2 KiB/s wr, 116 op/s
Dec 06 10:19:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:30 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4147539828' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:30 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4147539828' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:30.505 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:30 np0005548788.localdomain systemd[1]: tmp-crun.lQ7zmA.mount: Deactivated successfully.
Dec 06 10:19:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:30.846 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:31 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:31.273 2 INFO neutron.agent.securitygroups_rpc [None req-33076df9-23c5-4745-bba5-728ca02b1a7f b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:31.274 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:31.306 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:31Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68813a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6881190>], id=6847d815-effc-4921-872f-e8370c5a63d0, ip_allocation=immediate, mac_address=fa:16:3e:bc:ea:79, name=tempest-PortsTestJSON-1119402910, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:25Z, description=, dns_domain=, id=ed9aec3e-b3c1-45d0-92de-301463a3e557, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1243491999, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9966, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2108, status=ACTIVE, subnets=['5f431dc5-4619-43a4-9c33-bd3ae426a689'], tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:27Z, vlan_transparent=None, network_id=ed9aec3e-b3c1-45d0-92de-301463a3e557, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cd56abe4-204c-4363-ad64-0a6840260727'], standard_attr_id=2141, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:31Z on network ed9aec3e-b3c1-45d0-92de-301463a3e557
Dec 06 10:19:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3837e3e6-e965-4096-a547-bca7edbdb76b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:19:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3837e3e6-e965-4096-a547-bca7edbdb76b", "format": "json"}]: dispatch
Dec 06 10:19:31 np0005548788.localdomain dnsmasq[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/addn_hosts - 1 addresses
Dec 06 10:19:31 np0005548788.localdomain dnsmasq-dhcp[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/host
Dec 06 10:19:31 np0005548788.localdomain podman[318717]: 2025-12-06 10:19:31.534116143 +0000 UTC m=+0.057584944 container kill 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:19:31 np0005548788.localdomain dnsmasq-dhcp[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/opts
Dec 06 10:19:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:31.828 262572 INFO neutron.agent.dhcp.agent [None req-cab827bc-d165-4b0d-aad5-c6029550e7d6 - - - - - -] DHCP configuration for ports {'6847d815-effc-4921-872f-e8370c5a63d0'} is completed
Dec 06 10:19:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:32 np0005548788.localdomain ceph-mon[293643]: pgmap v306: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 75 KiB/s rd, 4.7 KiB/s wr, 103 op/s
Dec 06 10:19:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:33.310 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:34 np0005548788.localdomain ceph-mon[293643]: pgmap v307: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 5.6 KiB/s wr, 84 op/s
Dec 06 10:19:34 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1956276203' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:34 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:34.649 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:31Z, description=, device_id=065abc93-7a63-4671-b520-03fce14bf28c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68f2910>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6960940>], id=6847d815-effc-4921-872f-e8370c5a63d0, ip_allocation=immediate, mac_address=fa:16:3e:bc:ea:79, name=tempest-PortsTestJSON-1119402910, network_id=ed9aec3e-b3c1-45d0-92de-301463a3e557, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['cd56abe4-204c-4363-ad64-0a6840260727'], standard_attr_id=2141, status=ACTIVE, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:32Z on network ed9aec3e-b3c1-45d0-92de-301463a3e557
Dec 06 10:19:34 np0005548788.localdomain dnsmasq[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/addn_hosts - 1 addresses
Dec 06 10:19:34 np0005548788.localdomain dnsmasq-dhcp[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/host
Dec 06 10:19:34 np0005548788.localdomain podman[318755]: 2025-12-06 10:19:34.937904227 +0000 UTC m=+0.061770614 container kill 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:19:34 np0005548788.localdomain dnsmasq-dhcp[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/opts
Dec 06 10:19:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:35.112 2 INFO neutron.agent.securitygroups_rpc [None req-dad18757-c8ae-4573-92a7-49e2b9f564ab a31577503edf4745abb112adc3113276 90bd35d6ab7c40c58d9d1d61ff7a12d3 - - default default] Security group member updated ['1fabbc74-497e-44d5-8d22-97b341de2968']
Dec 06 10:19:35 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:35.291 262572 INFO neutron.agent.dhcp.agent [None req-aca7b4f5-7f9a-4ad3-bfc4-333d74df8dec - - - - - -] DHCP configuration for ports {'6847d815-effc-4921-872f-e8370c5a63d0'} is completed
Dec 06 10:19:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:35.420 2 INFO neutron.agent.securitygroups_rpc [None req-dad18757-c8ae-4573-92a7-49e2b9f564ab a31577503edf4745abb112adc3113276 90bd35d6ab7c40c58d9d1d61ff7a12d3 - - default default] Security group member updated ['1fabbc74-497e-44d5-8d22-97b341de2968']
Dec 06 10:19:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3837e3e6-e965-4096-a547-bca7edbdb76b", "format": "json"}]: dispatch
Dec 06 10:19:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3837e3e6-e965-4096-a547-bca7edbdb76b", "force": true, "format": "json"}]: dispatch
Dec 06 10:19:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2644755643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:35.545 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:19:36 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:36.141 2 INFO neutron.agent.securitygroups_rpc [None req-792339ab-c7cd-409a-a342-ae21c75c2ee5 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:19:36 np0005548788.localdomain podman[318775]: 2025-12-06 10:19:36.270311949 +0000 UTC m=+0.093466007 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:36 np0005548788.localdomain podman[318775]: 2025-12-06 10:19:36.315698475 +0000 UTC m=+0.138852483 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Dec 06 10:19:36 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:36.471 2 INFO neutron.agent.securitygroups_rpc [None req-21429926-074a-46a0-a4f4-611f2e364131 a31577503edf4745abb112adc3113276 90bd35d6ab7c40c58d9d1d61ff7a12d3 - - default default] Security group member updated ['1fabbc74-497e-44d5-8d22-97b341de2968']
Dec 06 10:19:36 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:19:36 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:36.506 262572 INFO neutron.agent.linux.ip_lib [None req-45c8f5f1-faf8-49f7-a2bf-14f0fe764064 - - - - - -] Device tap76d60975-0f cannot be used as it has no MAC address
Dec 06 10:19:36 np0005548788.localdomain ceph-mon[293643]: pgmap v308: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 5.6 KiB/s wr, 84 op/s
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.541 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548788.localdomain kernel: device tap76d60975-0f entered promiscuous mode
Dec 06 10:19:36 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016376.5547] manager: (tap76d60975-0f): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.596 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:36Z|00162|binding|INFO|Claiming lport 76d60975-0f93-45fb-86f9-d506ef719583 for this chassis.
Dec 06 10:19:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:36Z|00163|binding|INFO|76d60975-0f93-45fb-86f9-d506ef719583: Claiming unknown
Dec 06 10:19:36 np0005548788.localdomain systemd-udevd[318834]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:36.610 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-976b104f-da3c-4aa9-9046-5852081ea282', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976b104f-da3c-4aa9-9046-5852081ea282', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=182262b6-0716-4101-82ed-ce935388bd2a, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=76d60975-0f93-45fb-86f9-d506ef719583) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:36.611 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 76d60975-0f93-45fb-86f9-d506ef719583 in datapath 976b104f-da3c-4aa9-9046-5852081ea282 bound to our chassis
Dec 06 10:19:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:36.613 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 976b104f-da3c-4aa9-9046-5852081ea282 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:36.614 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[36384b85-edb5-42d1-9bed-38d0f8445fb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:36Z|00164|binding|INFO|Setting lport 76d60975-0f93-45fb-86f9-d506ef719583 ovn-installed in OVS
Dec 06 10:19:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:36Z|00165|binding|INFO|Setting lport 76d60975-0f93-45fb-86f9-d506ef719583 up in Southbound
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.622 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.635 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548788.localdomain dnsmasq[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/addn_hosts - 0 addresses
Dec 06 10:19:36 np0005548788.localdomain dnsmasq-dhcp[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/host
Dec 06 10:19:36 np0005548788.localdomain dnsmasq-dhcp[318699]: read /var/lib/neutron/dhcp/ed9aec3e-b3c1-45d0-92de-301463a3e557/opts
Dec 06 10:19:36 np0005548788.localdomain podman[318825]: 2025-12-06 10:19:36.653833329 +0000 UTC m=+0.098399369 container kill 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.689 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.718 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:19:36 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:36 np0005548788.localdomain kernel: device tap4e4774e5-e5 left promiscuous mode
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.889 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:36Z|00166|binding|INFO|Releasing lport 4e4774e5-e5ef-4074-a357-64ed1f61b8c3 from this chassis (sb_readonly=0)
Dec 06 10:19:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:36Z|00167|binding|INFO|Setting lport 4e4774e5-e5ef-4074-a357-64ed1f61b8c3 down in Southbound
Dec 06 10:19:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:36.899 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-ed9aec3e-b3c1-45d0-92de-301463a3e557', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed9aec3e-b3c1-45d0-92de-301463a3e557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b183e5cb-034c-4fb5-a7d5-b388b4f64278, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=4e4774e5-e5ef-4074-a357-64ed1f61b8c3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:36.901 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 4e4774e5-e5ef-4074-a357-64ed1f61b8c3 in datapath ed9aec3e-b3c1-45d0-92de-301463a3e557 unbound from our chassis
Dec 06 10:19:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:36.903 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed9aec3e-b3c1-45d0-92de-301463a3e557, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:36.904 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[75c43372-963e-4d84-9337-d0273d3ea1ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:36.915 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:37 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:37.109 2 INFO neutron.agent.securitygroups_rpc [None req-bb88ef2d-64f1-4b09-a81f-2bd8c1d4b6c6 a31577503edf4745abb112adc3113276 90bd35d6ab7c40c58d9d1d61ff7a12d3 - - default default] Security group member updated ['1fabbc74-497e-44d5-8d22-97b341de2968']
Dec 06 10:19:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:19:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "format": "json"}]: dispatch
Dec 06 10:19:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:37 np0005548788.localdomain podman[318920]: 
Dec 06 10:19:37 np0005548788.localdomain dnsmasq[318699]: exiting on receipt of SIGTERM
Dec 06 10:19:37 np0005548788.localdomain podman[318930]: 2025-12-06 10:19:37.681332296 +0000 UTC m=+0.078091520 container kill 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:19:37 np0005548788.localdomain systemd[1]: libpod-9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea.scope: Deactivated successfully.
Dec 06 10:19:37 np0005548788.localdomain podman[318920]: 2025-12-06 10:19:37.721623194 +0000 UTC m=+0.157174400 container create cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:37 np0005548788.localdomain podman[318920]: 2025-12-06 10:19:37.62594792 +0000 UTC m=+0.061499176 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:37 np0005548788.localdomain systemd[1]: Started libpod-conmon-cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4.scope.
Dec 06 10:19:37 np0005548788.localdomain podman[318951]: 2025-12-06 10:19:37.77250677 +0000 UTC m=+0.067879444 container died 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:19:37 np0005548788.localdomain systemd[1]: tmp-crun.HjojqG.mount: Deactivated successfully.
Dec 06 10:19:37 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:37 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7853259d6d04c73a16dc50af620f127ddf774738f29486456e6c4ab8a6b0604/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:37 np0005548788.localdomain podman[318920]: 2025-12-06 10:19:37.883504658 +0000 UTC m=+0.319055894 container init cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:19:37 np0005548788.localdomain podman[318920]: 2025-12-06 10:19:37.892694133 +0000 UTC m=+0.328245369 container start cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:19:37 np0005548788.localdomain dnsmasq[318982]: started, version 2.85 cachesize 150
Dec 06 10:19:37 np0005548788.localdomain dnsmasq[318982]: DNS service limited to local subnets
Dec 06 10:19:37 np0005548788.localdomain dnsmasq[318982]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:37 np0005548788.localdomain dnsmasq[318982]: warning: no upstream servers configured
Dec 06 10:19:37 np0005548788.localdomain dnsmasq-dhcp[318982]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:19:37 np0005548788.localdomain dnsmasq[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/addn_hosts - 0 addresses
Dec 06 10:19:37 np0005548788.localdomain dnsmasq-dhcp[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/host
Dec 06 10:19:37 np0005548788.localdomain dnsmasq-dhcp[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/opts
Dec 06 10:19:37 np0005548788.localdomain podman[318951]: 2025-12-06 10:19:37.935111187 +0000 UTC m=+0.230483861 container remove 9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ed9aec3e-b3c1-45d0-92de-301463a3e557, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:19:37 np0005548788.localdomain systemd[1]: libpod-conmon-9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea.scope: Deactivated successfully.
Dec 06 10:19:37 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:37.964 262572 INFO neutron.agent.dhcp.agent [None req-225fcd66-6bff-4607-a3f6-b780aac31d74 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:37 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:37.965 262572 INFO neutron.agent.dhcp.agent [None req-225fcd66-6bff-4607-a3f6-b780aac31d74 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:38.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:38 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:38.062 262572 INFO neutron.agent.dhcp.agent [None req-b28aeac6-18c6-4fc2-b04d-0794b94ffd34 - - - - - -] DHCP configuration for ports {'04e9892b-ada0-41d9-8fbc-4d2c452b6dcb'} is completed
Dec 06 10:19:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:38.167 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:38 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:38.184 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:37Z, description=, device_id=fd85d19e-cd56-4873-b452-bd25ff0b3893, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68c6d60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68c6190>], id=e97b3f05-2055-43cd-83b9-acb38a4bf9ac, ip_allocation=immediate, mac_address=fa:16:3e:6c:66:11, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:33Z, description=, dns_domain=, id=976b104f-da3c-4aa9-9046-5852081ea282, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-128650844, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62379, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2158, status=ACTIVE, subnets=['05e4925e-a398-49ae-bcb0-06ee8231ee88'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:19:35Z, vlan_transparent=None, network_id=976b104f-da3c-4aa9-9046-5852081ea282, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2182, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:19:37Z on network 976b104f-da3c-4aa9-9046-5852081ea282
Dec 06 10:19:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:38.313 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:38 np0005548788.localdomain dnsmasq[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/addn_hosts - 1 addresses
Dec 06 10:19:38 np0005548788.localdomain dnsmasq-dhcp[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/host
Dec 06 10:19:38 np0005548788.localdomain dnsmasq-dhcp[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/opts
Dec 06 10:19:38 np0005548788.localdomain podman[319003]: 2025-12-06 10:19:38.398904163 +0000 UTC m=+0.072927250 container kill cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:38 np0005548788.localdomain ceph-mon[293643]: pgmap v309: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 4.7 KiB/s wr, 71 op/s
Dec 06 10:19:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f5667c710c45fb238b4fba5cfc47364c9da24c0c6072816a1b53cd35e4f862b6-merged.mount: Deactivated successfully.
Dec 06 10:19:38 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9f58d973acf417a93a54430a6d96b5c81b1123f007deca72161f92faeabe8cea-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:38 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2ded9aec3e\x2db3c1\x2d45d0\x2d92de\x2d301463a3e557.mount: Deactivated successfully.
Dec 06 10:19:38 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:38.715 262572 INFO neutron.agent.dhcp.agent [None req-aad1edd4-a7a7-4a0c-969b-1749de0da6c8 - - - - - -] DHCP configuration for ports {'e97b3f05-2055-43cd-83b9-acb38a4bf9ac'} is completed
Dec 06 10:19:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:19:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:19:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:19:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:19:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:19:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:19:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:19:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:19:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3512849964' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3512849964' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:39 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:39.525 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:37Z, description=, device_id=fd85d19e-cd56-4873-b452-bd25ff0b3893, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67ad1f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67ad4c0>], id=e97b3f05-2055-43cd-83b9-acb38a4bf9ac, ip_allocation=immediate, mac_address=fa:16:3e:6c:66:11, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:33Z, description=, dns_domain=, id=976b104f-da3c-4aa9-9046-5852081ea282, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-128650844, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62379, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2158, status=ACTIVE, subnets=['05e4925e-a398-49ae-bcb0-06ee8231ee88'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:19:35Z, vlan_transparent=None, network_id=976b104f-da3c-4aa9-9046-5852081ea282, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2182, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:19:37Z on network 976b104f-da3c-4aa9-9046-5852081ea282
Dec 06 10:19:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3512849964' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3512849964' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:39 np0005548788.localdomain podman[319042]: 2025-12-06 10:19:39.715209796 +0000 UTC m=+0.058577205 container kill cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:19:39 np0005548788.localdomain dnsmasq[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/addn_hosts - 1 addresses
Dec 06 10:19:39 np0005548788.localdomain dnsmasq-dhcp[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/host
Dec 06 10:19:39 np0005548788.localdomain dnsmasq-dhcp[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/opts
Dec 06 10:19:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:19:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:19:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:19:39 np0005548788.localdomain podman[319057]: 2025-12-06 10:19:39.83738605 +0000 UTC m=+0.100377210 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:19:39 np0005548788.localdomain podman[319057]: 2025-12-06 10:19:39.856624426 +0000 UTC m=+0.119615656 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:39 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:39.858 262572 INFO neutron.agent.linux.ip_lib [None req-2fac0ba7-32db-4404-9a07-a03e878fc624 - - - - - -] Device tap8fdc5379-5c cannot be used as it has no MAC address
Dec 06 10:19:39 np0005548788.localdomain podman[319058]: 2025-12-06 10:19:39.812343064 +0000 UTC m=+0.075191080 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:19:39 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:19:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:39.887 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:39 np0005548788.localdomain kernel: device tap8fdc5379-5c entered promiscuous mode
Dec 06 10:19:39 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016379.8964] manager: (tap8fdc5379-5c): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Dec 06 10:19:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:39Z|00168|binding|INFO|Claiming lport 8fdc5379-5c86-41b4-8c56-dabef74615ea for this chassis.
Dec 06 10:19:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:39Z|00169|binding|INFO|8fdc5379-5c86-41b4-8c56-dabef74615ea: Claiming unknown
Dec 06 10:19:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:39.897 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:39 np0005548788.localdomain podman[319058]: 2025-12-06 10:19:39.899633248 +0000 UTC m=+0.162481234 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:19:39 np0005548788.localdomain systemd-udevd[319124]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:39 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:19:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:39.921 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8fdc5379-5c: No such device
Dec 06 10:19:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:39.925 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:39Z|00170|binding|INFO|Setting lport 8fdc5379-5c86-41b4-8c56-dabef74615ea ovn-installed in OVS
Dec 06 10:19:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8fdc5379-5c: No such device
Dec 06 10:19:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:39.928 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8fdc5379-5c: No such device
Dec 06 10:19:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8fdc5379-5c: No such device
Dec 06 10:19:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8fdc5379-5c: No such device
Dec 06 10:19:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8fdc5379-5c: No such device
Dec 06 10:19:39 np0005548788.localdomain podman[319060]: 2025-12-06 10:19:39.946516571 +0000 UTC m=+0.201246395 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:19:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8fdc5379-5c: No such device
Dec 06 10:19:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8fdc5379-5c: No such device
Dec 06 10:19:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:39Z|00171|binding|INFO|Setting lport 8fdc5379-5c86-41b4-8c56-dabef74615ea up in Southbound
Dec 06 10:19:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:39.955 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528a9f17-509a-4c49-a9ac-4a6363f2178f, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=8fdc5379-5c86-41b4-8c56-dabef74615ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:39.956 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 8fdc5379-5c86-41b4-8c56-dabef74615ea in datapath 667a7cf2-00f8-4896-8e3d-8222fad7f397 bound to our chassis
Dec 06 10:19:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:39.957 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 667a7cf2-00f8-4896-8e3d-8222fad7f397 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:39 np0005548788.localdomain podman[319060]: 2025-12-06 10:19:39.958738679 +0000 UTC m=+0.213468583 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, release=1755695350, version=9.6, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:19:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:39.958 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[798346d5-0b9a-49d6-967a-1dc3fb09b53a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:39.963 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:39 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:19:39 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:39.984 262572 INFO neutron.agent.dhcp.agent [None req-73e20201-22df-4a33-8008-dea4941ac3ff - - - - - -] DHCP configuration for ports {'e97b3f05-2055-43cd-83b9-acb38a4bf9ac'} is completed
Dec 06 10:19:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:40.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:40.024 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:19:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:19:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:19:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:40.545 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:40 np0005548788.localdomain ceph-mon[293643]: pgmap v310: 177 pgs: 177 active+clean; 238 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Dec 06 10:19:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-659509012", "format": "json"} : dispatch
Dec 06 10:19:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:19:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:19:40 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:19:40 np0005548788.localdomain podman[319204]: 
Dec 06 10:19:40 np0005548788.localdomain podman[319204]: 2025-12-06 10:19:40.971400827 +0000 UTC m=+0.081785964 container create ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:19:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:41.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:41 np0005548788.localdomain systemd[1]: Started libpod-conmon-ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79.scope.
Dec 06 10:19:41 np0005548788.localdomain podman[319204]: 2025-12-06 10:19:40.927099145 +0000 UTC m=+0.037484312 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:41 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:41 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/858cd1fae0b5df6ce4b437e06606af4b948e3caf1993ccdd7912db11609b6ed6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:41 np0005548788.localdomain podman[319204]: 2025-12-06 10:19:41.065295975 +0000 UTC m=+0.175681112 container init ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:41 np0005548788.localdomain podman[319204]: 2025-12-06 10:19:41.07544483 +0000 UTC m=+0.185829957 container start ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:19:41 np0005548788.localdomain dnsmasq[319222]: started, version 2.85 cachesize 150
Dec 06 10:19:41 np0005548788.localdomain dnsmasq[319222]: DNS service limited to local subnets
Dec 06 10:19:41 np0005548788.localdomain dnsmasq[319222]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:41 np0005548788.localdomain dnsmasq[319222]: warning: no upstream servers configured
Dec 06 10:19:41 np0005548788.localdomain dnsmasq-dhcp[319222]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:41 np0005548788.localdomain dnsmasq[319222]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 0 addresses
Dec 06 10:19:41 np0005548788.localdomain dnsmasq-dhcp[319222]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:19:41 np0005548788.localdomain dnsmasq-dhcp[319222]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"} v 0)
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"} : dispatch
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"}]': finished
Dec 06 10:19:41 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:41.549 262572 INFO neutron.agent.dhcp.agent [None req-830c01ee-0f03-4bb0-92bc-3fffa02983f2 - - - - - -] DHCP configuration for ports {'659e29bd-a84c-4733-b754-dbb7b70b98cc', '4f359c22-d41c-4075-b1d7-af5b57282e35'} is completed
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "auth_id": "tempest-cephx-id-659509012", "tenant_id": "d694f30d513746329568207534277c9c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/467307692' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/467307692' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-659509012", "format": "json"} : dispatch
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"} : dispatch
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"} : dispatch
Dec 06 10:19:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"}]': finished
Dec 06 10:19:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:42.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:42 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:42.199 262572 INFO neutron.agent.linux.ip_lib [None req-7751340c-7534-419a-a491-fa634e101689 - - - - - -] Device tap8c1e5a28-dc cannot be used as it has no MAC address
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:42.233 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:42 np0005548788.localdomain kernel: device tap8c1e5a28-dc entered promiscuous mode
Dec 06 10:19:42 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016382.2410] manager: (tap8c1e5a28-dc): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Dec 06 10:19:42 np0005548788.localdomain systemd-udevd[319126]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:42 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:42Z|00172|binding|INFO|Claiming lport 8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b for this chassis.
Dec 06 10:19:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:42.245 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:42 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:42Z|00173|binding|INFO|8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b: Claiming unknown
Dec 06 10:19:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:42.258 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-60e0b038-f342-417c-a752-f0ee5b99d802', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60e0b038-f342-417c-a752-f0ee5b99d802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a37023c5-d1b8-4cda-9b3d-bacf98434407, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:42.260 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b in datapath 60e0b038-f342-417c-a752-f0ee5b99d802 bound to our chassis
Dec 06 10:19:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:42.262 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 60e0b038-f342-417c-a752-f0ee5b99d802 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:42.263 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a725b34c-2bbc-409c-a489-78720f98f190]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:42 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c1e5a28-dc: No such device
Dec 06 10:19:42 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:42Z|00174|binding|INFO|Setting lport 8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b ovn-installed in OVS
Dec 06 10:19:42 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:42Z|00175|binding|INFO|Setting lport 8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b up in Southbound
Dec 06 10:19:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:42.284 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:42 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c1e5a28-dc: No such device
Dec 06 10:19:42 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c1e5a28-dc: No such device
Dec 06 10:19:42 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c1e5a28-dc: No such device
Dec 06 10:19:42 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c1e5a28-dc: No such device
Dec 06 10:19:42 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c1e5a28-dc: No such device
Dec 06 10:19:42 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c1e5a28-dc: No such device
Dec 06 10:19:42 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap8c1e5a28-dc: No such device
Dec 06 10:19:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:42.361 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:42.386 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: pgmap v311: 177 pgs: 177 active+clean; 238 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 67 op/s
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "auth_id": "tempest-cephx-id-659509012", "format": "json"}]: dispatch
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "auth_id": "tempest-cephx-id-659509012", "format": "json"}]: dispatch
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7a05360b-59a7-495e-a884-ff87c0880377", "format": "json"}]: dispatch
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "force": true, "format": "json"}]: dispatch
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/139921279' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/139921279' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:43.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:43.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:19:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:43.007 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:19:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:43.253 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:19:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:43.315 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:43 np0005548788.localdomain podman[319302]: 
Dec 06 10:19:43 np0005548788.localdomain podman[319302]: 2025-12-06 10:19:43.329883372 +0000 UTC m=+0.098997957 container create 96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-60e0b038-f342-417c-a752-f0ee5b99d802, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:43 np0005548788.localdomain systemd[1]: Started libpod-conmon-96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44.scope.
Dec 06 10:19:43 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:43 np0005548788.localdomain podman[319302]: 2025-12-06 10:19:43.284578219 +0000 UTC m=+0.053692884 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:43 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01ecd3a90f083b79c53d27c577db10e1616a356bb4357024bda8f9ac0c5b7cc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:43 np0005548788.localdomain podman[319302]: 2025-12-06 10:19:43.394101211 +0000 UTC m=+0.163215796 container init 96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-60e0b038-f342-417c-a752-f0ee5b99d802, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:19:43 np0005548788.localdomain podman[319302]: 2025-12-06 10:19:43.410393186 +0000 UTC m=+0.179507771 container start 96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-60e0b038-f342-417c-a752-f0ee5b99d802, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:43 np0005548788.localdomain dnsmasq[319320]: started, version 2.85 cachesize 150
Dec 06 10:19:43 np0005548788.localdomain dnsmasq[319320]: DNS service limited to local subnets
Dec 06 10:19:43 np0005548788.localdomain dnsmasq[319320]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:43 np0005548788.localdomain dnsmasq[319320]: warning: no upstream servers configured
Dec 06 10:19:43 np0005548788.localdomain dnsmasq-dhcp[319320]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Dec 06 10:19:43 np0005548788.localdomain dnsmasq[319320]: read /var/lib/neutron/dhcp/60e0b038-f342-417c-a752-f0ee5b99d802/addn_hosts - 0 addresses
Dec 06 10:19:43 np0005548788.localdomain dnsmasq-dhcp[319320]: read /var/lib/neutron/dhcp/60e0b038-f342-417c-a752-f0ee5b99d802/host
Dec 06 10:19:43 np0005548788.localdomain dnsmasq-dhcp[319320]: read /var/lib/neutron/dhcp/60e0b038-f342-417c-a752-f0ee5b99d802/opts
Dec 06 10:19:43 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:43.624 2 INFO neutron.agent.securitygroups_rpc [None req-f21d32c3-41e3-465d-a5ba-39b4b631a0c1 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['ae1eaa44-7360-485a-b85b-f1bfb95ce20b']
Dec 06 10:19:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d55ecf85-9ebf-4a13-b414-b26993895e14", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:19:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d55ecf85-9ebf-4a13-b414-b26993895e14", "format": "json"}]: dispatch
Dec 06 10:19:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:43.714 262572 INFO neutron.agent.dhcp.agent [None req-a3caa3db-2c2c-4a08-b1ad-2a6286c3ff39 - - - - - -] DHCP configuration for ports {'a94b76f3-94f8-404c-880c-12784a979c0b'} is completed
Dec 06 10:19:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:43.739 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:42Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6772d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6772c70>], id=35aebe9e-8b9a-4ff3-95e5-b702e30e91d0, ip_allocation=immediate, mac_address=fa:16:3e:ed:12:20, name=tempest-PortsTestJSON-1168819908, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:32Z, description=, dns_domain=, id=667a7cf2-00f8-4896-8e3d-8222fad7f397, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-584577498, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1799, status=ACTIVE, subnets=['4f8ef0f6-e026-4624-bbba-07619cd14b08'], tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:38Z, vlan_transparent=None, network_id=667a7cf2-00f8-4896-8e3d-8222fad7f397, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ae1eaa44-7360-485a-b85b-f1bfb95ce20b'], standard_attr_id=2215, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:42Z on network 667a7cf2-00f8-4896-8e3d-8222fad7f397
Dec 06 10:19:43 np0005548788.localdomain dnsmasq[319222]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 1 addresses
Dec 06 10:19:43 np0005548788.localdomain dnsmasq-dhcp[319222]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:19:43 np0005548788.localdomain podman[319336]: 2025-12-06 10:19:43.979399611 +0000 UTC m=+0.055128349 container kill ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:19:43 np0005548788.localdomain dnsmasq-dhcp[319222]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:19:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:44.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:44 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:44.492 262572 INFO neutron.agent.dhcp.agent [None req-782ab84d-d5c9-4045-92c5-339c5bc90d14 - - - - - -] DHCP configuration for ports {'35aebe9e-8b9a-4ff3-95e5-b702e30e91d0'} is completed
Dec 06 10:19:44 np0005548788.localdomain ceph-mon[293643]: pgmap v312: 177 pgs: 177 active+clean; 192 MiB data, 927 MiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:19:44 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1258063083' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:45.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:45.523 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:45.524 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:45.524 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:45.524 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:19:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:45.525 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:19:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:45.556 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:45 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2619362809' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:19:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2321015285' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:45.996 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.235 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.237 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11531MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.238 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.238 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.709 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.710 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:19:46 np0005548788.localdomain ceph-mon[293643]: pgmap v313: 177 pgs: 177 active+clean; 192 MiB data, 927 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 1.8 MiB/s wr, 70 op/s
Dec 06 10:19:46 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d55ecf85-9ebf-4a13-b414-b26993895e14", "format": "json"}]: dispatch
Dec 06 10:19:46 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d55ecf85-9ebf-4a13-b414-b26993895e14", "force": true, "format": "json"}]: dispatch
Dec 06 10:19:46 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2321015285' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.737 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.951 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.952 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.967 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:19:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:46.994 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE4A,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:19:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:47.019 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:19:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:47.442 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:47.443 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:47.443 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:19:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2332530373' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:47.510 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:19:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:47.518 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:19:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:47.561 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:19:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:47.564 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:19:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:47.567 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:47 np0005548788.localdomain dnsmasq[319222]: exiting on receipt of SIGTERM
Dec 06 10:19:47 np0005548788.localdomain podman[319417]: 2025-12-06 10:19:47.615612414 +0000 UTC m=+0.067229553 container kill ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:47 np0005548788.localdomain systemd[1]: libpod-ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79.scope: Deactivated successfully.
Dec 06 10:19:47 np0005548788.localdomain podman[319431]: 2025-12-06 10:19:47.690624997 +0000 UTC m=+0.058139191 container died ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:19:47 np0005548788.localdomain podman[319431]: 2025-12-06 10:19:47.725326572 +0000 UTC m=+0.092840736 container cleanup ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:19:47 np0005548788.localdomain systemd[1]: libpod-conmon-ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79.scope: Deactivated successfully.
Dec 06 10:19:47 np0005548788.localdomain podman[319433]: 2025-12-06 10:19:47.754427353 +0000 UTC m=+0.112477994 container remove ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:19:48 np0005548788.localdomain ceph-mon[293643]: pgmap v314: 177 pgs: 177 active+clean; 192 MiB data, 927 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 1.8 MiB/s wr, 70 op/s
Dec 06 10:19:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2332530373' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:48.317 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:48.597 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528a9f17-509a-4c49-a9ac-4a6363f2178f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=659e29bd-a84c-4733-b754-dbb7b70b98cc) old=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:48.600 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 659e29bd-a84c-4733-b754-dbb7b70b98cc in datapath 667a7cf2-00f8-4896-8e3d-8222fad7f397 updated
Dec 06 10:19:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:48.603 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 34ac1143-d926-44f4-b7e7-5664d64831d2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:19:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:48.603 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 667a7cf2-00f8-4896-8e3d-8222fad7f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:48 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:48.604 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[06376b8f-26b2-4555-90bf-1f2931eb95bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-858cd1fae0b5df6ce4b437e06606af4b948e3caf1993ccdd7912db11609b6ed6-merged.mount: Deactivated successfully.
Dec 06 10:19:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea808fc4b1db018c2fc196d9bc30490494d1fed5ba0409a44a8621b869df3b79-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:49.568 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:19:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:19:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:19:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158563 "" "Go-http-client/1.1"
Dec 06 10:19:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:19:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19682 "" "Go-http-client/1.1"
Dec 06 10:19:50 np0005548788.localdomain ceph-mon[293643]: pgmap v315: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 1.8 MiB/s wr, 72 op/s
Dec 06 10:19:50 np0005548788.localdomain dnsmasq[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/addn_hosts - 0 addresses
Dec 06 10:19:50 np0005548788.localdomain dnsmasq-dhcp[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/host
Dec 06 10:19:50 np0005548788.localdomain dnsmasq-dhcp[318982]: read /var/lib/neutron/dhcp/976b104f-da3c-4aa9-9046-5852081ea282/opts
Dec 06 10:19:50 np0005548788.localdomain podman[319500]: 2025-12-06 10:19:50.507297935 +0000 UTC m=+0.071932200 container kill cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:50.578 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:50 np0005548788.localdomain kernel: device tap76d60975-0f left promiscuous mode
Dec 06 10:19:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:50Z|00176|binding|INFO|Releasing lport 76d60975-0f93-45fb-86f9-d506ef719583 from this chassis (sb_readonly=0)
Dec 06 10:19:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:19:50Z|00177|binding|INFO|Setting lport 76d60975-0f93-45fb-86f9-d506ef719583 down in Southbound
Dec 06 10:19:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:50.731 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:50.749 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:50.855 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-976b104f-da3c-4aa9-9046-5852081ea282', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-976b104f-da3c-4aa9-9046-5852081ea282', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=182262b6-0716-4101-82ed-ce935388bd2a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=76d60975-0f93-45fb-86f9-d506ef719583) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:50.861 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 76d60975-0f93-45fb-86f9-d506ef719583 in datapath 976b104f-da3c-4aa9-9046-5852081ea282 unbound from our chassis
Dec 06 10:19:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:50.862 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 976b104f-da3c-4aa9-9046-5852081ea282 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:19:50.864 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[6113f730-bd22-4be2-9474-60313f5d9716]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:50 np0005548788.localdomain podman[319546]: 
Dec 06 10:19:50 np0005548788.localdomain podman[319546]: 2025-12-06 10:19:50.897121859 +0000 UTC m=+0.079226644 container create a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:19:50 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:50.922 2 INFO neutron.agent.securitygroups_rpc [None req-8ab54d3f-0dba-4adf-88cd-ebbf59b7b541 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['86fafa90-40d2-4e2b-87d7-dc3d530576aa', 'ae1eaa44-7360-485a-b85b-f1bfb95ce20b']
Dec 06 10:19:50 np0005548788.localdomain systemd[1]: Started libpod-conmon-a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322.scope.
Dec 06 10:19:50 np0005548788.localdomain systemd[1]: tmp-crun.Cx3UiD.mount: Deactivated successfully.
Dec 06 10:19:50 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:50 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9335b9261f0728dd6ee631e71c7bacc30428a55ccde28ed03ff8d03d362734bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:50 np0005548788.localdomain podman[319546]: 2025-12-06 10:19:50.863796267 +0000 UTC m=+0.045901122 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:50 np0005548788.localdomain podman[319546]: 2025-12-06 10:19:50.969379858 +0000 UTC m=+0.151484663 container init a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:50 np0005548788.localdomain podman[319546]: 2025-12-06 10:19:50.978320865 +0000 UTC m=+0.160425680 container start a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:50 np0005548788.localdomain dnsmasq[319564]: started, version 2.85 cachesize 150
Dec 06 10:19:50 np0005548788.localdomain dnsmasq[319564]: DNS service limited to local subnets
Dec 06 10:19:50 np0005548788.localdomain dnsmasq[319564]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:50 np0005548788.localdomain dnsmasq[319564]: warning: no upstream servers configured
Dec 06 10:19:50 np0005548788.localdomain dnsmasq-dhcp[319564]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 06 10:19:50 np0005548788.localdomain dnsmasq-dhcp[319564]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:50 np0005548788.localdomain dnsmasq[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 1 addresses
Dec 06 10:19:50 np0005548788.localdomain dnsmasq-dhcp[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:19:50 np0005548788.localdomain dnsmasq-dhcp[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:19:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:51.157 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:42Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6879eb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6879e50>], id=35aebe9e-8b9a-4ff3-95e5-b702e30e91d0, ip_allocation=immediate, mac_address=fa:16:3e:ed:12:20, name=tempest-PortsTestJSON-439971049, network_id=667a7cf2-00f8-4896-8e3d-8222fad7f397, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['86fafa90-40d2-4e2b-87d7-dc3d530576aa'], standard_attr_id=2215, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:50Z on network 667a7cf2-00f8-4896-8e3d-8222fad7f397
Dec 06 10:19:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:51.162 262572 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmp08_u_2b7/privsep.sock']
Dec 06 10:19:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:19:51 np0005548788.localdomain podman[319569]: 2025-12-06 10:19:51.50147912 +0000 UTC m=+0.076862383 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:19:51 np0005548788.localdomain podman[319569]: 2025-12-06 10:19:51.546951158 +0000 UTC m=+0.122334421 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec 06 10:19:51 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:19:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:51.594 262572 INFO neutron.agent.dhcp.agent [None req-cff2ca2a-8d05-4125-8a71-887a93f41bdb - - - - - -] DHCP configuration for ports {'35aebe9e-8b9a-4ff3-95e5-b702e30e91d0', '659e29bd-a84c-4733-b754-dbb7b70b98cc', '4f359c22-d41c-4075-b1d7-af5b57282e35', '8fdc5379-5c86-41b4-8c56-dabef74615ea'} is completed
Dec 06 10:19:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:51.783 262572 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 10:19:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:51.664 319587 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 10:19:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:51.669 319587 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 10:19:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:51.672 319587 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 06 10:19:51 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:51.673 319587 INFO oslo.privsep.daemon [-] privsep daemon running as pid 319587
Dec 06 10:19:52 np0005548788.localdomain dnsmasq-dhcp[319564]: DHCPRELEASE(tap8fdc5379-5c) 10.100.0.12 fa:16:3e:ed:12:20
Dec 06 10:19:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:52 np0005548788.localdomain ceph-mon[293643]: pgmap v316: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 13 KiB/s wr, 35 op/s
Dec 06 10:19:52 np0005548788.localdomain podman[319610]: 2025-12-06 10:19:52.705692451 +0000 UTC m=+0.061372133 container kill a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:52 np0005548788.localdomain dnsmasq[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 1 addresses
Dec 06 10:19:52 np0005548788.localdomain dnsmasq-dhcp[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:19:52 np0005548788.localdomain dnsmasq-dhcp[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:19:53 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:19:53.187 2 INFO neutron.agent.securitygroups_rpc [None req-cbc27e7f-4bef-4dfe-ad5b-dd1345427342 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['86fafa90-40d2-4e2b-87d7-dc3d530576aa']
Dec 06 10:19:53 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:53.212 262572 INFO neutron.agent.dhcp.agent [None req-07f86f48-84e0-4276-847c-919cf6391b12 - - - - - -] DHCP configuration for ports {'35aebe9e-8b9a-4ff3-95e5-b702e30e91d0'} is completed
Dec 06 10:19:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:53.319 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:54 np0005548788.localdomain dnsmasq[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 0 addresses
Dec 06 10:19:54 np0005548788.localdomain dnsmasq-dhcp[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:19:54 np0005548788.localdomain podman[319646]: 2025-12-06 10:19:54.010242548 +0000 UTC m=+0.064135467 container kill a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:54 np0005548788.localdomain dnsmasq-dhcp[319564]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:19:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:19:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:19:54 np0005548788.localdomain podman[319661]: 2025-12-06 10:19:54.136278872 +0000 UTC m=+0.094208588 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:54 np0005548788.localdomain podman[319660]: 2025-12-06 10:19:54.197990414 +0000 UTC m=+0.157850300 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:19:54 np0005548788.localdomain systemd[1]: tmp-crun.0tx8jr.mount: Deactivated successfully.
Dec 06 10:19:54 np0005548788.localdomain podman[319661]: 2025-12-06 10:19:54.220030337 +0000 UTC m=+0.177960103 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:19:54 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:19:54 np0005548788.localdomain podman[319660]: 2025-12-06 10:19:54.235105024 +0000 UTC m=+0.194964930 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:19:54 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: pgmap v317: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 14 KiB/s wr, 36 op/s
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.316787) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394316839, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 621, "num_deletes": 251, "total_data_size": 358215, "memory_usage": 369840, "flush_reason": "Manual Compaction"}
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394323312, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 345511, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29818, "largest_seqno": 30438, "table_properties": {"data_size": 342325, "index_size": 1041, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8349, "raw_average_key_size": 20, "raw_value_size": 335709, "raw_average_value_size": 818, "num_data_blocks": 46, "num_entries": 410, "num_filter_entries": 410, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016367, "oldest_key_time": 1765016367, "file_creation_time": 1765016394, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 6587 microseconds, and 2704 cpu microseconds.
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.323373) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 345511 bytes OK
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.323403) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.325403) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.325427) EVENT_LOG_v1 {"time_micros": 1765016394325420, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.325449) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 354757, prev total WAL file size 354757, number of live WAL files 2.
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.325989) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(337KB)], [51(18MB)]
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394326047, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 20181629, "oldest_snapshot_seqno": -1}
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 12835 keys, 18822460 bytes, temperature: kUnknown
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394430439, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 18822460, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18748295, "index_size": 40976, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32133, "raw_key_size": 344198, "raw_average_key_size": 26, "raw_value_size": 18529089, "raw_average_value_size": 1443, "num_data_blocks": 1551, "num_entries": 12835, "num_filter_entries": 12835, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016394, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.430847) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 18822460 bytes
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.433134) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.1 rd, 180.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 18.9 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(112.9) write-amplify(54.5) OK, records in: 13354, records dropped: 519 output_compression: NoCompression
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.433167) EVENT_LOG_v1 {"time_micros": 1765016394433152, "job": 30, "event": "compaction_finished", "compaction_time_micros": 104521, "compaction_time_cpu_micros": 53092, "output_level": 6, "num_output_files": 1, "total_output_size": 18822460, "num_input_records": 13354, "num_output_records": 12835, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394433510, "job": 30, "event": "table_file_deletion", "file_number": 53}
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394436940, "job": 30, "event": "table_file_deletion", "file_number": 51}
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.325882) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.437122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.437135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.437138) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.437141) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:19:54.437144) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:55.581 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:56 np0005548788.localdomain ceph-mon[293643]: pgmap v318: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 4.5 KiB/s wr, 2 op/s
Dec 06 10:19:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:57 np0005548788.localdomain dnsmasq[318982]: exiting on receipt of SIGTERM
Dec 06 10:19:57 np0005548788.localdomain systemd[1]: libpod-cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4.scope: Deactivated successfully.
Dec 06 10:19:57 np0005548788.localdomain podman[319724]: 2025-12-06 10:19:57.747338306 +0000 UTC m=+0.069890216 container kill cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:19:57 np0005548788.localdomain podman[319738]: 2025-12-06 10:19:57.818780399 +0000 UTC m=+0.057168242 container died cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:19:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:57 np0005548788.localdomain podman[319738]: 2025-12-06 10:19:57.846619762 +0000 UTC m=+0.085007565 container cleanup cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:57 np0005548788.localdomain systemd[1]: libpod-conmon-cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4.scope: Deactivated successfully.
Dec 06 10:19:57 np0005548788.localdomain podman[319740]: 2025-12-06 10:19:57.893022989 +0000 UTC m=+0.123607170 container remove cfc98b4ee0b278cda1530cd84b965a5c6ffd83f89fe66bd5289091976a54a0a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-976b104f-da3c-4aa9-9046-5852081ea282, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:19:58.321 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:58 np0005548788.localdomain ceph-mon[293643]: pgmap v319: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 4.5 KiB/s wr, 2 op/s
Dec 06 10:19:58 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-d7853259d6d04c73a16dc50af620f127ddf774738f29486456e6c4ab8a6b0604-merged.mount: Deactivated successfully.
Dec 06 10:19:58 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:19:58.766 262572 INFO neutron.agent.dhcp.agent [None req-036db4e4-7e20-4ce6-afc3-ab67df9edadb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:58 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d976b104f\x2dda3c\x2d4aa9\x2d9046\x2d5852081ea282.mount: Deactivated successfully.
Dec 06 10:19:58 np0005548788.localdomain sudo[319769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:19:58 np0005548788.localdomain sudo[319769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:19:58 np0005548788.localdomain sudo[319769]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:59 np0005548788.localdomain sudo[319787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:19:59 np0005548788.localdomain sudo[319787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:19:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:19:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:59 np0005548788.localdomain sudo[319787]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:59 np0005548788.localdomain dnsmasq[319564]: exiting on receipt of SIGTERM
Dec 06 10:19:59 np0005548788.localdomain podman[319853]: 2025-12-06 10:19:59.708344109 +0000 UTC m=+0.060179475 container kill a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:19:59 np0005548788.localdomain systemd[1]: libpod-a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322.scope: Deactivated successfully.
Dec 06 10:19:59 np0005548788.localdomain podman[319867]: 2025-12-06 10:19:59.788181902 +0000 UTC m=+0.060759493 container died a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:19:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:19:59 np0005548788.localdomain systemd[1]: tmp-crun.RbbP9w.mount: Deactivated successfully.
Dec 06 10:19:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:19:59 np0005548788.localdomain podman[319867]: 2025-12-06 10:19:59.889579723 +0000 UTC m=+0.162157264 container cleanup a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:59 np0005548788.localdomain systemd[1]: libpod-conmon-a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322.scope: Deactivated successfully.
Dec 06 10:19:59 np0005548788.localdomain sudo[319893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:19:59 np0005548788.localdomain podman[319869]: 2025-12-06 10:19:59.914511895 +0000 UTC m=+0.182513314 container remove a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:59 np0005548788.localdomain sudo[319893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:19:59 np0005548788.localdomain sudo[319893]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:20:00 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:00.081 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "format": "json"}]: dispatch
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: pgmap v320: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 5.0 KiB/s wr, 2 op/s
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:20:00 np0005548788.localdomain ceph-mon[293643]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:20:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:00.479 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:00.583 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9335b9261f0728dd6ee631e71c7bacc30428a55ccde28ed03ff8d03d362734bc-merged.mount: Deactivated successfully.
Dec 06 10:20:00 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4c0258b9d3efaa27fd059381f8221f716c51dd365541ecf3b1a9c07eab38322-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:00 np0005548788.localdomain podman[319962]: 
Dec 06 10:20:00 np0005548788.localdomain podman[319962]: 2025-12-06 10:20:00.895286205 +0000 UTC m=+0.087507251 container create 628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:20:00 np0005548788.localdomain systemd[1]: Started libpod-conmon-628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a.scope.
Dec 06 10:20:00 np0005548788.localdomain podman[319962]: 2025-12-06 10:20:00.853546442 +0000 UTC m=+0.045767508 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:00 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:00 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ab17fb1e0cbb83e1c3291ad9fd6a37837e63bc376ef004ca1d36dac645b2027/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:00 np0005548788.localdomain podman[319962]: 2025-12-06 10:20:00.99134141 +0000 UTC m=+0.183562456 container init 628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:20:00 np0005548788.localdomain podman[319962]: 2025-12-06 10:20:00.99876285 +0000 UTC m=+0.190983896 container start 628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:01 np0005548788.localdomain dnsmasq[319981]: started, version 2.85 cachesize 150
Dec 06 10:20:01 np0005548788.localdomain dnsmasq[319981]: DNS service limited to local subnets
Dec 06 10:20:01 np0005548788.localdomain dnsmasq[319981]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:01 np0005548788.localdomain dnsmasq[319981]: warning: no upstream servers configured
Dec 06 10:20:01 np0005548788.localdomain dnsmasq-dhcp[319981]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 06 10:20:01 np0005548788.localdomain dnsmasq[319981]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 0 addresses
Dec 06 10:20:01 np0005548788.localdomain dnsmasq-dhcp[319981]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:20:01 np0005548788.localdomain dnsmasq-dhcp[319981]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:20:01 np0005548788.localdomain dnsmasq[319981]: exiting on receipt of SIGTERM
Dec 06 10:20:01 np0005548788.localdomain podman[319999]: 2025-12-06 10:20:01.44424748 +0000 UTC m=+0.050635230 container kill 628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:20:01 np0005548788.localdomain systemd[1]: libpod-628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a.scope: Deactivated successfully.
Dec 06 10:20:01 np0005548788.localdomain podman[320011]: 2025-12-06 10:20:01.527425006 +0000 UTC m=+0.067788620 container died 628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:20:01 np0005548788.localdomain podman[320011]: 2025-12-06 10:20:01.560044026 +0000 UTC m=+0.100407530 container cleanup 628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:20:01 np0005548788.localdomain systemd[1]: libpod-conmon-628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a.scope: Deactivated successfully.
Dec 06 10:20:01 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:01.584 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528a9f17-509a-4c49-a9ac-4a6363f2178f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=659e29bd-a84c-4733-b754-dbb7b70b98cc) old=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:01 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:01.586 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 659e29bd-a84c-4733-b754-dbb7b70b98cc in datapath 667a7cf2-00f8-4896-8e3d-8222fad7f397 updated
Dec 06 10:20:01 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:01.589 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 34ac1143-d926-44f4-b7e7-5664d64831d2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:20:01 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:01.589 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 667a7cf2-00f8-4896-8e3d-8222fad7f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:01 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:01.588 262572 INFO neutron.agent.dhcp.agent [None req-db269708-9660-4ee0-b112-4e181ccdc54b - - - - - -] DHCP configuration for ports {'659e29bd-a84c-4733-b754-dbb7b70b98cc', '4f359c22-d41c-4075-b1d7-af5b57282e35', '8fdc5379-5c86-41b4-8c56-dabef74615ea'} is completed
Dec 06 10:20:01 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:01.590 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[af414ce0-43ef-4647-9dda-b587dfd7bf63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:01 np0005548788.localdomain podman[320013]: 2025-12-06 10:20:01.599909842 +0000 UTC m=+0.133444335 container remove 628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-6ab17fb1e0cbb83e1c3291ad9fd6a37837e63bc376ef004ca1d36dac645b2027-merged.mount: Deactivated successfully.
Dec 06 10:20:01 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-628d043479a1da452f81b77c376e25c62e65d1d20eae29303302e21bdb5c074a-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:01 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:01.944 262572 INFO neutron.agent.linux.ip_lib [None req-a58b4884-9fea-480f-8513-66cd7f931db3 - - - - - -] Device tapb1894616-10 cannot be used as it has no MAC address
Dec 06 10:20:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:01.992 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:02 np0005548788.localdomain kernel: device tapb1894616-10 entered promiscuous mode
Dec 06 10:20:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:02.001 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:02 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016402.0021] manager: (tapb1894616-10): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Dec 06 10:20:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:02Z|00178|binding|INFO|Claiming lport b1894616-1059-4de5-89ab-fa9085c132e8 for this chassis.
Dec 06 10:20:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:02Z|00179|binding|INFO|b1894616-1059-4de5-89ab-fa9085c132e8: Claiming unknown
Dec 06 10:20:02 np0005548788.localdomain systemd-udevd[320049]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:02.020 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-dca693cd-078c-4362-8195-e12f45811556', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dca693cd-078c-4362-8195-e12f45811556', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b301b340-0ff8-4f83-b325-0efaafe087e3, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=b1894616-1059-4de5-89ab-fa9085c132e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:02.022 159620 INFO neutron.agent.ovn.metadata.agent [-] Port b1894616-1059-4de5-89ab-fa9085c132e8 in datapath dca693cd-078c-4362-8195-e12f45811556 bound to our chassis
Dec 06 10:20:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:02.023 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network dca693cd-078c-4362-8195-e12f45811556 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:02 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:02.024 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[9d74f725-bebe-402b-852f-dbf69aa6396c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb1894616-10: No such device
Dec 06 10:20:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:02Z|00180|binding|INFO|Setting lport b1894616-1059-4de5-89ab-fa9085c132e8 ovn-installed in OVS
Dec 06 10:20:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:02Z|00181|binding|INFO|Setting lport b1894616-1059-4de5-89ab-fa9085c132e8 up in Southbound
Dec 06 10:20:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:02.032 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb1894616-10: No such device
Dec 06 10:20:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb1894616-10: No such device
Dec 06 10:20:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb1894616-10: No such device
Dec 06 10:20:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb1894616-10: No such device
Dec 06 10:20:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb1894616-10: No such device
Dec 06 10:20:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb1894616-10: No such device
Dec 06 10:20:02 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb1894616-10: No such device
Dec 06 10:20:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:02.077 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:20:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:20:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:02.112 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:02 np0005548788.localdomain ceph-mon[293643]: pgmap v321: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s wr, 0 op/s
Dec 06 10:20:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:20:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:03.323 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:03 np0005548788.localdomain podman[320147]: 
Dec 06 10:20:03 np0005548788.localdomain podman[320147]: 2025-12-06 10:20:03.341446196 +0000 UTC m=+0.070311889 container create 918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dca693cd-078c-4362-8195-e12f45811556, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:20:03 np0005548788.localdomain systemd[1]: Started libpod-conmon-918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b.scope.
Dec 06 10:20:03 np0005548788.localdomain podman[320147]: 2025-12-06 10:20:03.309865668 +0000 UTC m=+0.038731371 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:03 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:03 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82dd03bb94cc3a1271eef878ddfdb162e5610be76faae40452fd0360d537dd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:03 np0005548788.localdomain podman[320147]: 2025-12-06 10:20:03.425036715 +0000 UTC m=+0.153902408 container init 918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dca693cd-078c-4362-8195-e12f45811556, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:20:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "snap_name": "86b1b260-491a-4ff7-9199-adafe7f88c8f", "format": "json"}]: dispatch
Dec 06 10:20:03 np0005548788.localdomain podman[320147]: 2025-12-06 10:20:03.437302466 +0000 UTC m=+0.166168159 container start 918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dca693cd-078c-4362-8195-e12f45811556, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320171]: started, version 2.85 cachesize 150
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320171]: DNS service limited to local subnets
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320171]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320171]: warning: no upstream servers configured
Dec 06 10:20:03 np0005548788.localdomain dnsmasq-dhcp[320171]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320171]: read /var/lib/neutron/dhcp/dca693cd-078c-4362-8195-e12f45811556/addn_hosts - 0 addresses
Dec 06 10:20:03 np0005548788.localdomain dnsmasq-dhcp[320171]: read /var/lib/neutron/dhcp/dca693cd-078c-4362-8195-e12f45811556/host
Dec 06 10:20:03 np0005548788.localdomain dnsmasq-dhcp[320171]: read /var/lib/neutron/dhcp/dca693cd-078c-4362-8195-e12f45811556/opts
Dec 06 10:20:03 np0005548788.localdomain podman[320189]: 
Dec 06 10:20:03 np0005548788.localdomain podman[320189]: 2025-12-06 10:20:03.638905399 +0000 UTC m=+0.096653364 container create a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:20:03 np0005548788.localdomain systemd[1]: Started libpod-conmon-a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b.scope.
Dec 06 10:20:03 np0005548788.localdomain podman[320189]: 2025-12-06 10:20:03.59142991 +0000 UTC m=+0.049177925 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:03 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:03 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbfa81f413785b321dff79cb8ec85a404102b449240f3a2f5306bf52fcafad77/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:03 np0005548788.localdomain podman[320189]: 2025-12-06 10:20:03.706878756 +0000 UTC m=+0.164626721 container init a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:03 np0005548788.localdomain podman[320189]: 2025-12-06 10:20:03.715693319 +0000 UTC m=+0.173441284 container start a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320208]: started, version 2.85 cachesize 150
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320208]: DNS service limited to local subnets
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320208]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320208]: warning: no upstream servers configured
Dec 06 10:20:03 np0005548788.localdomain dnsmasq-dhcp[320208]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:03 np0005548788.localdomain dnsmasq-dhcp[320208]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 06 10:20:03 np0005548788.localdomain dnsmasq[320208]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 0 addresses
Dec 06 10:20:03 np0005548788.localdomain dnsmasq-dhcp[320208]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:20:03 np0005548788.localdomain dnsmasq-dhcp[320208]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:20:03 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:03.755 262572 INFO neutron.agent.dhcp.agent [None req-95461819-7106-4350-9fa7-cc10cd0d4a71 - - - - - -] DHCP configuration for ports {'30d05ef4-6977-4f78-8ca1-4803812d9513'} is completed
Dec 06 10:20:04 np0005548788.localdomain ceph-mon[293643]: pgmap v322: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 3.8 KiB/s wr, 1 op/s
Dec 06 10:20:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:05.628 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:05.684 262572 INFO neutron.agent.dhcp.agent [None req-5339955e-b57e-4c7f-969f-a83239e1e5f2 - - - - - -] DHCP configuration for ports {'659e29bd-a84c-4733-b754-dbb7b70b98cc', '4f359c22-d41c-4075-b1d7-af5b57282e35', '8fdc5379-5c86-41b4-8c56-dabef74615ea'} is completed
Dec 06 10:20:06 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:06Z|00182|binding|INFO|Releasing lport b1894616-1059-4de5-89ab-fa9085c132e8 from this chassis (sb_readonly=0)
Dec 06 10:20:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:06.090 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:06 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:06Z|00183|binding|INFO|Setting lport b1894616-1059-4de5-89ab-fa9085c132e8 down in Southbound
Dec 06 10:20:06 np0005548788.localdomain kernel: device tapb1894616-10 left promiscuous mode
Dec 06 10:20:06 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:06.106 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-dca693cd-078c-4362-8195-e12f45811556', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dca693cd-078c-4362-8195-e12f45811556', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b301b340-0ff8-4f83-b325-0efaafe087e3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=b1894616-1059-4de5-89ab-fa9085c132e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:06 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:06.109 159620 INFO neutron.agent.ovn.metadata.agent [-] Port b1894616-1059-4de5-89ab-fa9085c132e8 in datapath dca693cd-078c-4362-8195-e12f45811556 unbound from our chassis
Dec 06 10:20:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:06.112 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:06 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:06.113 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dca693cd-078c-4362-8195-e12f45811556, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:06 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:06.114 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[b49db85f-0c0b-4b3e-907d-4ff96e84f1a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:06 np0005548788.localdomain ceph-mon[293643]: pgmap v323: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 3.2 KiB/s wr, 1 op/s
Dec 06 10:20:06 np0005548788.localdomain dnsmasq[320171]: read /var/lib/neutron/dhcp/dca693cd-078c-4362-8195-e12f45811556/addn_hosts - 0 addresses
Dec 06 10:20:06 np0005548788.localdomain dnsmasq-dhcp[320171]: read /var/lib/neutron/dhcp/dca693cd-078c-4362-8195-e12f45811556/host
Dec 06 10:20:06 np0005548788.localdomain dnsmasq-dhcp[320171]: read /var/lib/neutron/dhcp/dca693cd-078c-4362-8195-e12f45811556/opts
Dec 06 10:20:06 np0005548788.localdomain podman[320228]: 2025-12-06 10:20:06.514486143 +0000 UTC m=+0.067340517 container kill 918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dca693cd-078c-4362-8195-e12f45811556, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:20:06 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent [None req-5a6a80d3-1a20-4add-90a6-f23dd058e94e - - - - - -] Unable to reload_allocations dhcp for dca693cd-078c-4362-8195-e12f45811556.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapb1894616-10 not found in namespace qdhcp-dca693cd-078c-4362-8195-e12f45811556.
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapb1894616-10 not found in namespace qdhcp-dca693cd-078c-4362-8195-e12f45811556.
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.543 262572 ERROR neutron.agent.dhcp.agent 
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.549 262572 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Dec 06 10:20:06 np0005548788.localdomain systemd[1]: tmp-crun.khjEAv.mount: Deactivated successfully.
Dec 06 10:20:06 np0005548788.localdomain podman[320242]: 2025-12-06 10:20:06.633948574 +0000 UTC m=+0.090138664 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:20:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:06.702 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:06 np0005548788.localdomain podman[320242]: 2025-12-06 10:20:06.748932325 +0000 UTC m=+0.205122375 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:06 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:20:06 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:06.813 262572 INFO neutron.agent.dhcp.agent [None req-149f05d0-a1cb-4f76-b752-1ca8b8f2e328 - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:20:06 np0005548788.localdomain dnsmasq[320171]: exiting on receipt of SIGTERM
Dec 06 10:20:06 np0005548788.localdomain podman[320283]: 2025-12-06 10:20:06.994522182 +0000 UTC m=+0.066852592 container kill 918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dca693cd-078c-4362-8195-e12f45811556, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:20:06 np0005548788.localdomain systemd[1]: libpod-918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b.scope: Deactivated successfully.
Dec 06 10:20:07 np0005548788.localdomain podman[320296]: 2025-12-06 10:20:07.078340118 +0000 UTC m=+0.067725198 container died 918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dca693cd-078c-4362-8195-e12f45811556, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:07 np0005548788.localdomain podman[320296]: 2025-12-06 10:20:07.115275402 +0000 UTC m=+0.104660442 container cleanup 918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dca693cd-078c-4362-8195-e12f45811556, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:20:07 np0005548788.localdomain systemd[1]: libpod-conmon-918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b.scope: Deactivated successfully.
Dec 06 10:20:07 np0005548788.localdomain podman[320298]: 2025-12-06 10:20:07.158864722 +0000 UTC m=+0.138256502 container remove 918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dca693cd-078c-4362-8195-e12f45811556, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:20:07 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:07.191 262572 INFO neutron.agent.dhcp.agent [None req-8948dd9f-2fc1-458d-b6f6-09f9aa3a47a1 - - - - - -] Synchronizing state complete
Dec 06 10:20:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:07 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "snap_name": "86b1b260-491a-4ff7-9199-adafe7f88c8f_1b65360c-9474-4053-9db5-09821cc600f9", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:07 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "snap_name": "86b1b260-491a-4ff7-9199-adafe7f88c8f", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c82dd03bb94cc3a1271eef878ddfdb162e5610be76faae40452fd0360d537dd3-merged.mount: Deactivated successfully.
Dec 06 10:20:07 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-918c9a5e508d441f7831b9dee046d8ebfc4dcea8e1f2dbbd97a49af4c7516d4b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:07 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2ddca693cd\x2d078c\x2d4362\x2d8195\x2de12f45811556.mount: Deactivated successfully.
Dec 06 10:20:07 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:07.597 2 INFO neutron.agent.securitygroups_rpc [None req-6d59f8dd-76ee-4672-86ac-2d91b87c0791 260dfc8941214c308c05293af65bdae9 24086b701d6b4d4081d2e63578d18d24 - - default default] Security group member updated ['ea587027-2c02-4165-a90f-98eaf0ce1ddb']
Dec 06 10:20:07 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:07.616 2 INFO neutron.agent.securitygroups_rpc [None req-1250ea59-7c13-4a58-b22f-38de2df53542 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['c05cd5e8-c5d4-4d05-80ba-b6a4af8b3ba8']
Dec 06 10:20:07 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:07.663 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:06Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6823550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c68236a0>], id=8dc13206-e669-4342-8595-6d38e8078380, ip_allocation=immediate, mac_address=fa:16:3e:09:e1:0d, name=tempest-PortsTestJSON-1474947020, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:32Z, description=, dns_domain=, id=667a7cf2-00f8-4896-8e3d-8222fad7f397, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-584577498, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21450, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=1799, status=ACTIVE, subnets=['6baebf2a-d490-47c2-bccd-580b84cbfcc3', '9683cc1d-3f9b-474d-93a2-faebbcbf5aaf'], tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:19:59Z, vlan_transparent=None, network_id=667a7cf2-00f8-4896-8e3d-8222fad7f397, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c05cd5e8-c5d4-4d05-80ba-b6a4af8b3ba8'], standard_attr_id=2309, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:20:07Z on network 667a7cf2-00f8-4896-8e3d-8222fad7f397
Dec 06 10:20:08 np0005548788.localdomain dnsmasq[320208]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 1 addresses
Dec 06 10:20:08 np0005548788.localdomain dnsmasq-dhcp[320208]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:20:08 np0005548788.localdomain dnsmasq-dhcp[320208]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:20:08 np0005548788.localdomain systemd[1]: tmp-crun.SEIX8i.mount: Deactivated successfully.
Dec 06 10:20:08 np0005548788.localdomain podman[320342]: 2025-12-06 10:20:08.051601085 +0000 UTC m=+0.074710015 container kill a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:20:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:08.324 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:08 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:08.403 262572 INFO neutron.agent.dhcp.agent [None req-cb555b27-4a3d-486c-bef4-259c080d3a67 - - - - - -] DHCP configuration for ports {'8dc13206-e669-4342-8595-6d38e8078380'} is completed
Dec 06 10:20:08 np0005548788.localdomain ceph-mon[293643]: pgmap v324: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 3.2 KiB/s wr, 1 op/s
Dec 06 10:20:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:20:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:20:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:20:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:20:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:20:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:20:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:20:09 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:09.132 262572 INFO neutron.agent.linux.ip_lib [None req-afc7536e-c1bd-48c8-8270-603df05694fe - - - - - -] Device tapcfc2be1a-ed cannot be used as it has no MAC address
Dec 06 10:20:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:09.157 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:09 np0005548788.localdomain kernel: device tapcfc2be1a-ed entered promiscuous mode
Dec 06 10:20:09 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:09Z|00184|binding|INFO|Claiming lport cfc2be1a-ed3a-4b84-a6a7-292558e68928 for this chassis.
Dec 06 10:20:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:09.164 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:09 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:09Z|00185|binding|INFO|cfc2be1a-ed3a-4b84-a6a7-292558e68928: Claiming unknown
Dec 06 10:20:09 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016409.1670] manager: (tapcfc2be1a-ed): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Dec 06 10:20:09 np0005548788.localdomain systemd-udevd[320372]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:09 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:09.176 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-61c48b9a-53ba-446c-91d5-cfa06ca15a69', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61c48b9a-53ba-446c-91d5-cfa06ca15a69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b19d6ea6-8bdc-448b-8817-7f548f3c16fd, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=cfc2be1a-ed3a-4b84-a6a7-292558e68928) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:09 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:09.178 159620 INFO neutron.agent.ovn.metadata.agent [-] Port cfc2be1a-ed3a-4b84-a6a7-292558e68928 in datapath 61c48b9a-53ba-446c-91d5-cfa06ca15a69 bound to our chassis
Dec 06 10:20:09 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:09.179 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 61c48b9a-53ba-446c-91d5-cfa06ca15a69 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:09 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:09.180 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[0dafea0c-4dbd-4152-a692-413db9321708]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:09 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapcfc2be1a-ed: No such device
Dec 06 10:20:09 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapcfc2be1a-ed: No such device
Dec 06 10:20:09 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:09Z|00186|binding|INFO|Setting lport cfc2be1a-ed3a-4b84-a6a7-292558e68928 ovn-installed in OVS
Dec 06 10:20:09 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:09Z|00187|binding|INFO|Setting lport cfc2be1a-ed3a-4b84-a6a7-292558e68928 up in Southbound
Dec 06 10:20:09 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapcfc2be1a-ed: No such device
Dec 06 10:20:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:09.198 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:09 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapcfc2be1a-ed: No such device
Dec 06 10:20:09 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapcfc2be1a-ed: No such device
Dec 06 10:20:09 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapcfc2be1a-ed: No such device
Dec 06 10:20:09 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapcfc2be1a-ed: No such device
Dec 06 10:20:09 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapcfc2be1a-ed: No such device
Dec 06 10:20:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:09.238 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:09.268 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:20:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 13K writes, 48K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 13K writes, 4278 syncs, 3.11 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 7696 writes, 24K keys, 7696 commit groups, 1.0 writes per commit group, ingest: 21.13 MB, 0.04 MB/s
                                                          Interval WAL: 7696 writes, 3365 syncs, 2.29 writes per sync, written: 0.02 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:20:10 np0005548788.localdomain podman[320441]: 
Dec 06 10:20:10 np0005548788.localdomain podman[320441]: 2025-12-06 10:20:10.035179767 +0000 UTC m=+0.094633363 container create fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61c48b9a-53ba-446c-91d5-cfa06ca15a69, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: Started libpod-conmon-fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf.scope.
Dec 06 10:20:10 np0005548788.localdomain podman[320441]: 2025-12-06 10:20:09.988579793 +0000 UTC m=+0.048033419 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:10 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e45ec60e897b833469b4cf5223aa6d6feeac0c271b548c74fffefb5c33fd264c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:10 np0005548788.localdomain podman[320441]: 2025-12-06 10:20:10.126814775 +0000 UTC m=+0.186268341 container init fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61c48b9a-53ba-446c-91d5-cfa06ca15a69, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:20:10 np0005548788.localdomain podman[320441]: 2025-12-06 10:20:10.137293019 +0000 UTC m=+0.196746565 container start fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61c48b9a-53ba-446c-91d5-cfa06ca15a69, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:10 np0005548788.localdomain dnsmasq[320503]: started, version 2.85 cachesize 150
Dec 06 10:20:10 np0005548788.localdomain dnsmasq[320503]: DNS service limited to local subnets
Dec 06 10:20:10 np0005548788.localdomain dnsmasq[320503]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:10 np0005548788.localdomain dnsmasq[320503]: warning: no upstream servers configured
Dec 06 10:20:10 np0005548788.localdomain dnsmasq-dhcp[320503]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:10 np0005548788.localdomain dnsmasq[320503]: read /var/lib/neutron/dhcp/61c48b9a-53ba-446c-91d5-cfa06ca15a69/addn_hosts - 0 addresses
Dec 06 10:20:10 np0005548788.localdomain dnsmasq-dhcp[320503]: read /var/lib/neutron/dhcp/61c48b9a-53ba-446c-91d5-cfa06ca15a69/host
Dec 06 10:20:10 np0005548788.localdomain dnsmasq-dhcp[320503]: read /var/lib/neutron/dhcp/61c48b9a-53ba-446c-91d5-cfa06ca15a69/opts
Dec 06 10:20:10 np0005548788.localdomain podman[320458]: 2025-12-06 10:20:10.20703796 +0000 UTC m=+0.133491126 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:20:10 np0005548788.localdomain podman[320458]: 2025-12-06 10:20:10.220706153 +0000 UTC m=+0.147159339 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:20:10 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:10.260 262572 INFO neutron.agent.dhcp.agent [None req-48ecbbb5-f5df-423b-98a4-8c5dca41b5b7 - - - - - -] DHCP configuration for ports {'a506ceee-2189-4b27-9fdc-f0c11c5e38c3'} is completed
Dec 06 10:20:10 np0005548788.localdomain podman[320461]: 2025-12-06 10:20:10.267264176 +0000 UTC m=+0.180777191 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc.)
Dec 06 10:20:10 np0005548788.localdomain podman[320459]: 2025-12-06 10:20:10.173481331 +0000 UTC m=+0.091846277 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:20:10 np0005548788.localdomain podman[320461]: 2025-12-06 10:20:10.283702234 +0000 UTC m=+0.197215299 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6)
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:20:10 np0005548788.localdomain podman[320459]: 2025-12-06 10:20:10.308698649 +0000 UTC m=+0.227063655 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:20:10 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:10.429 2 INFO neutron.agent.securitygroups_rpc [None req-e7d31636-8439-4e6f-9785-a953cb0386af 260dfc8941214c308c05293af65bdae9 24086b701d6b4d4081d2e63578d18d24 - - default default] Security group member updated ['ea587027-2c02-4165-a90f-98eaf0ce1ddb']
Dec 06 10:20:10 np0005548788.localdomain dnsmasq[320208]: exiting on receipt of SIGTERM
Dec 06 10:20:10 np0005548788.localdomain podman[320555]: 2025-12-06 10:20:10.440501801 +0000 UTC m=+0.063359093 container kill a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: libpod-a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b.scope: Deactivated successfully.
Dec 06 10:20:10 np0005548788.localdomain dnsmasq[320503]: exiting on receipt of SIGTERM
Dec 06 10:20:10 np0005548788.localdomain podman[320562]: 2025-12-06 10:20:10.471169261 +0000 UTC m=+0.070684630 container kill fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61c48b9a-53ba-446c-91d5-cfa06ca15a69, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: libpod-fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf.scope: Deactivated successfully.
Dec 06 10:20:10 np0005548788.localdomain podman[320580]: 2025-12-06 10:20:10.521410248 +0000 UTC m=+0.068250635 container died a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e163 do_prune osdmap full prune enabled
Dec 06 10:20:10 np0005548788.localdomain ceph-mon[293643]: pgmap v325: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 7.3 KiB/s wr, 2 op/s
Dec 06 10:20:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "aaafb223-94a1-4885-a088-5199e647d774", "format": "json"}]: dispatch
Dec 06 10:20:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e164 e164: 6 total, 6 up, 6 in
Dec 06 10:20:10 np0005548788.localdomain podman[320603]: 2025-12-06 10:20:10.551727967 +0000 UTC m=+0.063745366 container died fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61c48b9a-53ba-446c-91d5-cfa06ca15a69, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:10 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e164: 6 total, 6 up, 6 in
Dec 06 10:20:10 np0005548788.localdomain podman[320580]: 2025-12-06 10:20:10.65739626 +0000 UTC m=+0.204236607 container cleanup a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:20:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:10.658 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: libpod-conmon-a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b.scope: Deactivated successfully.
Dec 06 10:20:10 np0005548788.localdomain podman[320582]: 2025-12-06 10:20:10.684883302 +0000 UTC m=+0.223309629 container remove a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:10 np0005548788.localdomain podman[320603]: 2025-12-06 10:20:10.711415973 +0000 UTC m=+0.223433322 container cleanup fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61c48b9a-53ba-446c-91d5-cfa06ca15a69, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:20:10 np0005548788.localdomain systemd[1]: libpod-conmon-fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf.scope: Deactivated successfully.
Dec 06 10:20:10 np0005548788.localdomain podman[320605]: 2025-12-06 10:20:10.743445866 +0000 UTC m=+0.249103097 container remove fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61c48b9a-53ba-446c-91d5-cfa06ca15a69, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:10 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:10Z|00188|binding|INFO|Releasing lport cfc2be1a-ed3a-4b84-a6a7-292558e68928 from this chassis (sb_readonly=0)
Dec 06 10:20:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:10.756 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548788.localdomain kernel: device tapcfc2be1a-ed left promiscuous mode
Dec 06 10:20:10 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:10Z|00189|binding|INFO|Setting lport cfc2be1a-ed3a-4b84-a6a7-292558e68928 down in Southbound
Dec 06 10:20:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:10.767 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-61c48b9a-53ba-446c-91d5-cfa06ca15a69', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61c48b9a-53ba-446c-91d5-cfa06ca15a69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b19d6ea6-8bdc-448b-8817-7f548f3c16fd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=cfc2be1a-ed3a-4b84-a6a7-292558e68928) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:10.769 159620 INFO neutron.agent.ovn.metadata.agent [-] Port cfc2be1a-ed3a-4b84-a6a7-292558e68928 in datapath 61c48b9a-53ba-446c-91d5-cfa06ca15a69 unbound from our chassis
Dec 06 10:20:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:10.770 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 61c48b9a-53ba-446c-91d5-cfa06ca15a69 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:10 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:10.771 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[0b81ff14-bf22-4fb2-a100-e057ec8ff056]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:10.779 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:10 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e45ec60e897b833469b4cf5223aa6d6feeac0c271b548c74fffefb5c33fd264c-merged.mount: Deactivated successfully.
Dec 06 10:20:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fbb1c080ccb7960fb9692a246cf8b1a21619d67ba3d2bdc6f511e74583ad8daf-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-cbfa81f413785b321dff79cb8ec85a404102b449240f3a2f5306bf52fcafad77-merged.mount: Deactivated successfully.
Dec 06 10:20:11 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6180d83497334e063eb05ae28ed3ef1d31a01862c1af85834805726000d4d0b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:11 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d61c48b9a\x2d53ba\x2d446c\x2d91d5\x2dcfa06ca15a69.mount: Deactivated successfully.
Dec 06 10:20:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:11.105 262572 INFO neutron.agent.dhcp.agent [None req-3e668ba7-09ff-4fc5-adc2-3d81d696ff7e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:11.106 262572 INFO neutron.agent.dhcp.agent [None req-3e668ba7-09ff-4fc5-adc2-3d81d696ff7e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:11 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:11.173 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:11.211 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528a9f17-509a-4c49-a9ac-4a6363f2178f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=659e29bd-a84c-4733-b754-dbb7b70b98cc) old=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:11.214 159620 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 659e29bd-a84c-4733-b754-dbb7b70b98cc in datapath 667a7cf2-00f8-4896-8e3d-8222fad7f397 updated
Dec 06 10:20:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:11.216 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 34ac1143-d926-44f4-b7e7-5664d64831d2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:20:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:11.217 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 667a7cf2-00f8-4896-8e3d-8222fad7f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:11 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:11.218 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[245e5ae8-80ba-4c86-a25e-759a3397b893]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:11.515 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:11 np0005548788.localdomain ceph-mon[293643]: osdmap e164: 6 total, 6 up, 6 in
Dec 06 10:20:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d4fe01f5-ec52-46f8-9434-cea0a18dbdb7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:12 np0005548788.localdomain podman[320693]: 
Dec 06 10:20:12 np0005548788.localdomain podman[320693]: 2025-12-06 10:20:12.201840699 +0000 UTC m=+0.084682844 container create 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:20:12 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:12.222 2 INFO neutron.agent.securitygroups_rpc [None req-6d93ce58-a6ee-4351-b70e-4269edfdd4c8 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['81acd248-ff6c-407a-a3e7-57e59597aa28', 'c05cd5e8-c5d4-4d05-80ba-b6a4af8b3ba8', '1d275e53-d6a2-4014-8325-c04642bc5279']
Dec 06 10:20:12 np0005548788.localdomain systemd[1]: Started libpod-conmon-59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f.scope.
Dec 06 10:20:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:12.251 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:12 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:12.250 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:12 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:12.254 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:20:12 np0005548788.localdomain podman[320693]: 2025-12-06 10:20:12.160184129 +0000 UTC m=+0.043026314 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:12 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:12 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e44747fda4351a852b62bb2dafe5fc4d81cd3fee3714bcd3e549f186cbb557db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:12 np0005548788.localdomain podman[320693]: 2025-12-06 10:20:12.282744155 +0000 UTC m=+0.165586300 container init 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:20:12 np0005548788.localdomain podman[320693]: 2025-12-06 10:20:12.293246211 +0000 UTC m=+0.176088406 container start 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:20:12 np0005548788.localdomain dnsmasq[320711]: started, version 2.85 cachesize 150
Dec 06 10:20:12 np0005548788.localdomain dnsmasq[320711]: DNS service limited to local subnets
Dec 06 10:20:12 np0005548788.localdomain dnsmasq[320711]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:12 np0005548788.localdomain dnsmasq[320711]: warning: no upstream servers configured
Dec 06 10:20:12 np0005548788.localdomain dnsmasq-dhcp[320711]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:12 np0005548788.localdomain dnsmasq-dhcp[320711]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 06 10:20:12 np0005548788.localdomain dnsmasq-dhcp[320711]: DHCP, static leases only on 10.100.0.32, lease time 1d
Dec 06 10:20:12 np0005548788.localdomain dnsmasq[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 1 addresses
Dec 06 10:20:12 np0005548788.localdomain dnsmasq-dhcp[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:20:12 np0005548788.localdomain dnsmasq-dhcp[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:20:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:12.360 262572 INFO neutron.agent.dhcp.agent [None req-3fdf9953-1e6b-4ed8-93c4-4ff1bb1797a4 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:06Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67eaaf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67ea9d0>], id=8dc13206-e669-4342-8595-6d38e8078380, ip_allocation=immediate, mac_address=fa:16:3e:09:e1:0d, name=tempest-PortsTestJSON-1788146395, network_id=667a7cf2-00f8-4896-8e3d-8222fad7f397, port_security_enabled=True, project_id=5f8e1c4c589749b99178bbc7c2bea3f0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['1d275e53-d6a2-4014-8325-c04642bc5279', '81acd248-ff6c-407a-a3e7-57e59597aa28'], standard_attr_id=2309, status=DOWN, tags=[], tenant_id=5f8e1c4c589749b99178bbc7c2bea3f0, updated_at=2025-12-06T10:20:11Z on network 667a7cf2-00f8-4896-8e3d-8222fad7f397
Dec 06 10:20:12 np0005548788.localdomain dnsmasq-dhcp[320711]: DHCPRELEASE(tap8fdc5379-5c) 10.100.0.6 fa:16:3e:09:e1:0d
Dec 06 10:20:12 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:12.583 262572 INFO neutron.agent.dhcp.agent [None req-ee94b583-266e-40ab-a838-fcf524015165 - - - - - -] DHCP configuration for ports {'8dc13206-e669-4342-8595-6d38e8078380', '659e29bd-a84c-4733-b754-dbb7b70b98cc', '4f359c22-d41c-4075-b1d7-af5b57282e35', '8fdc5379-5c86-41b4-8c56-dabef74615ea'} is completed
Dec 06 10:20:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d4fe01f5-ec52-46f8-9434-cea0a18dbdb7", "format": "json"}]: dispatch
Dec 06 10:20:12 np0005548788.localdomain ceph-mon[293643]: pgmap v327: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 8.2 KiB/s wr, 2 op/s
Dec 06 10:20:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:12 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:12.872 2 INFO neutron.agent.securitygroups_rpc [None req-5ab424f2-09a3-4942-a99f-ad10877e0761 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['81acd248-ff6c-407a-a3e7-57e59597aa28', '1d275e53-d6a2-4014-8325-c04642bc5279']
Dec 06 10:20:12 np0005548788.localdomain dnsmasq[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 1 addresses
Dec 06 10:20:12 np0005548788.localdomain dnsmasq-dhcp[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:20:12 np0005548788.localdomain dnsmasq-dhcp[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:20:12 np0005548788.localdomain podman[320730]: 2025-12-06 10:20:12.960117517 +0000 UTC m=+0.058483552 container kill 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:13 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:13.221 262572 INFO neutron.agent.dhcp.agent [None req-c096556d-fd60-406c-bb26-fc454b16cf55 - - - - - -] DHCP configuration for ports {'8dc13206-e669-4342-8595-6d38e8078380'} is completed
Dec 06 10:20:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e164 do_prune osdmap full prune enabled
Dec 06 10:20:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e165 e165: 6 total, 6 up, 6 in
Dec 06 10:20:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e165: 6 total, 6 up, 6 in
Dec 06 10:20:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:13.326 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:13 np0005548788.localdomain podman[320769]: 2025-12-06 10:20:13.505538282 +0000 UTC m=+0.065245293 container kill 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:13 np0005548788.localdomain dnsmasq[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 0 addresses
Dec 06 10:20:13 np0005548788.localdomain dnsmasq-dhcp[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:20:13 np0005548788.localdomain dnsmasq-dhcp[320711]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:20:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7adc54b5-eca0-4f1a-9df6-2cf3e56ffc2b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7adc54b5-eca0-4f1a-9df6-2cf3e56ffc2b", "format": "json"}]: dispatch
Dec 06 10:20:13 np0005548788.localdomain ceph-mon[293643]: osdmap e165: 6 total, 6 up, 6 in
Dec 06 10:20:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:20:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.2 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 38K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2729 syncs, 3.70 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4714 writes, 14K keys, 4714 commit groups, 1.0 writes per commit group, ingest: 11.91 MB, 0.02 MB/s
                                                          Interval WAL: 4714 writes, 2042 syncs, 2.31 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:20:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:14 np0005548788.localdomain podman[320806]: 2025-12-06 10:20:14.53983939 +0000 UTC m=+0.063333933 container kill 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:20:14 np0005548788.localdomain dnsmasq[320711]: exiting on receipt of SIGTERM
Dec 06 10:20:14 np0005548788.localdomain systemd[1]: libpod-59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f.scope: Deactivated successfully.
Dec 06 10:20:14 np0005548788.localdomain podman[320819]: 2025-12-06 10:20:14.613424619 +0000 UTC m=+0.054145258 container died 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:14 np0005548788.localdomain ceph-mon[293643]: pgmap v328: 177 pgs: 177 active+clean; 192 MiB data, 896 MiB used, 41 GiB / 42 GiB avail; 3.3 KiB/s rd, 15 KiB/s wr, 9 op/s
Dec 06 10:20:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:14 np0005548788.localdomain podman[320819]: 2025-12-06 10:20:14.654555823 +0000 UTC m=+0.095276412 container cleanup 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:20:14 np0005548788.localdomain systemd[1]: libpod-conmon-59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f.scope: Deactivated successfully.
Dec 06 10:20:14 np0005548788.localdomain podman[320820]: 2025-12-06 10:20:14.695892633 +0000 UTC m=+0.133965720 container remove 59f3455f83b84ec667c2cf5f6b9a40fbb6faf71de9522df3046f01cc56b7d65f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:20:15 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:15.329 262572 INFO neutron.agent.linux.ip_lib [None req-2d20d474-cc56-484a-96c0-d11337cae671 - - - - - -] Device tapfb4c7ddd-2f cannot be used as it has no MAC address
Dec 06 10:20:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:15.391 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:15 np0005548788.localdomain kernel: device tapfb4c7ddd-2f entered promiscuous mode
Dec 06 10:20:15 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016415.4011] manager: (tapfb4c7ddd-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Dec 06 10:20:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:15Z|00190|binding|INFO|Claiming lport fb4c7ddd-2f3b-4a70-b82a-872347c6cd57 for this chassis.
Dec 06 10:20:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:15Z|00191|binding|INFO|fb4c7ddd-2f3b-4a70-b82a-872347c6cd57: Claiming unknown
Dec 06 10:20:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:15.408 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:15.410 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-8ff8ee66-9289-47bd-9212-2f68930a3d99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ff8ee66-9289-47bd-9212-2f68930a3d99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76710af8-20ca-4320-8660-69d1eec67f3e, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=fb4c7ddd-2f3b-4a70-b82a-872347c6cd57) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:15 np0005548788.localdomain systemd-udevd[320885]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:15.414 159620 INFO neutron.agent.ovn.metadata.agent [-] Port fb4c7ddd-2f3b-4a70-b82a-872347c6cd57 in datapath 8ff8ee66-9289-47bd-9212-2f68930a3d99 bound to our chassis
Dec 06 10:20:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:15.416 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8ff8ee66-9289-47bd-9212-2f68930a3d99 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:15 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:15.418 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a93b91f8-fd4c-4790-a856-96fd2bd0eca4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfb4c7ddd-2f: No such device
Dec 06 10:20:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:15Z|00192|binding|INFO|Setting lport fb4c7ddd-2f3b-4a70-b82a-872347c6cd57 ovn-installed in OVS
Dec 06 10:20:15 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:15Z|00193|binding|INFO|Setting lport fb4c7ddd-2f3b-4a70-b82a-872347c6cd57 up in Southbound
Dec 06 10:20:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfb4c7ddd-2f: No such device
Dec 06 10:20:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:15.449 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfb4c7ddd-2f: No such device
Dec 06 10:20:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfb4c7ddd-2f: No such device
Dec 06 10:20:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfb4c7ddd-2f: No such device
Dec 06 10:20:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfb4c7ddd-2f: No such device
Dec 06 10:20:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfb4c7ddd-2f: No such device
Dec 06 10:20:15 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapfb4c7ddd-2f: No such device
Dec 06 10:20:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:15.480 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:15.510 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:15 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e44747fda4351a852b62bb2dafe5fc4d81cd3fee3714bcd3e549f186cbb557db-merged.mount: Deactivated successfully.
Dec 06 10:20:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e165 do_prune osdmap full prune enabled
Dec 06 10:20:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4d6e1245-952e-4af4-91c8-d91b7174dcef", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4d6e1245-952e-4af4-91c8-d91b7174dcef", "format": "json"}]: dispatch
Dec 06 10:20:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e166 e166: 6 total, 6 up, 6 in
Dec 06 10:20:15 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e166: 6 total, 6 up, 6 in
Dec 06 10:20:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:15.660 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:15 np0005548788.localdomain podman[320935]: 
Dec 06 10:20:15 np0005548788.localdomain podman[320935]: 2025-12-06 10:20:15.69397927 +0000 UTC m=+0.112885009 container create ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:20:15 np0005548788.localdomain podman[320935]: 2025-12-06 10:20:15.641450292 +0000 UTC m=+0.060356021 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:15 np0005548788.localdomain systemd[1]: Started libpod-conmon-ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290.scope.
Dec 06 10:20:15 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:15 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b4d8ee593a9e5ae736c2858103f181e3b6fd77c1377076a87856babcc84065c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:15 np0005548788.localdomain podman[320935]: 2025-12-06 10:20:15.795925087 +0000 UTC m=+0.214830786 container init ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:20:15 np0005548788.localdomain podman[320935]: 2025-12-06 10:20:15.805914427 +0000 UTC m=+0.224820126 container start ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:15 np0005548788.localdomain dnsmasq[320963]: started, version 2.85 cachesize 150
Dec 06 10:20:15 np0005548788.localdomain dnsmasq[320963]: DNS service limited to local subnets
Dec 06 10:20:15 np0005548788.localdomain dnsmasq[320963]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:15 np0005548788.localdomain dnsmasq[320963]: warning: no upstream servers configured
Dec 06 10:20:15 np0005548788.localdomain dnsmasq-dhcp[320963]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 06 10:20:15 np0005548788.localdomain dnsmasq-dhcp[320963]: DHCP, static leases only on 10.100.0.32, lease time 1d
Dec 06 10:20:15 np0005548788.localdomain dnsmasq[320963]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/addn_hosts - 0 addresses
Dec 06 10:20:15 np0005548788.localdomain dnsmasq-dhcp[320963]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/host
Dec 06 10:20:15 np0005548788.localdomain dnsmasq-dhcp[320963]: read /var/lib/neutron/dhcp/667a7cf2-00f8-4896-8e3d-8222fad7f397/opts
Dec 06 10:20:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:16.135 262572 INFO neutron.agent.dhcp.agent [None req-2c3d386b-d650-42d8-b502-2cc87a88d5e3 - - - - - -] DHCP configuration for ports {'659e29bd-a84c-4733-b754-dbb7b70b98cc', '4f359c22-d41c-4075-b1d7-af5b57282e35', '8fdc5379-5c86-41b4-8c56-dabef74615ea'} is completed
Dec 06 10:20:16 np0005548788.localdomain podman[320997]: 
Dec 06 10:20:16 np0005548788.localdomain podman[320997]: 2025-12-06 10:20:16.499392387 +0000 UTC m=+0.095914792 container create 442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ff8ee66-9289-47bd-9212-2f68930a3d99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:20:16 np0005548788.localdomain systemd[1]: Started libpod-conmon-442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b.scope.
Dec 06 10:20:16 np0005548788.localdomain podman[320997]: 2025-12-06 10:20:16.457474829 +0000 UTC m=+0.053997284 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:16 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:16 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5e08d6f3947c16b98387b115d084a54ac01393de8796788c7f1bf76a5202ebe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:16 np0005548788.localdomain podman[320997]: 2025-12-06 10:20:16.592563243 +0000 UTC m=+0.189085648 container init 442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ff8ee66-9289-47bd-9212-2f68930a3d99, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:20:16 np0005548788.localdomain podman[320997]: 2025-12-06 10:20:16.603254824 +0000 UTC m=+0.199777239 container start 442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ff8ee66-9289-47bd-9212-2f68930a3d99, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:16 np0005548788.localdomain dnsmasq[321015]: started, version 2.85 cachesize 150
Dec 06 10:20:16 np0005548788.localdomain dnsmasq[321015]: DNS service limited to local subnets
Dec 06 10:20:16 np0005548788.localdomain dnsmasq[321015]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:16 np0005548788.localdomain dnsmasq[321015]: warning: no upstream servers configured
Dec 06 10:20:16 np0005548788.localdomain dnsmasq-dhcp[321015]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:16 np0005548788.localdomain dnsmasq[321015]: read /var/lib/neutron/dhcp/8ff8ee66-9289-47bd-9212-2f68930a3d99/addn_hosts - 0 addresses
Dec 06 10:20:16 np0005548788.localdomain dnsmasq-dhcp[321015]: read /var/lib/neutron/dhcp/8ff8ee66-9289-47bd-9212-2f68930a3d99/host
Dec 06 10:20:16 np0005548788.localdomain dnsmasq-dhcp[321015]: read /var/lib/neutron/dhcp/8ff8ee66-9289-47bd-9212-2f68930a3d99/opts
Dec 06 10:20:16 np0005548788.localdomain ceph-mon[293643]: pgmap v330: 177 pgs: 177 active+clean; 192 MiB data, 896 MiB used, 41 GiB / 42 GiB avail; 4.1 KiB/s rd, 19 KiB/s wr, 11 op/s
Dec 06 10:20:16 np0005548788.localdomain ceph-mon[293643]: osdmap e166: 6 total, 6 up, 6 in
Dec 06 10:20:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:16.675 262572 INFO neutron.agent.dhcp.agent [None req-8948dd9f-2fc1-458d-b6f6-09f9aa3a47a1 - - - - - -] Synchronizing state
Dec 06 10:20:16 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:16.702 2 INFO neutron.agent.securitygroups_rpc [None req-9395d16c-29ac-47bb-b03d-1c577d966648 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:20:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:16.775 262572 INFO neutron.agent.dhcp.agent [None req-7c5a3b76-e6dd-4c5e-8174-c2c30df52b56 - - - - - -] DHCP configuration for ports {'010543f8-7704-4516-bf42-a49417ffe518'} is completed
Dec 06 10:20:16 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:16Z|00194|binding|INFO|Removing iface tapfb4c7ddd-2f ovn-installed in OVS
Dec 06 10:20:16 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:16Z|00195|binding|INFO|Removing lport fb4c7ddd-2f3b-4a70-b82a-872347c6cd57 ovn-installed in OVS
Dec 06 10:20:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:16.889 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 49bf225a-6da0-443c-88e7-51ccc40a9800 with type ""
Dec 06 10:20:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:16.890 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-8ff8ee66-9289-47bd-9212-2f68930a3d99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ff8ee66-9289-47bd-9212-2f68930a3d99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76710af8-20ca-4320-8660-69d1eec67f3e, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=fb4c7ddd-2f3b-4a70-b82a-872347c6cd57) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:16.890 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:16.894 159620 INFO neutron.agent.ovn.metadata.agent [-] Port fb4c7ddd-2f3b-4a70-b82a-872347c6cd57 in datapath 8ff8ee66-9289-47bd-9212-2f68930a3d99 unbound from our chassis
Dec 06 10:20:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:16.895 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8ff8ee66-9289-47bd-9212-2f68930a3d99 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:16.896 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[012e125b-083f-49f4-bf69-44eda6828b0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:16.897 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:16.988 262572 INFO neutron.agent.dhcp.agent [None req-fbaf1db4-3f1a-4f08-92e6-d47bb9bf34bf - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:20:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:16.989 262572 INFO neutron.agent.dhcp.agent [None req-fbaf1db4-3f1a-4f08-92e6-d47bb9bf34bf - - - - - -] Synchronizing state complete
Dec 06 10:20:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:16.993 262572 INFO neutron.agent.dhcp.agent [None req-0347b40c-9df9-4132-9466-d994b0c5a817 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:16.994 262572 INFO neutron.agent.dhcp.agent [None req-0347b40c-9df9-4132-9466-d994b0c5a817 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:17.063 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e166 do_prune osdmap full prune enabled
Dec 06 10:20:17 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:17Z|00196|binding|INFO|Removing iface tap8fdc5379-5c ovn-installed in OVS
Dec 06 10:20:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:17.253 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 34ac1143-d926-44f4-b7e7-5664d64831d2 with type ""
Dec 06 10:20:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e167 e167: 6 total, 6 up, 6 in
Dec 06 10:20:17 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:17Z|00197|binding|INFO|Removing lport 8fdc5379-5c86-41b4-8c56-dabef74615ea ovn-installed in OVS
Dec 06 10:20:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:17.255 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28 10.100.0.35/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528a9f17-509a-4c49-a9ac-4a6363f2178f, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=8fdc5379-5c86-41b4-8c56-dabef74615ea) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:17.257 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:17.258 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 8fdc5379-5c86-41b4-8c56-dabef74615ea in datapath 667a7cf2-00f8-4896-8e3d-8222fad7f397 unbound from our chassis
Dec 06 10:20:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:17.260 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 667a7cf2-00f8-4896-8e3d-8222fad7f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:17.261 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[432c5654-b608-4bf0-bbde-a835274878c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:17 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e167: 6 total, 6 up, 6 in
Dec 06 10:20:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:17.265 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:17 np0005548788.localdomain dnsmasq[321015]: exiting on receipt of SIGTERM
Dec 06 10:20:17 np0005548788.localdomain podman[321046]: 2025-12-06 10:20:17.27106381 +0000 UTC m=+0.081652680 container kill 442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ff8ee66-9289-47bd-9212-2f68930a3d99, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: libpod-442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b.scope: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain podman[321055]: 2025-12-06 10:20:17.28978216 +0000 UTC m=+0.075968904 container kill ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: libpod-ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290.scope: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain dnsmasq[320963]: exiting on receipt of SIGTERM
Dec 06 10:20:17 np0005548788.localdomain podman[321075]: 2025-12-06 10:20:17.363576116 +0000 UTC m=+0.068308527 container died 442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ff8ee66-9289-47bd-9212-2f68930a3d99, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:17 np0005548788.localdomain podman[321075]: 2025-12-06 10:20:17.393231014 +0000 UTC m=+0.097963345 container cleanup 442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ff8ee66-9289-47bd-9212-2f68930a3d99, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: libpod-conmon-442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b.scope: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain podman[321076]: 2025-12-06 10:20:17.442699967 +0000 UTC m=+0.134347283 container remove 442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ff8ee66-9289-47bd-9212-2f68930a3d99, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:17.455 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:17 np0005548788.localdomain kernel: device tapfb4c7ddd-2f left promiscuous mode
Dec 06 10:20:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:17.469 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:17 np0005548788.localdomain podman[321084]: 2025-12-06 10:20:17.481644003 +0000 UTC m=+0.169150051 container died ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:17.486 262572 INFO neutron.agent.dhcp.agent [None req-0549a64f-c358-44d3-ad26-e3fd2f594e8d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:17.487 262572 INFO neutron.agent.dhcp.agent [None req-0549a64f-c358-44d3-ad26-e3fd2f594e8d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f5e08d6f3947c16b98387b115d084a54ac01393de8796788c7f1bf76a5202ebe-merged.mount: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-442179ae63fd1260abbc18d46daab74d6b718b8da3813daee810d26d95211b3b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-0b4d8ee593a9e5ae736c2858103f181e3b6fd77c1377076a87856babcc84065c-merged.mount: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d8ff8ee66\x2d9289\x2d47bd\x2d9212\x2d2f68930a3d99.mount: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain podman[321084]: 2025-12-06 10:20:17.576228773 +0000 UTC m=+0.263734781 container remove ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-667a7cf2-00f8-4896-8e3d-8222fad7f397, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: libpod-conmon-ebf7151e7712ae0a4e0abe5b286e0246e166aa3c6e50247940ae5999d6e2e290.scope: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:17.588 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:17 np0005548788.localdomain kernel: device tap8fdc5379-5c left promiscuous mode
Dec 06 10:20:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:17.598 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:17 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d667a7cf2\x2d00f8\x2d4896\x2d8e3d\x2d8222fad7f397.mount: Deactivated successfully.
Dec 06 10:20:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:17.615 262572 INFO neutron.agent.dhcp.agent [None req-3c974086-5f11-4663-9e64-61015ad1d4e4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:17.616 262572 INFO neutron.agent.dhcp.agent [None req-3c974086-5f11-4663-9e64-61015ad1d4e4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:17.616 262572 INFO neutron.agent.dhcp.agent [None req-3c974086-5f11-4663-9e64-61015ad1d4e4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:18.000 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:18 np0005548788.localdomain ceph-mon[293643]: pgmap v332: 177 pgs: 177 active+clean; 192 MiB data, 896 MiB used, 41 GiB / 42 GiB avail; 5.0 KiB/s rd, 16 KiB/s wr, 11 op/s
Dec 06 10:20:18 np0005548788.localdomain ceph-mon[293643]: osdmap e167: 6 total, 6 up, 6 in
Dec 06 10:20:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:18.328 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e167 do_prune osdmap full prune enabled
Dec 06 10:20:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e168 e168: 6 total, 6 up, 6 in
Dec 06 10:20:19 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e168: 6 total, 6 up, 6 in
Dec 06 10:20:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4d6e1245-952e-4af4-91c8-d91b7174dcef", "format": "json"}]: dispatch
Dec 06 10:20:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4d6e1245-952e-4af4-91c8-d91b7174dcef", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:20:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:20:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:20:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:20:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:20:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19201 "" "Go-http-client/1.1"
Dec 06 10:20:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e168 do_prune osdmap full prune enabled
Dec 06 10:20:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7adc54b5-eca0-4f1a-9df6-2cf3e56ffc2b", "format": "json"}]: dispatch
Dec 06 10:20:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7adc54b5-eca0-4f1a-9df6-2cf3e56ffc2b", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:20 np0005548788.localdomain ceph-mon[293643]: pgmap v334: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 19 KiB/s wr, 47 op/s
Dec 06 10:20:20 np0005548788.localdomain ceph-mon[293643]: osdmap e168: 6 total, 6 up, 6 in
Dec 06 10:20:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e169 e169: 6 total, 6 up, 6 in
Dec 06 10:20:20 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e169: 6 total, 6 up, 6 in
Dec 06 10:20:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:20.705 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:21.247 262572 INFO neutron.agent.dhcp.agent [None req-fbaf1db4-3f1a-4f08-92e6-d47bb9bf34bf - - - - - -] Synchronizing state
Dec 06 10:20:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e169 do_prune osdmap full prune enabled
Dec 06 10:20:21 np0005548788.localdomain ceph-mon[293643]: osdmap e169: 6 total, 6 up, 6 in
Dec 06 10:20:21 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e170 e170: 6 total, 6 up, 6 in
Dec 06 10:20:21 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e170: 6 total, 6 up, 6 in
Dec 06 10:20:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:21.369 262572 INFO neutron.agent.dhcp.agent [None req-5eb3f742-c0be-44d5-b3e3-d3cd043a4beb - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:20:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:21.370 262572 INFO neutron.agent.dhcp.agent [-] Starting network 5bdc19f6-732f-4ab8-8c1d-af6790669fce dhcp configuration
Dec 06 10:20:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:21.371 262572 INFO neutron.agent.dhcp.agent [-] Finished network 5bdc19f6-732f-4ab8-8c1d-af6790669fce dhcp configuration
Dec 06 10:20:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:21.372 262572 INFO neutron.agent.dhcp.agent [None req-5eb3f742-c0be-44d5-b3e3-d3cd043a4beb - - - - - -] Synchronizing state complete
Dec 06 10:20:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:21.372 262572 INFO neutron.agent.dhcp.agent [None req-a8d5b0b0-2751-476c-b1cd-e6a519648f06 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:21.991 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e170 do_prune osdmap full prune enabled
Dec 06 10:20:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:22.256 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e171 e171: 6 total, 6 up, 6 in
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e171: 6 total, 6 up, 6 in
Dec 06 10:20:22 np0005548788.localdomain podman[321126]: 2025-12-06 10:20:22.26697057 +0000 UTC m=+0.092918349 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:22 np0005548788.localdomain podman[321126]: 2025-12-06 10:20:22.282685277 +0000 UTC m=+0.108633046 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:20:22 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "81753d92-4847-43cb-b357-c4adab052a83", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: pgmap v337: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 21 KiB/s wr, 51 op/s
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "81753d92-4847-43cb-b357-c4adab052a83", "format": "json"}]: dispatch
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: osdmap e170: 6 total, 6 up, 6 in
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: osdmap e171: 6 total, 6 up, 6 in
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e171 do_prune osdmap full prune enabled
Dec 06 10:20:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:23.368 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e172 e172: 6 total, 6 up, 6 in
Dec 06 10:20:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9cf2fd4c-90f2-4604-9f84-44511d67581f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:23 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e172: 6 total, 6 up, 6 in
Dec 06 10:20:23 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:23.711 2 INFO neutron.agent.securitygroups_rpc [None req-048ea5a4-2d0a-4365-a398-a917ab48f027 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['eb426258-160f-4f74-a9d2-50e476134e75']
Dec 06 10:20:23 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:23.992 2 INFO neutron.agent.securitygroups_rpc [None req-6df8f9bf-dce2-42c5-9279-2397b4b4c0d3 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['eb426258-160f-4f74-a9d2-50e476134e75']
Dec 06 10:20:24 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9cf2fd4c-90f2-4604-9f84-44511d67581f", "format": "json"}]: dispatch
Dec 06 10:20:24 np0005548788.localdomain ceph-mon[293643]: pgmap v340: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 29 KiB/s wr, 113 op/s
Dec 06 10:20:24 np0005548788.localdomain ceph-mon[293643]: osdmap e172: 6 total, 6 up, 6 in
Dec 06 10:20:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:20:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:20:25 np0005548788.localdomain systemd[1]: tmp-crun.lq7hI9.mount: Deactivated successfully.
Dec 06 10:20:25 np0005548788.localdomain podman[321146]: 2025-12-06 10:20:25.294309634 +0000 UTC m=+0.120067381 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:20:25 np0005548788.localdomain podman[321146]: 2025-12-06 10:20:25.300585758 +0000 UTC m=+0.126343515 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:25 np0005548788.localdomain systemd[1]: tmp-crun.j52EfM.mount: Deactivated successfully.
Dec 06 10:20:25 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:20:25 np0005548788.localdomain podman[321145]: 2025-12-06 10:20:25.317588584 +0000 UTC m=+0.146152458 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:20:25 np0005548788.localdomain podman[321145]: 2025-12-06 10:20:25.330572457 +0000 UTC m=+0.159136331 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:20:25 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:20:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e172 do_prune osdmap full prune enabled
Dec 06 10:20:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e173 e173: 6 total, 6 up, 6 in
Dec 06 10:20:25 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e173: 6 total, 6 up, 6 in
Dec 06 10:20:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:25.708 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:26.136 2 INFO neutron.agent.securitygroups_rpc [None req-ef8cf78f-f1a9-46f9-a6c4-622f166b1f57 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:26 np0005548788.localdomain ceph-mon[293643]: pgmap v342: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 24 KiB/s wr, 93 op/s
Dec 06 10:20:26 np0005548788.localdomain ceph-mon[293643]: osdmap e173: 6 total, 6 up, 6 in
Dec 06 10:20:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:26.560 2 INFO neutron.agent.securitygroups_rpc [None req-6a0c3d03-5496-4b9d-aee6-2794cf73d3e3 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:26.778 2 INFO neutron.agent.securitygroups_rpc [None req-cb5ad30b-b885-42c3-a286-99bf227690f5 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:26.973 2 INFO neutron.agent.securitygroups_rpc [None req-415499aa-cb15-4206-8b55-b9a21ed2dc86 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:27 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:27.100 2 INFO neutron.agent.securitygroups_rpc [None req-2e89e39c-a441-4c37-8f6e-df561eb77ca2 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:27 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:27.212 2 INFO neutron.agent.securitygroups_rpc [None req-a03daea5-0289-4f98-a7ae-aa6379a3c0f5 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:27 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:27.244 262572 INFO neutron.agent.linux.ip_lib [None req-3c2fd465-c20e-4782-ae6a-de6140cf87ad - - - - - -] Device tapfe03a8ad-30 cannot be used as it has no MAC address
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e173 do_prune osdmap full prune enabled
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e174 e174: 6 total, 6 up, 6 in
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.266 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain kernel: device tapfe03a8ad-30 entered promiscuous mode
Dec 06 10:20:27 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016427.2727] manager: (tapfe03a8ad-30): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.272 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:27Z|00198|binding|INFO|Claiming lport fe03a8ad-30c4-4836-8477-37a417211623 for this chassis.
Dec 06 10:20:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:27Z|00199|binding|INFO|fe03a8ad-30c4-4836-8477-37a417211623: Claiming unknown
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e174: 6 total, 6 up, 6 in
Dec 06 10:20:27 np0005548788.localdomain systemd-udevd[321195]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.289 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:27Z|00200|binding|INFO|Setting lport fe03a8ad-30c4-4836-8477-37a417211623 ovn-installed in OVS
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.305 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.307 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.343 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.371 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "81753d92-4847-43cb-b357-c4adab052a83", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "81753d92-4847-43cb-b357-c4adab052a83", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:27 np0005548788.localdomain ceph-mon[293643]: osdmap e174: 6 total, 6 up, 6 in
Dec 06 10:20:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:27Z|00201|binding|INFO|Setting lport fe03a8ad-30c4-4836-8477-37a417211623 up in Southbound
Dec 06 10:20:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:27.459 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-03ec4140-c10f-48dc-83d7-88901bfd0c7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03ec4140-c10f-48dc-83d7-88901bfd0c7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cce49b44-3ef1-4453-9cd1-da369aa714f9, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=fe03a8ad-30c4-4836-8477-37a417211623) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:27.461 159620 INFO neutron.agent.ovn.metadata.agent [-] Port fe03a8ad-30c4-4836-8477-37a417211623 in datapath 03ec4140-c10f-48dc-83d7-88901bfd0c7c bound to our chassis
Dec 06 10:20:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:27.462 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 03ec4140-c10f-48dc-83d7-88901bfd0c7c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:27 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:27.463 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8c02de-3b02-40b4-b7b9-6e5e868b0b23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:27 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:27.750 262572 INFO neutron.agent.linux.ip_lib [None req-0df8d140-f0a0-4ed7-bd10-e659247afe5a - - - - - -] Device tapf6342200-ac cannot be used as it has no MAC address
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.789 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain kernel: device tapf6342200-ac entered promiscuous mode
Dec 06 10:20:27 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016427.7969] manager: (tapf6342200-ac): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Dec 06 10:20:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:27Z|00202|binding|INFO|Claiming lport f6342200-ac22-4690-89fa-8b20414bf7a9 for this chassis.
Dec 06 10:20:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:27Z|00203|binding|INFO|f6342200-ac22-4690-89fa-8b20414bf7a9: Claiming unknown
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.803 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:27Z|00204|binding|INFO|Setting lport f6342200-ac22-4690-89fa-8b20414bf7a9 ovn-installed in OVS
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.840 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.889 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:27.923 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:28 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:28Z|00205|binding|INFO|Setting lport f6342200-ac22-4690-89fa-8b20414bf7a9 up in Southbound
Dec 06 10:20:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:28.038 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-2215b5fe-e12f-489f-83d1-505bed6394f0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2215b5fe-e12f-489f-83d1-505bed6394f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8852fa10-d8bb-4266-88a9-e10b4845d79c, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=f6342200-ac22-4690-89fa-8b20414bf7a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:28.041 159620 INFO neutron.agent.ovn.metadata.agent [-] Port f6342200-ac22-4690-89fa-8b20414bf7a9 in datapath 2215b5fe-e12f-489f-83d1-505bed6394f0 bound to our chassis
Dec 06 10:20:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:28.042 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2215b5fe-e12f-489f-83d1-505bed6394f0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:28.043 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[04ccee0b-949e-46bc-a69d-59f558da160c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:28 np0005548788.localdomain podman[321279]: 
Dec 06 10:20:28 np0005548788.localdomain podman[321279]: 2025-12-06 10:20:28.279512791 +0000 UTC m=+0.092821876 container create 63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:20:28 np0005548788.localdomain systemd[1]: Started libpod-conmon-63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13.scope.
Dec 06 10:20:28 np0005548788.localdomain podman[321279]: 2025-12-06 10:20:28.233122465 +0000 UTC m=+0.046431630 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:28 np0005548788.localdomain systemd[1]: tmp-crun.D8lEZ1.mount: Deactivated successfully.
Dec 06 10:20:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:28.412 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:28 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:28 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60821da11a2e3b130d2fef1c7d0b0ee586b6f5fa6ab6c74e09e533dc39ac1d2b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:28 np0005548788.localdomain ceph-mon[293643]: pgmap v344: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 20 KiB/s wr, 77 op/s
Dec 06 10:20:28 np0005548788.localdomain podman[321279]: 2025-12-06 10:20:28.431320493 +0000 UTC m=+0.244629578 container init 63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:20:28 np0005548788.localdomain podman[321279]: 2025-12-06 10:20:28.445363169 +0000 UTC m=+0.258672254 container start 63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321307]: started, version 2.85 cachesize 150
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321307]: DNS service limited to local subnets
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321307]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321307]: warning: no upstream servers configured
Dec 06 10:20:28 np0005548788.localdomain dnsmasq-dhcp[321307]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321307]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/addn_hosts - 0 addresses
Dec 06 10:20:28 np0005548788.localdomain dnsmasq-dhcp[321307]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/host
Dec 06 10:20:28 np0005548788.localdomain dnsmasq-dhcp[321307]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/opts
Dec 06 10:20:28 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:28.681 262572 INFO neutron.agent.dhcp.agent [None req-a557fb3a-0956-46ef-9811-093c5c5bd020 - - - - - -] DHCP configuration for ports {'fe30635c-8c01-4655-a24c-dae0960d8559'} is completed
Dec 06 10:20:28 np0005548788.localdomain podman[321331]: 
Dec 06 10:20:28 np0005548788.localdomain podman[321331]: 2025-12-06 10:20:28.815591797 +0000 UTC m=+0.089478883 container create 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:28 np0005548788.localdomain systemd[1]: Started libpod-conmon-9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620.scope.
Dec 06 10:20:28 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:28 np0005548788.localdomain podman[321331]: 2025-12-06 10:20:28.772386028 +0000 UTC m=+0.046273124 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:28 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b53ac063d17b49bfc5e20d4b63e49236912457aab81fe526a702d21b3eb27ae7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:28 np0005548788.localdomain podman[321331]: 2025-12-06 10:20:28.882997515 +0000 UTC m=+0.156884601 container init 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:20:28 np0005548788.localdomain podman[321331]: 2025-12-06 10:20:28.892983903 +0000 UTC m=+0.166870989 container start 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321373]: started, version 2.85 cachesize 150
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321373]: DNS service limited to local subnets
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321373]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321373]: warning: no upstream servers configured
Dec 06 10:20:28 np0005548788.localdomain dnsmasq-dhcp[321373]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/addn_hosts - 0 addresses
Dec 06 10:20:28 np0005548788.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/host
Dec 06 10:20:28 np0005548788.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/opts
Dec 06 10:20:28 np0005548788.localdomain dnsmasq[321307]: exiting on receipt of SIGTERM
Dec 06 10:20:28 np0005548788.localdomain podman[321357]: 2025-12-06 10:20:28.952369853 +0000 UTC m=+0.113005611 container kill 63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:28 np0005548788.localdomain systemd[1]: libpod-63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13.scope: Deactivated successfully.
Dec 06 10:20:28 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:28.993 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:27Z, description=, device_id=5739df96-ca09-48a1-aea1-c3a4cb45fc2a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c667c850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c667c3d0>], id=7e7a6a02-dd33-46ba-9cbf-75faee1a1f05, ip_allocation=immediate, mac_address=fa:16:3e:a6:11:ed, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:24Z, description=, dns_domain=, id=2215b5fe-e12f-489f-83d1-505bed6394f0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-799031500, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64159, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2416, status=ACTIVE, subnets=['b33890cd-628f-44d8-83a7-52e88f4a9f55'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:26Z, vlan_transparent=None, network_id=2215b5fe-e12f-489f-83d1-505bed6394f0, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2436, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:27Z on network 2215b5fe-e12f-489f-83d1-505bed6394f0
Dec 06 10:20:29 np0005548788.localdomain podman[321378]: 2025-12-06 10:20:29.024746745 +0000 UTC m=+0.052598920 container died 63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:20:29 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:29.039 262572 INFO neutron.agent.dhcp.agent [None req-6e9ae9aa-ced3-4a50-972c-6f7b48f538da - - - - - -] DHCP configuration for ports {'8c07f142-e413-4bb4-b2f2-3f1142791338'} is completed
Dec 06 10:20:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:29.043 2 INFO neutron.agent.securitygroups_rpc [None req-3c4f32bd-b684-4b66-9a34-68450fbeb73b 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:29 np0005548788.localdomain podman[321378]: 2025-12-06 10:20:29.072071901 +0000 UTC m=+0.099924036 container remove 63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:20:29 np0005548788.localdomain systemd[1]: libpod-conmon-63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13.scope: Deactivated successfully.
Dec 06 10:20:29 np0005548788.localdomain dnsmasq[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/addn_hosts - 1 addresses
Dec 06 10:20:29 np0005548788.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/host
Dec 06 10:20:29 np0005548788.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/opts
Dec 06 10:20:29 np0005548788.localdomain podman[321418]: 2025-12-06 10:20:29.198806966 +0000 UTC m=+0.067792750 container kill 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:20:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:29.262 2 INFO neutron.agent.securitygroups_rpc [None req-03f78b6a-c5ab-4048-95e5-dac6933624ce 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-60821da11a2e3b130d2fef1c7d0b0ee586b6f5fa6ab6c74e09e533dc39ac1d2b-merged.mount: Deactivated successfully.
Dec 06 10:20:29 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63243b3a7069cb0d9196c204d8cf966feaa50a1aef9692e72536b9969bd1ec13-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e174 do_prune osdmap full prune enabled
Dec 06 10:20:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e175 e175: 6 total, 6 up, 6 in
Dec 06 10:20:29 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e175: 6 total, 6 up, 6 in
Dec 06 10:20:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:29.503 2 INFO neutron.agent.securitygroups_rpc [None req-bec77d6f-35e0-4121-9e3c-321d796fa6a3 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:29 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:29.548 262572 INFO neutron.agent.dhcp.agent [None req-f2133e48-1358-4d77-965e-1b9ef69f51d7 - - - - - -] DHCP configuration for ports {'7e7a6a02-dd33-46ba-9cbf-75faee1a1f05'} is completed
Dec 06 10:20:29 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:29.697 2 INFO neutron.agent.securitygroups_rpc [None req-b83ddb52-2122-4d26-8d71-acc6737aed87 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:30 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:30.486 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:27Z, description=, device_id=5739df96-ca09-48a1-aea1-c3a4cb45fc2a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6697cd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6697d60>], id=7e7a6a02-dd33-46ba-9cbf-75faee1a1f05, ip_allocation=immediate, mac_address=fa:16:3e:a6:11:ed, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:24Z, description=, dns_domain=, id=2215b5fe-e12f-489f-83d1-505bed6394f0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-799031500, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64159, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2416, status=ACTIVE, subnets=['b33890cd-628f-44d8-83a7-52e88f4a9f55'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:26Z, vlan_transparent=None, network_id=2215b5fe-e12f-489f-83d1-505bed6394f0, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2436, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:27Z on network 2215b5fe-e12f-489f-83d1-505bed6394f0
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c706f26e-f87f-4af4-a717-78c8b08cc789", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: pgmap v346: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 19 KiB/s wr, 42 op/s
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c706f26e-f87f-4af4-a717-78c8b08cc789", "format": "json"}]: dispatch
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: osdmap e175: 6 total, 6 up, 6 in
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "snap_name": "bc9ed0e0-cf16-4288-b389-6bbd7bbfed8b", "format": "json"}]: dispatch
Dec 06 10:20:30 np0005548788.localdomain podman[321490]: 2025-12-06 10:20:30.624986223 +0000 UTC m=+0.101267868 container create ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:20:30 np0005548788.localdomain systemd[1]: Started libpod-conmon-ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5.scope.
Dec 06 10:20:30 np0005548788.localdomain podman[321490]: 2025-12-06 10:20:30.574047745 +0000 UTC m=+0.050329420 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:30 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:30 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f2ba2ddf5264a3d6bedf4b8c785ef77aacbfcc4a0b720d3f1ec78b9a64d23de/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:30 np0005548788.localdomain dnsmasq[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/addn_hosts - 1 addresses
Dec 06 10:20:30 np0005548788.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/host
Dec 06 10:20:30 np0005548788.localdomain podman[321520]: 2025-12-06 10:20:30.708316344 +0000 UTC m=+0.063808637 container kill 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:20:30 np0005548788.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/opts
Dec 06 10:20:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:30.711 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:30 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:30.742 2 INFO neutron.agent.securitygroups_rpc [None req-1a7eb84d-a3b4-4a88-a0ae-062b0b90ebc4 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['ea4ca242-5187-4603-82cf-af66665b0039']
Dec 06 10:20:30 np0005548788.localdomain podman[321490]: 2025-12-06 10:20:30.75273356 +0000 UTC m=+0.229015205 container init ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:20:30 np0005548788.localdomain podman[321490]: 2025-12-06 10:20:30.765056812 +0000 UTC m=+0.241338457 container start ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:20:30 np0005548788.localdomain dnsmasq[321541]: started, version 2.85 cachesize 150
Dec 06 10:20:30 np0005548788.localdomain dnsmasq[321541]: DNS service limited to local subnets
Dec 06 10:20:30 np0005548788.localdomain dnsmasq[321541]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:30 np0005548788.localdomain dnsmasq[321541]: warning: no upstream servers configured
Dec 06 10:20:30 np0005548788.localdomain dnsmasq-dhcp[321541]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:30 np0005548788.localdomain dnsmasq-dhcp[321541]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:30 np0005548788.localdomain dnsmasq[321541]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/addn_hosts - 0 addresses
Dec 06 10:20:30 np0005548788.localdomain dnsmasq-dhcp[321541]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/host
Dec 06 10:20:30 np0005548788.localdomain dnsmasq-dhcp[321541]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/opts
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3989965424' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:20:30 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3989965424' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:31.052 262572 INFO neutron.agent.dhcp.agent [None req-2db3e8e7-0ec7-422c-b845-e8fb2d45d538 - - - - - -] DHCP configuration for ports {'fe03a8ad-30c4-4836-8477-37a417211623', 'fe30635c-8c01-4655-a24c-dae0960d8559'} is completed
Dec 06 10:20:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:31.231 262572 INFO neutron.agent.dhcp.agent [None req-bac9a0cf-1c5c-4ce8-bc42-94b4aafb95b5 - - - - - -] DHCP configuration for ports {'7e7a6a02-dd33-46ba-9cbf-75faee1a1f05'} is completed
Dec 06 10:20:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "snap_name": "802dfa0a-e026-43d4-b3ba-67c6242d0ddb", "format": "json"}]: dispatch
Dec 06 10:20:31 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3989965424' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:31 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3989965424' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e175 do_prune osdmap full prune enabled
Dec 06 10:20:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e176 e176: 6 total, 6 up, 6 in
Dec 06 10:20:32 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e176: 6 total, 6 up, 6 in
Dec 06 10:20:32 np0005548788.localdomain ceph-mon[293643]: pgmap v348: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 19 KiB/s wr, 42 op/s
Dec 06 10:20:32 np0005548788.localdomain ceph-mon[293643]: osdmap e176: 6 total, 6 up, 6 in
Dec 06 10:20:32 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:32.641 2 INFO neutron.agent.securitygroups_rpc [None req-8df6a51f-2782-49f1-a34d-739b1e2f53d1 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['0223fd9f-7d67-4f35-8221-a118caed647f']
Dec 06 10:20:32 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:32.951 2 INFO neutron.agent.securitygroups_rpc [None req-c356675a-b0a4-4bc7-b431-054879bdecb2 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['0223fd9f-7d67-4f35-8221-a118caed647f']
Dec 06 10:20:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:33.414 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:33 np0005548788.localdomain dnsmasq[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/addn_hosts - 0 addresses
Dec 06 10:20:33 np0005548788.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/host
Dec 06 10:20:33 np0005548788.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/2215b5fe-e12f-489f-83d1-505bed6394f0/opts
Dec 06 10:20:33 np0005548788.localdomain podman[321564]: 2025-12-06 10:20:33.475074475 +0000 UTC m=+0.056286155 container kill 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:20:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c706f26e-f87f-4af4-a717-78c8b08cc789", "format": "json"}]: dispatch
Dec 06 10:20:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c706f26e-f87f-4af4-a717-78c8b08cc789", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:33 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:33Z|00206|binding|INFO|Releasing lport f6342200-ac22-4690-89fa-8b20414bf7a9 from this chassis (sb_readonly=0)
Dec 06 10:20:33 np0005548788.localdomain kernel: device tapf6342200-ac left promiscuous mode
Dec 06 10:20:33 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:33Z|00207|binding|INFO|Setting lport f6342200-ac22-4690-89fa-8b20414bf7a9 down in Southbound
Dec 06 10:20:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:33.681 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:33.690 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-2215b5fe-e12f-489f-83d1-505bed6394f0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2215b5fe-e12f-489f-83d1-505bed6394f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8852fa10-d8bb-4266-88a9-e10b4845d79c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=f6342200-ac22-4690-89fa-8b20414bf7a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:33.692 159620 INFO neutron.agent.ovn.metadata.agent [-] Port f6342200-ac22-4690-89fa-8b20414bf7a9 in datapath 2215b5fe-e12f-489f-83d1-505bed6394f0 unbound from our chassis
Dec 06 10:20:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:33.693 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2215b5fe-e12f-489f-83d1-505bed6394f0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:33.694 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5556ca-28a9-48bc-8a4a-d56bf03c18c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:33.704 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:34 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "snap_name": "802dfa0a-e026-43d4-b3ba-67c6242d0ddb", "target_sub_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:20:34 np0005548788.localdomain ceph-mon[293643]: pgmap v350: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 37 KiB/s wr, 91 op/s
Dec 06 10:20:34 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:20:34 np0005548788.localdomain dnsmasq[321373]: exiting on receipt of SIGTERM
Dec 06 10:20:34 np0005548788.localdomain systemd[1]: libpod-9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620.scope: Deactivated successfully.
Dec 06 10:20:34 np0005548788.localdomain podman[321603]: 2025-12-06 10:20:34.996975495 +0000 UTC m=+0.078711669 container kill 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:35 np0005548788.localdomain podman[321617]: 2025-12-06 10:20:35.061981789 +0000 UTC m=+0.054409136 container died 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:20:35 np0005548788.localdomain systemd[1]: tmp-crun.0Ns1x5.mount: Deactivated successfully.
Dec 06 10:20:35 np0005548788.localdomain podman[321617]: 2025-12-06 10:20:35.103244437 +0000 UTC m=+0.095671744 container cleanup 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:35 np0005548788.localdomain systemd[1]: libpod-conmon-9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620.scope: Deactivated successfully.
Dec 06 10:20:35 np0005548788.localdomain podman[321624]: 2025-12-06 10:20:35.150051767 +0000 UTC m=+0.129060248 container remove 9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2215b5fe-e12f-489f-83d1-505bed6394f0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:20:35 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:35.430 262572 INFO neutron.agent.dhcp.agent [None req-30458788-33cc-43c1-bb6b-d4ac677a2d05 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:35 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:35.431 262572 INFO neutron.agent.dhcp.agent [None req-30458788-33cc-43c1-bb6b-d4ac677a2d05 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:35.499 2 INFO neutron.agent.securitygroups_rpc [None req-70f019b0-4c49-406a-b078-506915b4f443 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['2bea4444-1a2f-4249-8686-d0a5b03f529f']
Dec 06 10:20:35 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e49: np0005548790.kvkfyr(active, since 8m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:20:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:20:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:35.640 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:35.713 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:35.896 2 INFO neutron.agent.securitygroups_rpc [None req-76169eb0-4558-4a54-88c6-853dfb7935a8 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['2bea4444-1a2f-4249-8686-d0a5b03f529f']
Dec 06 10:20:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b53ac063d17b49bfc5e20d4b63e49236912457aab81fe526a702d21b3eb27ae7-merged.mount: Deactivated successfully.
Dec 06 10:20:35 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9311744d99475f6a91e43bd92c15a2a59ee96277f8a4363954c2e9f6eb58c620-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:35 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d2215b5fe\x2de12f\x2d489f\x2d83d1\x2d505bed6394f0.mount: Deactivated successfully.
Dec 06 10:20:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:36.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:36.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:20:36 np0005548788.localdomain ceph-mon[293643]: pgmap v351: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 28 KiB/s wr, 69 op/s
Dec 06 10:20:36 np0005548788.localdomain ceph-mon[293643]: mgrmap e49: np0005548790.kvkfyr(active, since 8m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:20:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e60ef5e5-d14b-4505-930a-87b6f2872763", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:36 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3962963758' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:37 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:20:37 np0005548788.localdomain podman[321646]: 2025-12-06 10:20:37.248565969 +0000 UTC m=+0.077167311 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e176 do_prune osdmap full prune enabled
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e177 e177: 6 total, 6 up, 6 in
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e177: 6 total, 6 up, 6 in
Dec 06 10:20:37 np0005548788.localdomain podman[321646]: 2025-12-06 10:20:37.287240378 +0000 UTC m=+0.115841770 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 06 10:20:37 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e60ef5e5-d14b-4505-930a-87b6f2872763", "format": "json"}]: dispatch
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "snap_name": "bc9ed0e0-cf16-4288-b389-6bbd7bbfed8b_19292787-d6d4-497b-bcb5-105ffd3d6c15", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "snap_name": "bc9ed0e0-cf16-4288-b389-6bbd7bbfed8b", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1176083142' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:37 np0005548788.localdomain ceph-mon[293643]: osdmap e177: 6 total, 6 up, 6 in
Dec 06 10:20:38 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:38.296 2 INFO neutron.agent.securitygroups_rpc [None req-10166391-fcb0-4201-a7d2-7443ab5c9b01 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:38.415 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:38 np0005548788.localdomain ceph-mon[293643]: pgmap v352: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 13 KiB/s wr, 36 op/s
Dec 06 10:20:38 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:38.766 2 INFO neutron.agent.securitygroups_rpc [None req-867b4687-c36d-47aa-8d2e-c76597d4a6cb 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:20:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:20:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:20:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:20:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:20:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:20:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:20:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:20:38 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:38.972 2 INFO neutron.agent.securitygroups_rpc [None req-80b8119e-1e57-4c4d-b95c-97abe74340b6 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:20:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2229978036' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:20:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2229978036' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:39.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:39 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:39.300 2 INFO neutron.agent.securitygroups_rpc [None req-335b324d-f81b-4cc4-b913-ab18408e9420 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:39 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:39.651 2 INFO neutron.agent.securitygroups_rpc [None req-c2f184e7-70de-480a-95d7-35dc53af97f7 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2229978036' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2229978036' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2101341474' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2101341474' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:39.851 262572 INFO neutron.agent.linux.ip_lib [None req-e64fe2f3-7d30-417d-b7a4-9fa31116163f - - - - - -] Device tap17182611-87 cannot be used as it has no MAC address
Dec 06 10:20:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:39.909 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:39 np0005548788.localdomain kernel: device tap17182611-87 entered promiscuous mode
Dec 06 10:20:39 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016439.9178] manager: (tap17182611-87): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Dec 06 10:20:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:39Z|00208|binding|INFO|Claiming lport 17182611-874f-47f1-ba54-6d62e96884d6 for this chassis.
Dec 06 10:20:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:39Z|00209|binding|INFO|17182611-874f-47f1-ba54-6d62e96884d6: Claiming unknown
Dec 06 10:20:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:39.919 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:39 np0005548788.localdomain systemd-udevd[321681]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:39.929 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-892ac187-d96f-4087-816e-2f923a5e7104', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-892ac187-d96f-4087-816e-2f923a5e7104', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a23c119-1915-4009-a210-a9e739efb139, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=17182611-874f-47f1-ba54-6d62e96884d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:39.932 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 17182611-874f-47f1-ba54-6d62e96884d6 in datapath 892ac187-d96f-4087-816e-2f923a5e7104 bound to our chassis
Dec 06 10:20:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:39.934 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 892ac187-d96f-4087-816e-2f923a5e7104 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:39.935 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[524ccc26-b603-4862-a836-cd7bf18702e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap17182611-87: No such device
Dec 06 10:20:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap17182611-87: No such device
Dec 06 10:20:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:39Z|00210|binding|INFO|Setting lport 17182611-874f-47f1-ba54-6d62e96884d6 ovn-installed in OVS
Dec 06 10:20:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:39Z|00211|binding|INFO|Setting lport 17182611-874f-47f1-ba54-6d62e96884d6 up in Southbound
Dec 06 10:20:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:39.963 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap17182611-87: No such device
Dec 06 10:20:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap17182611-87: No such device
Dec 06 10:20:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap17182611-87: No such device
Dec 06 10:20:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap17182611-87: No such device
Dec 06 10:20:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap17182611-87: No such device
Dec 06 10:20:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap17182611-87: No such device
Dec 06 10:20:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:40.005 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:40.036 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:40 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:40.456 2 INFO neutron.agent.securitygroups_rpc [None req-4752e8b4-a74a-419a-afa7-aac12ad63453 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e177 do_prune osdmap full prune enabled
Dec 06 10:20:40 np0005548788.localdomain ceph-mon[293643]: pgmap v354: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 46 KiB/s wr, 46 op/s
Dec 06 10:20:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e178 e178: 6 total, 6 up, 6 in
Dec 06 10:20:40 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e178: 6 total, 6 up, 6 in
Dec 06 10:20:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:40.715 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:41 np0005548788.localdomain podman[321752]: 
Dec 06 10:20:41 np0005548788.localdomain podman[321752]: 2025-12-06 10:20:41.026176161 +0000 UTC m=+0.108538952 container create 06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-892ac187-d96f-4087-816e-2f923a5e7104, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:20:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:20:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:20:41 np0005548788.localdomain systemd[1]: Started libpod-conmon-06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a.scope.
Dec 06 10:20:41 np0005548788.localdomain podman[321752]: 2025-12-06 10:20:40.973240792 +0000 UTC m=+0.055603653 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:41 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:41 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a895819e1af1bd4ad702b00f3fcf9318c9f1a1e735a5cab1eb14714f1179493/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:41 np0005548788.localdomain podman[321752]: 2025-12-06 10:20:41.102400523 +0000 UTC m=+0.184763314 container init 06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-892ac187-d96f-4087-816e-2f923a5e7104, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:20:41 np0005548788.localdomain podman[321752]: 2025-12-06 10:20:41.116416597 +0000 UTC m=+0.198779408 container start 06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-892ac187-d96f-4087-816e-2f923a5e7104, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:41 np0005548788.localdomain dnsmasq[321796]: started, version 2.85 cachesize 150
Dec 06 10:20:41 np0005548788.localdomain dnsmasq[321796]: DNS service limited to local subnets
Dec 06 10:20:41 np0005548788.localdomain dnsmasq[321796]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:41 np0005548788.localdomain dnsmasq[321796]: warning: no upstream servers configured
Dec 06 10:20:41 np0005548788.localdomain dnsmasq-dhcp[321796]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:41 np0005548788.localdomain dnsmasq[321796]: read /var/lib/neutron/dhcp/892ac187-d96f-4087-816e-2f923a5e7104/addn_hosts - 0 addresses
Dec 06 10:20:41 np0005548788.localdomain dnsmasq-dhcp[321796]: read /var/lib/neutron/dhcp/892ac187-d96f-4087-816e-2f923a5e7104/host
Dec 06 10:20:41 np0005548788.localdomain dnsmasq-dhcp[321796]: read /var/lib/neutron/dhcp/892ac187-d96f-4087-816e-2f923a5e7104/opts
Dec 06 10:20:41 np0005548788.localdomain podman[321767]: 2025-12-06 10:20:41.177894541 +0000 UTC m=+0.108598205 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:20:41 np0005548788.localdomain podman[321768]: 2025-12-06 10:20:41.216328911 +0000 UTC m=+0.141880685 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 10:20:41 np0005548788.localdomain podman[321768]: 2025-12-06 10:20:41.232954406 +0000 UTC m=+0.158506230 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:20:41 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:20:41 np0005548788.localdomain podman[321767]: 2025-12-06 10:20:41.288181137 +0000 UTC m=+0.218884801 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:20:41 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:20:41 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:41.310 262572 INFO neutron.agent.dhcp.agent [None req-be962179-17fd-42c4-bd9b-1d28bb614630 - - - - - -] DHCP configuration for ports {'ccdbe577-a4f0-419a-8852-a8568136741c'} is completed
Dec 06 10:20:41 np0005548788.localdomain podman[321766]: 2025-12-06 10:20:41.3802727 +0000 UTC m=+0.312043348 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:41 np0005548788.localdomain podman[321766]: 2025-12-06 10:20:41.393786369 +0000 UTC m=+0.325557017 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:41 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:20:41 np0005548788.localdomain dnsmasq[321796]: read /var/lib/neutron/dhcp/892ac187-d96f-4087-816e-2f923a5e7104/addn_hosts - 0 addresses
Dec 06 10:20:41 np0005548788.localdomain dnsmasq-dhcp[321796]: read /var/lib/neutron/dhcp/892ac187-d96f-4087-816e-2f923a5e7104/host
Dec 06 10:20:41 np0005548788.localdomain dnsmasq-dhcp[321796]: read /var/lib/neutron/dhcp/892ac187-d96f-4087-816e-2f923a5e7104/opts
Dec 06 10:20:41 np0005548788.localdomain podman[321847]: 2025-12-06 10:20:41.586380273 +0000 UTC m=+0.066514760 container kill 06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-892ac187-d96f-4087-816e-2f923a5e7104, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "format": "json"}]: dispatch
Dec 06 10:20:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e60ef5e5-d14b-4505-930a-87b6f2872763", "format": "json"}]: dispatch
Dec 06 10:20:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e60ef5e5-d14b-4505-930a-87b6f2872763", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:41 np0005548788.localdomain ceph-mon[293643]: osdmap e178: 6 total, 6 up, 6 in
Dec 06 10:20:41 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:41.915 262572 INFO neutron.agent.dhcp.agent [None req-4b3c31bf-85e4-4340-acd0-a3a14da86364 - - - - - -] DHCP configuration for ports {'17182611-874f-47f1-ba54-6d62e96884d6', 'ccdbe577-a4f0-419a-8852-a8568136741c'} is completed
Dec 06 10:20:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:42.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:42.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:42.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:42 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:20:42.096 2 INFO neutron.agent.securitygroups_rpc [None req-739af509-ab08-45e2-ba83-57dd6efc5660 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['6ae4fdb3-8bab-4aac-9ae7-1f521287092b']
Dec 06 10:20:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:42 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:42Z|00212|binding|INFO|Removing iface tap17182611-87 ovn-installed in OVS
Dec 06 10:20:42 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:42Z|00213|binding|INFO|Removing lport 17182611-874f-47f1-ba54-6d62e96884d6 ovn-installed in OVS
Dec 06 10:20:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:42.312 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:42.315 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port bb6fb282-bcc7-4153-8243-c7dbaa8a19ca with type ""
Dec 06 10:20:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:42.316 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-892ac187-d96f-4087-816e-2f923a5e7104', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-892ac187-d96f-4087-816e-2f923a5e7104', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a23c119-1915-4009-a210-a9e739efb139, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=17182611-874f-47f1-ba54-6d62e96884d6) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:42.319 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:42.320 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 17182611-874f-47f1-ba54-6d62e96884d6 in datapath 892ac187-d96f-4087-816e-2f923a5e7104 unbound from our chassis
Dec 06 10:20:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:42.323 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 892ac187-d96f-4087-816e-2f923a5e7104, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:42 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:42.324 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[b60238ce-0e49-4d1f-a2fa-cd29b1be7cbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:42 np0005548788.localdomain dnsmasq[321796]: exiting on receipt of SIGTERM
Dec 06 10:20:42 np0005548788.localdomain systemd[1]: tmp-crun.qkVT3z.mount: Deactivated successfully.
Dec 06 10:20:42 np0005548788.localdomain podman[321885]: 2025-12-06 10:20:42.492372267 +0000 UTC m=+0.073737096 container kill 06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-892ac187-d96f-4087-816e-2f923a5e7104, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:20:42 np0005548788.localdomain systemd[1]: libpod-06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a.scope: Deactivated successfully.
Dec 06 10:20:42 np0005548788.localdomain podman[321897]: 2025-12-06 10:20:42.563130018 +0000 UTC m=+0.059588306 container died 06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-892ac187-d96f-4087-816e-2f923a5e7104, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:20:42 np0005548788.localdomain podman[321897]: 2025-12-06 10:20:42.597048069 +0000 UTC m=+0.093506327 container cleanup 06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-892ac187-d96f-4087-816e-2f923a5e7104, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:42 np0005548788.localdomain systemd[1]: libpod-conmon-06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a.scope: Deactivated successfully.
Dec 06 10:20:42 np0005548788.localdomain podman[321904]: 2025-12-06 10:20:42.651564598 +0000 UTC m=+0.133538898 container remove 06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-892ac187-d96f-4087-816e-2f923a5e7104, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:20:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:42.664 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:42 np0005548788.localdomain kernel: device tap17182611-87 left promiscuous mode
Dec 06 10:20:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:42.676 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:42 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:42.710 262572 INFO neutron.agent.dhcp.agent [None req-5eb3f742-c0be-44d5-b3e3-d3cd043a4beb - - - - - -] Synchronizing state
Dec 06 10:20:42 np0005548788.localdomain ceph-mon[293643]: pgmap v356: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 1.5 KiB/s rd, 32 KiB/s wr, 9 op/s
Dec 06 10:20:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-7a895819e1af1bd4ad702b00f3fcf9318c9f1a1e735a5cab1eb14714f1179493-merged.mount: Deactivated successfully.
Dec 06 10:20:43 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-06906a0524e1a8671d0b4a15569e95a5eaed7524f911ff364faf2cd8e299169a-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:43 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d892ac187\x2dd96f\x2d4087\x2d816e\x2d2f923a5e7104.mount: Deactivated successfully.
Dec 06 10:20:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:43.090 262572 INFO neutron.agent.dhcp.agent [None req-a7a0c4b5-ca75-4b63-bb5a-33ab2906d7e7 - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:20:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:43.091 262572 INFO neutron.agent.dhcp.agent [-] Starting network 892ac187-d96f-4087-816e-2f923a5e7104 dhcp configuration
Dec 06 10:20:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:43.175 262572 INFO neutron.agent.dhcp.agent [None req-13a6aa67-6fcb-4404-a0db-936e01f26130 - - - - - -] Finished network 892ac187-d96f-4087-816e-2f923a5e7104 dhcp configuration
Dec 06 10:20:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:43.176 262572 INFO neutron.agent.dhcp.agent [None req-a7a0c4b5-ca75-4b63-bb5a-33ab2906d7e7 - - - - - -] Synchronizing state complete
Dec 06 10:20:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:43.238 262572 INFO neutron.agent.dhcp.agent [None req-3bbeac75-f173-430f-8e0d-c9990458ac76 - - - - - -] DHCP configuration for ports {'ccdbe577-a4f0-419a-8852-a8568136741c'} is completed
Dec 06 10:20:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:43.278 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:43.417 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:20:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:44 np0005548788.localdomain ceph-mon[293643]: pgmap v357: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 45 KiB/s wr, 33 op/s
Dec 06 10:20:44 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9cf2fd4c-90f2-4604-9f84-44511d67581f", "format": "json"}]: dispatch
Dec 06 10:20:44 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9cf2fd4c-90f2-4604-9f84-44511d67581f", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:44 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.007 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.007 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.022 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:20:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:45.498 262572 INFO neutron.agent.linux.ip_lib [None req-163f0c2f-a001-4e0e-bd5d-2d4ba4842e7b - - - - - -] Device tap5553c92f-02 cannot be used as it has no MAC address
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.521 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:45 np0005548788.localdomain kernel: device tap5553c92f-02 entered promiscuous mode
Dec 06 10:20:45 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016445.5301] manager: (tap5553c92f-02): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Dec 06 10:20:45 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:45Z|00214|binding|INFO|Claiming lport 5553c92f-02c8-40b5-b10d-ae7e42aa954e for this chassis.
Dec 06 10:20:45 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:45Z|00215|binding|INFO|5553c92f-02c8-40b5-b10d-ae7e42aa954e: Claiming unknown
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.531 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:45 np0005548788.localdomain systemd-udevd[321938]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:45 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:45.540 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-325306d4-953a-4d34-b750-2e912d2ef3f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-325306d4-953a-4d34-b750-2e912d2ef3f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ab0525e-2c2e-4f3c-b3d5-206779ff3c0f, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=5553c92f-02c8-40b5-b10d-ae7e42aa954e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:45 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:45.542 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 5553c92f-02c8-40b5-b10d-ae7e42aa954e in datapath 325306d4-953a-4d34-b750-2e912d2ef3f3 bound to our chassis
Dec 06 10:20:45 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:45.544 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 325306d4-953a-4d34-b750-2e912d2ef3f3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:45 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:45.545 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[de544db3-2d59-4429-ab56-7076d45a6653]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:45 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap5553c92f-02: No such device
Dec 06 10:20:45 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap5553c92f-02: No such device
Dec 06 10:20:45 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:45Z|00216|binding|INFO|Setting lport 5553c92f-02c8-40b5-b10d-ae7e42aa954e ovn-installed in OVS
Dec 06 10:20:45 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:45Z|00217|binding|INFO|Setting lport 5553c92f-02c8-40b5-b10d-ae7e42aa954e up in Southbound
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.575 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:45 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap5553c92f-02: No such device
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.578 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:45 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap5553c92f-02: No such device
Dec 06 10:20:45 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap5553c92f-02: No such device
Dec 06 10:20:45 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap5553c92f-02: No such device
Dec 06 10:20:45 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap5553c92f-02: No such device
Dec 06 10:20:45 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap5553c92f-02: No such device
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.614 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.642 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:45.717 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:46.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:46 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "046a7cd8-3e98-42c3-b5fe-95f442606a2a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:46 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "046a7cd8-3e98-42c3-b5fe-95f442606a2a", "format": "json"}]: dispatch
Dec 06 10:20:46 np0005548788.localdomain podman[322009]: 
Dec 06 10:20:46 np0005548788.localdomain podman[322009]: 2025-12-06 10:20:46.509571981 +0000 UTC m=+0.094801658 container create 04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-325306d4-953a-4d34-b750-2e912d2ef3f3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:20:46 np0005548788.localdomain systemd[1]: Started libpod-conmon-04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3.scope.
Dec 06 10:20:46 np0005548788.localdomain podman[322009]: 2025-12-06 10:20:46.464536586 +0000 UTC m=+0.049766293 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:46 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:46 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5261c16be22e12c71104be067d80eddf130879bc38acd811a5e095c2d97f87a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:46 np0005548788.localdomain podman[322009]: 2025-12-06 10:20:46.583092198 +0000 UTC m=+0.168321885 container init 04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-325306d4-953a-4d34-b750-2e912d2ef3f3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:46 np0005548788.localdomain podman[322009]: 2025-12-06 10:20:46.593735738 +0000 UTC m=+0.178965425 container start 04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-325306d4-953a-4d34-b750-2e912d2ef3f3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:46 np0005548788.localdomain dnsmasq[322028]: started, version 2.85 cachesize 150
Dec 06 10:20:46 np0005548788.localdomain dnsmasq[322028]: DNS service limited to local subnets
Dec 06 10:20:46 np0005548788.localdomain dnsmasq[322028]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:46 np0005548788.localdomain dnsmasq[322028]: warning: no upstream servers configured
Dec 06 10:20:46 np0005548788.localdomain dnsmasq-dhcp[322028]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:46 np0005548788.localdomain dnsmasq[322028]: read /var/lib/neutron/dhcp/325306d4-953a-4d34-b750-2e912d2ef3f3/addn_hosts - 0 addresses
Dec 06 10:20:46 np0005548788.localdomain dnsmasq-dhcp[322028]: read /var/lib/neutron/dhcp/325306d4-953a-4d34-b750-2e912d2ef3f3/host
Dec 06 10:20:46 np0005548788.localdomain dnsmasq-dhcp[322028]: read /var/lib/neutron/dhcp/325306d4-953a-4d34-b750-2e912d2ef3f3/opts
Dec 06 10:20:46 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:46Z|00218|binding|INFO|Removing iface tap5553c92f-02 ovn-installed in OVS
Dec 06 10:20:46 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:46Z|00219|binding|INFO|Removing lport 5553c92f-02c8-40b5-b10d-ae7e42aa954e ovn-installed in OVS
Dec 06 10:20:46 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:46.748 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 42f41d35-24a5-494c-9bad-6b1ade64b96d with type ""
Dec 06 10:20:46 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:46.751 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-325306d4-953a-4d34-b750-2e912d2ef3f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-325306d4-953a-4d34-b750-2e912d2ef3f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ab0525e-2c2e-4f3c-b3d5-206779ff3c0f, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=5553c92f-02c8-40b5-b10d-ae7e42aa954e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:46.751 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:46 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:46.754 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 5553c92f-02c8-40b5-b10d-ae7e42aa954e in datapath 325306d4-953a-4d34-b750-2e912d2ef3f3 unbound from our chassis
Dec 06 10:20:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:46.756 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:46 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:46.757 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 325306d4-953a-4d34-b750-2e912d2ef3f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:46 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:46.758 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[4570ee87-7119-49ee-99e7-c56845db0fbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:46 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:46.791 262572 INFO neutron.agent.dhcp.agent [None req-4e7548cc-b43c-49d4-b76e-b5b18452938a - - - - - -] DHCP configuration for ports {'e107abeb-3321-4662-9061-547d8b8290a8'} is completed
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.016 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.017 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:47 np0005548788.localdomain dnsmasq[322028]: read /var/lib/neutron/dhcp/325306d4-953a-4d34-b750-2e912d2ef3f3/addn_hosts - 0 addresses
Dec 06 10:20:47 np0005548788.localdomain dnsmasq-dhcp[322028]: read /var/lib/neutron/dhcp/325306d4-953a-4d34-b750-2e912d2ef3f3/host
Dec 06 10:20:47 np0005548788.localdomain dnsmasq-dhcp[322028]: read /var/lib/neutron/dhcp/325306d4-953a-4d34-b750-2e912d2ef3f3/opts
Dec 06 10:20:47 np0005548788.localdomain podman[322047]: 2025-12-06 10:20:47.025705608 +0000 UTC m=+0.067264064 container kill 04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-325306d4-953a-4d34-b750-2e912d2ef3f3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.039 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.040 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.040 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.041 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.042 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: pgmap v358: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 45 KiB/s wr, 33 op/s
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4269218752' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2201144363' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.256 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.265 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548788.localdomain kernel: device tap5553c92f-02 left promiscuous mode
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e178 do_prune osdmap full prune enabled
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.281 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e179 e179: 6 total, 6 up, 6 in
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e179: 6 total, 6 up, 6 in
Dec 06 10:20:47 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:47.307 262572 INFO neutron.agent.dhcp.agent [None req-3146900a-bf3e-4ebd-98da-eaf97cde9f42 - - - - - -] DHCP configuration for ports {'e107abeb-3321-4662-9061-547d8b8290a8'} is completed
Dec 06 10:20:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:47.443 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:47.443 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:47.444 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:47 np0005548788.localdomain dnsmasq[322028]: exiting on receipt of SIGTERM
Dec 06 10:20:47 np0005548788.localdomain podman[322108]: 2025-12-06 10:20:47.462633832 +0000 UTC m=+0.062067223 container kill 04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-325306d4-953a-4d34-b750-2e912d2ef3f3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:47 np0005548788.localdomain systemd[1]: libpod-04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3.scope: Deactivated successfully.
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:20:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2851515333' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.530 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:20:47 np0005548788.localdomain podman[322120]: 2025-12-06 10:20:47.548930946 +0000 UTC m=+0.070441603 container died 04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-325306d4-953a-4d34-b750-2e912d2ef3f3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:20:47 np0005548788.localdomain systemd[1]: tmp-crun.lgWfiF.mount: Deactivated successfully.
Dec 06 10:20:47 np0005548788.localdomain podman[322120]: 2025-12-06 10:20:47.658328464 +0000 UTC m=+0.179839111 container cleanup 04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-325306d4-953a-4d34-b750-2e912d2ef3f3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:47 np0005548788.localdomain systemd[1]: libpod-conmon-04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3.scope: Deactivated successfully.
Dec 06 10:20:47 np0005548788.localdomain podman[322122]: 2025-12-06 10:20:47.680380647 +0000 UTC m=+0.191666088 container remove 04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-325306d4-953a-4d34-b750-2e912d2ef3f3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:20:47 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:47.735 262572 INFO neutron.agent.dhcp.agent [None req-21f2b4ec-075d-4b41-a50f-98954d9b8e02 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.760 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.762 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11433MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.763 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.763 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.839 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.840 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:20:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:47.859 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:20:48 np0005548788.localdomain ceph-mon[293643]: pgmap v359: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 37 KiB/s wr, 27 op/s
Dec 06 10:20:48 np0005548788.localdomain ceph-mon[293643]: osdmap e179: 6 total, 6 up, 6 in
Dec 06 10:20:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2851515333' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:20:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2477604507' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:48.358 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:20:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:48.365 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:20:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:48.385 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:20:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:48.388 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:20:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:48.389 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:48.459 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5261c16be22e12c71104be067d80eddf130879bc38acd811a5e095c2d97f87a0-merged.mount: Deactivated successfully.
Dec 06 10:20:48 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-04e0ed58d789ef840e66522ae53790b1f877c8557ddc12631a94f465a86d5dd3-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:48 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d325306d4\x2d953a\x2d4d34\x2db750\x2d2e912d2ef3f3.mount: Deactivated successfully.
Dec 06 10:20:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2477604507' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3778715811' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3778715811' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:49 np0005548788.localdomain dnsmasq[321541]: exiting on receipt of SIGTERM
Dec 06 10:20:49 np0005548788.localdomain podman[322188]: 2025-12-06 10:20:49.368148656 +0000 UTC m=+0.063961533 container kill ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:49 np0005548788.localdomain systemd[1]: libpod-ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5.scope: Deactivated successfully.
Dec 06 10:20:49 np0005548788.localdomain podman[322200]: 2025-12-06 10:20:49.446111491 +0000 UTC m=+0.065794619 container died ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:49 np0005548788.localdomain podman[322200]: 2025-12-06 10:20:49.481124476 +0000 UTC m=+0.100807574 container cleanup ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:49 np0005548788.localdomain systemd[1]: libpod-conmon-ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5.scope: Deactivated successfully.
Dec 06 10:20:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-7f2ba2ddf5264a3d6bedf4b8c785ef77aacbfcc4a0b720d3f1ec78b9a64d23de-merged.mount: Deactivated successfully.
Dec 06 10:20:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:49 np0005548788.localdomain podman[322207]: 2025-12-06 10:20:49.530116133 +0000 UTC m=+0.134731595 container remove ff633615c9923281ed2f8cfa09d686e19a4cac5054cfc461b5d4951cfec6b3b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:20:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:20:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:20:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:20:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:20:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19206 "" "Go-http-client/1.1"
Dec 06 10:20:50 np0005548788.localdomain podman[322278]: 
Dec 06 10:20:50 np0005548788.localdomain podman[322278]: 2025-12-06 10:20:50.314368105 +0000 UTC m=+0.087664716 container create 9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "046a7cd8-3e98-42c3-b5fe-95f442606a2a", "format": "json"}]: dispatch
Dec 06 10:20:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "046a7cd8-3e98-42c3-b5fe-95f442606a2a", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:50 np0005548788.localdomain ceph-mon[293643]: pgmap v361: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 26 KiB/s wr, 31 op/s
Dec 06 10:20:50 np0005548788.localdomain systemd[1]: Started libpod-conmon-9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b.scope.
Dec 06 10:20:50 np0005548788.localdomain podman[322278]: 2025-12-06 10:20:50.27353776 +0000 UTC m=+0.046834381 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:50 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:50 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5657c3b9b30e3fd7edfa91f27aee623c76f6575ad4999e4085828968c047f16d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:50 np0005548788.localdomain podman[322278]: 2025-12-06 10:20:50.394976052 +0000 UTC m=+0.168272663 container init 9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:20:50 np0005548788.localdomain podman[322278]: 2025-12-06 10:20:50.407263803 +0000 UTC m=+0.180560404 container start 9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:50 np0005548788.localdomain dnsmasq[322296]: started, version 2.85 cachesize 150
Dec 06 10:20:50 np0005548788.localdomain dnsmasq[322296]: DNS service limited to local subnets
Dec 06 10:20:50 np0005548788.localdomain dnsmasq[322296]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:50 np0005548788.localdomain dnsmasq[322296]: warning: no upstream servers configured
Dec 06 10:20:50 np0005548788.localdomain dnsmasq-dhcp[322296]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:50 np0005548788.localdomain dnsmasq[322296]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/addn_hosts - 0 addresses
Dec 06 10:20:50 np0005548788.localdomain dnsmasq-dhcp[322296]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/host
Dec 06 10:20:50 np0005548788.localdomain dnsmasq-dhcp[322296]: read /var/lib/neutron/dhcp/03ec4140-c10f-48dc-83d7-88901bfd0c7c/opts
Dec 06 10:20:50 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:50.602 262572 INFO neutron.agent.dhcp.agent [None req-1e118946-3e96-4d6e-aeda-ab6cecafd59a - - - - - -] DHCP configuration for ports {'fe03a8ad-30c4-4836-8477-37a417211623', 'fe30635c-8c01-4655-a24c-dae0960d8559'} is completed
Dec 06 10:20:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:50Z|00220|binding|INFO|Removing iface tapfe03a8ad-30 ovn-installed in OVS
Dec 06 10:20:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:50.672 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 570c2b74-091d-4ddd-a3ec-d9df981a3165 with type ""
Dec 06 10:20:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:50Z|00221|binding|INFO|Removing lport fe03a8ad-30c4-4836-8477-37a417211623 ovn-installed in OVS
Dec 06 10:20:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:50.674 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-03ec4140-c10f-48dc-83d7-88901bfd0c7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03ec4140-c10f-48dc-83d7-88901bfd0c7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cce49b44-3ef1-4453-9cd1-da369aa714f9, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=fe03a8ad-30c4-4836-8477-37a417211623) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:50.676 159620 INFO neutron.agent.ovn.metadata.agent [-] Port fe03a8ad-30c4-4836-8477-37a417211623 in datapath 03ec4140-c10f-48dc-83d7-88901bfd0c7c unbound from our chassis
Dec 06 10:20:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:50.679 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 03ec4140-c10f-48dc-83d7-88901bfd0c7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:50.680 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[52fc4131-e7b0-43d5-9151-51c0db4bdcb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:50.714 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:50.720 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:50 np0005548788.localdomain dnsmasq[322296]: exiting on receipt of SIGTERM
Dec 06 10:20:50 np0005548788.localdomain podman[322313]: 2025-12-06 10:20:50.74034506 +0000 UTC m=+0.090995149 container kill 9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:50 np0005548788.localdomain systemd[1]: libpod-9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b.scope: Deactivated successfully.
Dec 06 10:20:50 np0005548788.localdomain podman[322325]: 2025-12-06 10:20:50.811574206 +0000 UTC m=+0.054781007 container died 9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:20:50 np0005548788.localdomain systemd[1]: tmp-crun.GP0Mu8.mount: Deactivated successfully.
Dec 06 10:20:50 np0005548788.localdomain podman[322325]: 2025-12-06 10:20:50.853526915 +0000 UTC m=+0.096733666 container cleanup 9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:50 np0005548788.localdomain systemd[1]: libpod-conmon-9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b.scope: Deactivated successfully.
Dec 06 10:20:50 np0005548788.localdomain podman[322327]: 2025-12-06 10:20:50.897821758 +0000 UTC m=+0.132723572 container remove 9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03ec4140-c10f-48dc-83d7-88901bfd0c7c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:50.910 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:50 np0005548788.localdomain kernel: device tapfe03a8ad-30 left promiscuous mode
Dec 06 10:20:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:50.927 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:50 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:50.950 262572 INFO neutron.agent.dhcp.agent [None req-df135bf5-f049-4e97-9dd4-487a126037f6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:50 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:50.972 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:51.374 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-5657c3b9b30e3fd7edfa91f27aee623c76f6575ad4999e4085828968c047f16d-merged.mount: Deactivated successfully.
Dec 06 10:20:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9edac76def002729d7d71e77f1a7f0bfbbab5e0f14b725460c2d8b2b22a8f40b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:51 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d03ec4140\x2dc10f\x2d48dc\x2d83d7\x2d88901bfd0c7c.mount: Deactivated successfully.
Dec 06 10:20:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:52 np0005548788.localdomain ceph-mon[293643]: pgmap v362: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 22 KiB/s wr, 26 op/s
Dec 06 10:20:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:20:53 np0005548788.localdomain podman[322357]: 2025-12-06 10:20:53.268534122 +0000 UTC m=+0.086859472 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:20:53 np0005548788.localdomain podman[322357]: 2025-12-06 10:20:53.284707462 +0000 UTC m=+0.103032862 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:53 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:20:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:53.463 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:54 np0005548788.localdomain ceph-mon[293643]: pgmap v363: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 17 KiB/s wr, 21 op/s
Dec 06 10:20:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:54.663 262572 INFO neutron.agent.linux.ip_lib [None req-7b296587-fcf0-4bc7-8c23-410d8c4032a7 - - - - - -] Device tap45dc4901-51 cannot be used as it has no MAC address
Dec 06 10:20:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:54.719 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:54 np0005548788.localdomain kernel: device tap45dc4901-51 entered promiscuous mode
Dec 06 10:20:54 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016454.7272] manager: (tap45dc4901-51): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Dec 06 10:20:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:54.728 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:54 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:54Z|00222|binding|INFO|Claiming lport 45dc4901-517b-4007-9cc5-db6dbc7ac462 for this chassis.
Dec 06 10:20:54 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:54Z|00223|binding|INFO|45dc4901-517b-4007-9cc5-db6dbc7ac462: Claiming unknown
Dec 06 10:20:54 np0005548788.localdomain systemd-udevd[322386]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:54.743 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-27d28c7b-7944-44ba-a5d6-3cac3a50371c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27d28c7b-7944-44ba-a5d6-3cac3a50371c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa91d75c-7777-4f6d-a83e-cc2d797d32fd, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=45dc4901-517b-4007-9cc5-db6dbc7ac462) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:54.744 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 45dc4901-517b-4007-9cc5-db6dbc7ac462 in datapath 27d28c7b-7944-44ba-a5d6-3cac3a50371c bound to our chassis
Dec 06 10:20:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:54.747 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 13d19acb-a8ef-48b3-b453-c91ff3423dd9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:20:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:54.747 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 27d28c7b-7944-44ba-a5d6-3cac3a50371c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:54 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:54.748 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[71d4b922-35e2-4a0c-a533-db450317fb76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:54 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap45dc4901-51: No such device
Dec 06 10:20:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:54.757 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:54 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap45dc4901-51: No such device
Dec 06 10:20:54 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:54Z|00224|binding|INFO|Setting lport 45dc4901-517b-4007-9cc5-db6dbc7ac462 ovn-installed in OVS
Dec 06 10:20:54 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:54Z|00225|binding|INFO|Setting lport 45dc4901-517b-4007-9cc5-db6dbc7ac462 up in Southbound
Dec 06 10:20:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:54.761 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:54.763 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:54 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap45dc4901-51: No such device
Dec 06 10:20:54 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap45dc4901-51: No such device
Dec 06 10:20:54 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap45dc4901-51: No such device
Dec 06 10:20:54 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap45dc4901-51: No such device
Dec 06 10:20:54 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap45dc4901-51: No such device
Dec 06 10:20:54 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap45dc4901-51: No such device
Dec 06 10:20:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:54.800 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:54.832 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:55 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d4fe01f5-ec52-46f8-9434-cea0a18dbdb7", "format": "json"}]: dispatch
Dec 06 10:20:55 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d4fe01f5-ec52-46f8-9434-cea0a18dbdb7", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:55 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:55Z|00226|binding|INFO|Removing iface tap45dc4901-51 ovn-installed in OVS
Dec 06 10:20:55 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:55Z|00227|binding|INFO|Removing lport 45dc4901-517b-4007-9cc5-db6dbc7ac462 ovn-installed in OVS
Dec 06 10:20:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:55.496 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 13d19acb-a8ef-48b3-b453-c91ff3423dd9 with type ""
Dec 06 10:20:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:55.496 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:55.498 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-27d28c7b-7944-44ba-a5d6-3cac3a50371c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27d28c7b-7944-44ba-a5d6-3cac3a50371c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa91d75c-7777-4f6d-a83e-cc2d797d32fd, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=45dc4901-517b-4007-9cc5-db6dbc7ac462) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:55.501 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 45dc4901-517b-4007-9cc5-db6dbc7ac462 in datapath 27d28c7b-7944-44ba-a5d6-3cac3a50371c unbound from our chassis
Dec 06 10:20:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:55.503 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:55.504 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 27d28c7b-7944-44ba-a5d6-3cac3a50371c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:55.505 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[80008c24-a85a-428a-978b-bab5f04bead4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:55 np0005548788.localdomain podman[322457]: 
Dec 06 10:20:55 np0005548788.localdomain podman[322457]: 2025-12-06 10:20:55.71594549 +0000 UTC m=+0.091175114 container create 1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-27d28c7b-7944-44ba-a5d6-3cac3a50371c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:55.724 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:20:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:20:55 np0005548788.localdomain systemd[1]: Started libpod-conmon-1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180.scope.
Dec 06 10:20:55 np0005548788.localdomain systemd[1]: tmp-crun.y0FIlZ.mount: Deactivated successfully.
Dec 06 10:20:55 np0005548788.localdomain podman[322457]: 2025-12-06 10:20:55.673855057 +0000 UTC m=+0.049084721 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:55 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:55 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/075cbc7b5c6e32c026f3684e5642549ec2ba8645dfa57442f5768fb0ef91db8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:55 np0005548788.localdomain podman[322472]: 2025-12-06 10:20:55.83732704 +0000 UTC m=+0.080985799 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:55 np0005548788.localdomain podman[322472]: 2025-12-06 10:20:55.847552877 +0000 UTC m=+0.091211666 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:55 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:20:55 np0005548788.localdomain podman[322457]: 2025-12-06 10:20:55.860958122 +0000 UTC m=+0.236187756 container init 1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-27d28c7b-7944-44ba-a5d6-3cac3a50371c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:55 np0005548788.localdomain podman[322457]: 2025-12-06 10:20:55.870850028 +0000 UTC m=+0.246079662 container start 1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-27d28c7b-7944-44ba-a5d6-3cac3a50371c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:20:55 np0005548788.localdomain dnsmasq[322503]: started, version 2.85 cachesize 150
Dec 06 10:20:55 np0005548788.localdomain dnsmasq[322503]: DNS service limited to local subnets
Dec 06 10:20:55 np0005548788.localdomain dnsmasq[322503]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:55 np0005548788.localdomain dnsmasq[322503]: warning: no upstream servers configured
Dec 06 10:20:55 np0005548788.localdomain dnsmasq-dhcp[322503]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:55 np0005548788.localdomain dnsmasq[322503]: read /var/lib/neutron/dhcp/27d28c7b-7944-44ba-a5d6-3cac3a50371c/addn_hosts - 0 addresses
Dec 06 10:20:55 np0005548788.localdomain dnsmasq-dhcp[322503]: read /var/lib/neutron/dhcp/27d28c7b-7944-44ba-a5d6-3cac3a50371c/host
Dec 06 10:20:55 np0005548788.localdomain dnsmasq-dhcp[322503]: read /var/lib/neutron/dhcp/27d28c7b-7944-44ba-a5d6-3cac3a50371c/opts
Dec 06 10:20:55 np0005548788.localdomain podman[322470]: 2025-12-06 10:20:55.947536084 +0000 UTC m=+0.195436635 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:20:55 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:55.986 262572 INFO neutron.agent.dhcp.agent [None req-13c2f075-e12a-463d-849e-42067af1a55a - - - - - -] DHCP configuration for ports {'2e2d9749-04c4-4743-b109-320eda072a90'} is completed
Dec 06 10:20:56 np0005548788.localdomain podman[322470]: 2025-12-06 10:20:56.038322616 +0000 UTC m=+0.286223237 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain podman[322531]: 2025-12-06 10:20:56.082866706 +0000 UTC m=+0.065140769 container kill 96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-60e0b038-f342-417c-a752-f0ee5b99d802, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:20:56 np0005548788.localdomain dnsmasq[319320]: exiting on receipt of SIGTERM
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: libpod-96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44.scope: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:56Z|00228|binding|INFO|Removing iface tap8c1e5a28-dc ovn-installed in OVS
Dec 06 10:20:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:56.146 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:56 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:56.148 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3812f1ed-b146-45c0-b834-e866d0b08cc6 with type ""
Dec 06 10:20:56 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:20:56Z|00229|binding|INFO|Removing lport 8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b ovn-installed in OVS
Dec 06 10:20:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:56.152 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:56 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:56.153 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-60e0b038-f342-417c-a752-f0ee5b99d802', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-60e0b038-f342-417c-a752-f0ee5b99d802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a37023c5-d1b8-4cda-9b3d-bacf98434407, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:56 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:56.157 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 8c1e5a28-dcc0-4d95-abe4-b6fa3d16831b in datapath 60e0b038-f342-417c-a752-f0ee5b99d802 unbound from our chassis
Dec 06 10:20:56 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:56.158 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 60e0b038-f342-417c-a752-f0ee5b99d802 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:56 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:20:56.159 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[927a60e3-85bc-41ad-bd4f-6da03897481e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:56 np0005548788.localdomain podman[322558]: 2025-12-06 10:20:56.190388917 +0000 UTC m=+0.077005627 container died 96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-60e0b038-f342-417c-a752-f0ee5b99d802, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:56 np0005548788.localdomain podman[322558]: 2025-12-06 10:20:56.228329332 +0000 UTC m=+0.114946052 container remove 96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-60e0b038-f342-417c-a752-f0ee5b99d802, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:56 np0005548788.localdomain dnsmasq[322503]: read /var/lib/neutron/dhcp/27d28c7b-7944-44ba-a5d6-3cac3a50371c/addn_hosts - 0 addresses
Dec 06 10:20:56 np0005548788.localdomain dnsmasq-dhcp[322503]: read /var/lib/neutron/dhcp/27d28c7b-7944-44ba-a5d6-3cac3a50371c/host
Dec 06 10:20:56 np0005548788.localdomain dnsmasq-dhcp[322503]: read /var/lib/neutron/dhcp/27d28c7b-7944-44ba-a5d6-3cac3a50371c/opts
Dec 06 10:20:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:56.244 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:56 np0005548788.localdomain podman[322581]: 2025-12-06 10:20:56.244655258 +0000 UTC m=+0.071348252 container kill 1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-27d28c7b-7944-44ba-a5d6-3cac3a50371c, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:56 np0005548788.localdomain kernel: device tap8c1e5a28-dc left promiscuous mode
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: libpod-conmon-96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44.scope: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:56.257 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:56 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:56.274 262572 INFO neutron.agent.dhcp.agent [None req-e6a33ad8-103e-4b48-9bbd-d5226abddd92 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:56 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:56.276 262572 INFO neutron.agent.dhcp.agent [None req-e6a33ad8-103e-4b48-9bbd-d5226abddd92 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:56 np0005548788.localdomain ceph-mon[293643]: pgmap v364: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 17 KiB/s wr, 21 op/s
Dec 06 10:20:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:56.482 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:56 np0005548788.localdomain dnsmasq[322503]: exiting on receipt of SIGTERM
Dec 06 10:20:56 np0005548788.localdomain podman[322625]: 2025-12-06 10:20:56.675735731 +0000 UTC m=+0.065315855 container kill 1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-27d28c7b-7944-44ba-a5d6-3cac3a50371c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: libpod-1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180.scope: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-01ecd3a90f083b79c53d27c577db10e1616a356bb4357024bda8f9ac0c5b7cc3-merged.mount: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96aad18f341504175c4ffe8ad9733236aceb4563d950fc3233ced8c05cfb6d44-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d60e0b038\x2df342\x2d417c\x2da752\x2df0ee5b99d802.mount: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain podman[322638]: 2025-12-06 10:20:56.759892997 +0000 UTC m=+0.065314404 container died 1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-27d28c7b-7944-44ba-a5d6-3cac3a50371c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain podman[322638]: 2025-12-06 10:20:56.794167749 +0000 UTC m=+0.099589126 container cleanup 1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-27d28c7b-7944-44ba-a5d6-3cac3a50371c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:20:56 np0005548788.localdomain systemd[1]: libpod-conmon-1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180.scope: Deactivated successfully.
Dec 06 10:20:56 np0005548788.localdomain podman[322639]: 2025-12-06 10:20:56.845799018 +0000 UTC m=+0.145642523 container remove 1d568ca0b3a1e0731e69192949d47470e5f90ccdb0bd8e0a08fdc8258c3e0180 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-27d28c7b-7944-44ba-a5d6-3cac3a50371c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:20:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:56.898 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:56 np0005548788.localdomain kernel: device tap45dc4901-51 left promiscuous mode
Dec 06 10:20:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:56.917 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:56 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:56.941 262572 INFO neutron.agent.dhcp.agent [None req-5ac70108-f0e8-4df1-9fb4-e5ca8222091a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:56 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:20:56.942 262572 INFO neutron.agent.dhcp.agent [None req-5ac70108-f0e8-4df1-9fb4-e5ca8222091a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:57 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-075cbc7b5c6e32c026f3684e5642549ec2ba8645dfa57442f5768fb0ef91db8d-merged.mount: Deactivated successfully.
Dec 06 10:20:57 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d27d28c7b\x2d7944\x2d44ba\x2da5d6\x2d3cac3a50371c.mount: Deactivated successfully.
Dec 06 10:20:58 np0005548788.localdomain ceph-mon[293643]: pgmap v365: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 17 KiB/s wr, 21 op/s
Dec 06 10:20:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:20:58.497 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4153025700' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4153025700' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:20:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/11180419' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:20:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/11180419' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:00 np0005548788.localdomain sudo[322666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:21:00 np0005548788.localdomain sudo[322666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:21:00 np0005548788.localdomain sudo[322666]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:00 np0005548788.localdomain sudo[322684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:21:00 np0005548788.localdomain sudo[322684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:21:00 np0005548788.localdomain ceph-mon[293643]: pgmap v366: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 21 KiB/s wr, 20 op/s
Dec 06 10:21:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/11180419' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/11180419' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:00.726 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:00 np0005548788.localdomain sudo[322684]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:21:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:21:01 np0005548788.localdomain sudo[322734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:21:01 np0005548788.localdomain sudo[322734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:21:01 np0005548788.localdomain sudo[322734]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:21:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:21:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:21:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:21:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:21:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:21:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:02 np0005548788.localdomain ceph-mon[293643]: pgmap v367: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 8.7 KiB/s rd, 11 KiB/s wr, 14 op/s
Dec 06 10:21:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:21:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:03.500 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:04 np0005548788.localdomain ceph-mon[293643]: pgmap v368: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 31 op/s
Dec 06 10:21:04 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:04.618 2 INFO neutron.agent.securitygroups_rpc [None req-a8006eb6-87bf-4e51-be14-a8e4ed75c69e a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:05.756 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:06 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:06.330 2 INFO neutron.agent.securitygroups_rpc [None req-58c814b2-70d6-487c-b572-f1a393accc35 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:06 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:06.564 2 INFO neutron.agent.securitygroups_rpc [None req-58c814b2-70d6-487c-b572-f1a393accc35 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:06 np0005548788.localdomain ceph-mon[293643]: pgmap v369: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 7.7 KiB/s wr, 19 op/s
Dec 06 10:21:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:21:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:07.529 2 INFO neutron.agent.securitygroups_rpc [None req-beefe55f-e6d8-4aae-b3a4-9c077707e8ab a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:08 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:21:08 np0005548788.localdomain podman[322752]: 2025-12-06 10:21:08.272721639 +0000 UTC m=+0.097135149 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:21:08 np0005548788.localdomain ceph-mon[293643]: pgmap v370: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 7.7 KiB/s wr, 19 op/s
Dec 06 10:21:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2520375608' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2520375608' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3418725974' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3418725974' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548788.localdomain podman[322752]: 2025-12-06 10:21:08.360687164 +0000 UTC m=+0.185100364 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:08 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:21:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:08.502 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:08 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:08.503 2 INFO neutron.agent.securitygroups_rpc [None req-4469b72b-a2a8-46c0-b170-fb645c70fec6 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:08 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:08.533 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:21:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:21:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:21:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:21:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:21:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:21:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:21:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "format": "json"}]: dispatch
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/126387762' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/126387762' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3307122424' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3307122424' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1826059961' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:09 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1826059961' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:10 np0005548788.localdomain ceph-mon[293643]: pgmap v371: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 9.9 KiB/s wr, 62 op/s
Dec 06 10:21:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1826059961' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1826059961' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:10.759 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:12 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:12.056 2 INFO neutron.agent.securitygroups_rpc [None req-a8044dd6-257e-4d6a-a5a6-1617984725a4 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:21:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:21:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:21:12 np0005548788.localdomain podman[322777]: 2025-12-06 10:21:12.254634861 +0000 UTC m=+0.084226560 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:12 np0005548788.localdomain podman[322777]: 2025-12-06 10:21:12.263582048 +0000 UTC m=+0.093173727 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 06 10:21:12 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:21:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:12 np0005548788.localdomain systemd[1]: tmp-crun.bIdLS1.mount: Deactivated successfully.
Dec 06 10:21:12 np0005548788.localdomain podman[322779]: 2025-12-06 10:21:12.313036 +0000 UTC m=+0.133571489 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc.)
Dec 06 10:21:12 np0005548788.localdomain podman[322778]: 2025-12-06 10:21:12.363307517 +0000 UTC m=+0.188232252 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:21:12 np0005548788.localdomain ceph-mon[293643]: pgmap v372: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 3.7 KiB/s wr, 60 op/s
Dec 06 10:21:12 np0005548788.localdomain podman[322778]: 2025-12-06 10:21:12.370952103 +0000 UTC m=+0.195876898 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:21:12 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:21:12 np0005548788.localdomain podman[322779]: 2025-12-06 10:21:12.458752193 +0000 UTC m=+0.279287672 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-type=git, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Dec 06 10:21:12 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:21:12 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:12.764 2 INFO neutron.agent.securitygroups_rpc [None req-ce6b28f9-e93f-45d2-8a9b-cc88bd3abac1 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:13.082 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:13.083 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:13.084 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:21:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "snap_name": "2eb048c9-c193-4307-94d1-af1469cf7b46", "format": "json"}]: dispatch
Dec 06 10:21:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:13.504 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:14 np0005548788.localdomain ceph-mon[293643]: pgmap v373: 177 pgs: 177 active+clean; 193 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 72 KiB/s rd, 8.5 KiB/s wr, 101 op/s
Dec 06 10:21:14 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:14.549 2 INFO neutron.agent.securitygroups_rpc [None req-ff4891e2-327c-4b51-b5fc-5a809ca2f304 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['903bc67e-3e9d-4b17-b669-c51b1cfd9fe6']
Dec 06 10:21:14 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:14.890 2 INFO neutron.agent.securitygroups_rpc [None req-4caeb4e0-6839-4739-a438-dff319ba5ebb 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['903bc67e-3e9d-4b17-b669-c51b1cfd9fe6']
Dec 06 10:21:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:15.784 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:15 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:15.816 2 INFO neutron.agent.securitygroups_rpc [None req-90100f1d-f80a-4a04-b568-2870d38561f7 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:16 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:16.185 262572 INFO neutron.agent.linux.ip_lib [None req-12842079-daa4-4ce2-8073-330f05166712 - - - - - -] Device tap54b7d2e8-83 cannot be used as it has no MAC address
Dec 06 10:21:16 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:16.204 2 INFO neutron.agent.securitygroups_rpc [None req-1962285c-ec71-4abe-922e-2812550a1f59 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:16.212 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:16 np0005548788.localdomain kernel: device tap54b7d2e8-83 entered promiscuous mode
Dec 06 10:21:16 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016476.2228] manager: (tap54b7d2e8-83): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Dec 06 10:21:16 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:16Z|00230|binding|INFO|Claiming lport 54b7d2e8-831a-416a-a8c9-0bf759002915 for this chassis.
Dec 06 10:21:16 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:16Z|00231|binding|INFO|54b7d2e8-831a-416a-a8c9-0bf759002915: Claiming unknown
Dec 06 10:21:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:16.222 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:16 np0005548788.localdomain systemd-udevd[322846]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:21:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:16.235 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-b7a477f5-928c-45d9-a629-53d1a6d2a99c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7a477f5-928c-45d9-a629-53d1a6d2a99c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6abc5e9-ed08-4ca2-bfe3-5575aee152c7, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=54b7d2e8-831a-416a-a8c9-0bf759002915) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:16.236 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 54b7d2e8-831a-416a-a8c9-0bf759002915 in datapath b7a477f5-928c-45d9-a629-53d1a6d2a99c bound to our chassis
Dec 06 10:21:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:16.238 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b7a477f5-928c-45d9-a629-53d1a6d2a99c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:16.239 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[e99243a3-d40e-4c4b-be40-4b71d9e20b21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:16 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap54b7d2e8-83: No such device
Dec 06 10:21:16 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap54b7d2e8-83: No such device
Dec 06 10:21:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:16.262 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:16 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap54b7d2e8-83: No such device
Dec 06 10:21:16 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:16Z|00232|binding|INFO|Setting lport 54b7d2e8-831a-416a-a8c9-0bf759002915 ovn-installed in OVS
Dec 06 10:21:16 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:16Z|00233|binding|INFO|Setting lport 54b7d2e8-831a-416a-a8c9-0bf759002915 up in Southbound
Dec 06 10:21:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:16.268 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:16 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap54b7d2e8-83: No such device
Dec 06 10:21:16 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap54b7d2e8-83: No such device
Dec 06 10:21:16 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap54b7d2e8-83: No such device
Dec 06 10:21:16 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap54b7d2e8-83: No such device
Dec 06 10:21:16 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap54b7d2e8-83: No such device
Dec 06 10:21:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:16.307 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:16.337 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:16 np0005548788.localdomain ceph-mon[293643]: pgmap v374: 177 pgs: 177 active+clean; 193 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 7.0 KiB/s wr, 84 op/s
Dec 06 10:21:16 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:16.658 2 INFO neutron.agent.securitygroups_rpc [None req-62279be6-b47d-4f82-9bae-6dcf35358e74 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:16 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:16.876 2 INFO neutron.agent.securitygroups_rpc [None req-c061e6e9-3e5b-41ac-9ef0-5285d101333c 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:17.120 2 INFO neutron.agent.securitygroups_rpc [None req-2a7b6fd7-f3e3-4d7a-9d68-bfb25de0babb 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:17 np0005548788.localdomain podman[322917]: 
Dec 06 10:21:17 np0005548788.localdomain podman[322917]: 2025-12-06 10:21:17.21181284 +0000 UTC m=+0.067637316 container create aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a477f5-928c-45d9-a629-53d1a6d2a99c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:17.247 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 673cf6d9-441d-4526-b09d-356afa146b47 with type ""
Dec 06 10:21:17 np0005548788.localdomain systemd[1]: Started libpod-conmon-aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589.scope.
Dec 06 10:21:17 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:17Z|00234|binding|INFO|Removing iface tap54b7d2e8-83 ovn-installed in OVS
Dec 06 10:21:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:17.248 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-b7a477f5-928c-45d9-a629-53d1a6d2a99c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7a477f5-928c-45d9-a629-53d1a6d2a99c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6abc5e9-ed08-4ca2-bfe3-5575aee152c7, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=54b7d2e8-831a-416a-a8c9-0bf759002915) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:17 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:17Z|00235|binding|INFO|Removing lport 54b7d2e8-831a-416a-a8c9-0bf759002915 ovn-installed in OVS
Dec 06 10:21:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:17.250 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 54b7d2e8-831a-416a-a8c9-0bf759002915 in datapath b7a477f5-928c-45d9-a629-53d1a6d2a99c unbound from our chassis
Dec 06 10:21:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:17.251 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b7a477f5-928c-45d9-a629-53d1a6d2a99c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:17 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:17.251 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[febed02d-80ce-4373-acce-476b5c9de56c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:17 np0005548788.localdomain podman[322917]: 2025-12-06 10:21:17.187494817 +0000 UTC m=+0.043319303 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:17.291 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:17.292 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:17.300 2 INFO neutron.agent.securitygroups_rpc [None req-a7542a85-dea5-4041-a20f-6eded131077b 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:17 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:17 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fde64c2f4f26a1b67f14ef8e4a12a2f4e8df76ebe74731da7aab55ef5c3b3ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:17 np0005548788.localdomain podman[322917]: 2025-12-06 10:21:17.332663174 +0000 UTC m=+0.188487650 container init aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a477f5-928c-45d9-a629-53d1a6d2a99c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:21:17 np0005548788.localdomain podman[322917]: 2025-12-06 10:21:17.339429333 +0000 UTC m=+0.195253809 container start aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a477f5-928c-45d9-a629-53d1a6d2a99c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:21:17 np0005548788.localdomain dnsmasq[322936]: started, version 2.85 cachesize 150
Dec 06 10:21:17 np0005548788.localdomain dnsmasq[322936]: DNS service limited to local subnets
Dec 06 10:21:17 np0005548788.localdomain dnsmasq[322936]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:17 np0005548788.localdomain dnsmasq[322936]: warning: no upstream servers configured
Dec 06 10:21:17 np0005548788.localdomain dnsmasq-dhcp[322936]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:17 np0005548788.localdomain dnsmasq[322936]: read /var/lib/neutron/dhcp/b7a477f5-928c-45d9-a629-53d1a6d2a99c/addn_hosts - 0 addresses
Dec 06 10:21:17 np0005548788.localdomain dnsmasq-dhcp[322936]: read /var/lib/neutron/dhcp/b7a477f5-928c-45d9-a629-53d1a6d2a99c/host
Dec 06 10:21:17 np0005548788.localdomain dnsmasq-dhcp[322936]: read /var/lib/neutron/dhcp/b7a477f5-928c-45d9-a629-53d1a6d2a99c/opts
Dec 06 10:21:17 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "snap_name": "2eb048c9-c193-4307-94d1-af1469cf7b46_f41ba66f-ce64-424e-90ff-4fce011eb0df", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:17 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "snap_name": "2eb048c9-c193-4307-94d1-af1469cf7b46", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:17.469 262572 INFO neutron.agent.dhcp.agent [None req-fb153376-f651-4671-9277-060e11d00732 - - - - - -] DHCP configuration for ports {'6bf27fca-60d5-473c-8b83-8c9adcbeb10d'} is completed
Dec 06 10:21:17 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:17.520 2 INFO neutron.agent.securitygroups_rpc [None req-2962579a-8cbd-4a57-b13b-f9182cbc39c2 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:17 np0005548788.localdomain dnsmasq[322936]: exiting on receipt of SIGTERM
Dec 06 10:21:17 np0005548788.localdomain podman[322954]: 2025-12-06 10:21:17.648828426 +0000 UTC m=+0.056480920 container kill aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a477f5-928c-45d9-a629-53d1a6d2a99c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:21:17 np0005548788.localdomain systemd[1]: libpod-aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589.scope: Deactivated successfully.
Dec 06 10:21:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:17.677 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:17 np0005548788.localdomain podman[322966]: 2025-12-06 10:21:17.726832093 +0000 UTC m=+0.064176459 container died aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a477f5-928c-45d9-a629-53d1a6d2a99c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:21:17 np0005548788.localdomain podman[322966]: 2025-12-06 10:21:17.760615229 +0000 UTC m=+0.097959545 container cleanup aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a477f5-928c-45d9-a629-53d1a6d2a99c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:21:17 np0005548788.localdomain systemd[1]: libpod-conmon-aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589.scope: Deactivated successfully.
Dec 06 10:21:17 np0005548788.localdomain podman[322968]: 2025-12-06 10:21:17.799683499 +0000 UTC m=+0.127998065 container remove aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a477f5-928c-45d9-a629-53d1a6d2a99c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:21:17 np0005548788.localdomain kernel: device tap54b7d2e8-83 left promiscuous mode
Dec 06 10:21:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:17.812 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:17.825 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:17.840 262572 INFO neutron.agent.dhcp.agent [None req-376451a4-1b48-4877-9765-6006cb7ab0d7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:17.841 262572 INFO neutron.agent.dhcp.agent [None req-376451a4-1b48-4877-9765-6006cb7ab0d7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:17.841 262572 INFO neutron.agent.dhcp.agent [None req-376451a4-1b48-4877-9765-6006cb7ab0d7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:18 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:18.072 2 INFO neutron.agent.securitygroups_rpc [None req-c4974b4a-fd74-4b95-bbd1-546aef178ffe 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:18 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:18.197 2 INFO neutron.agent.securitygroups_rpc [None req-3a1797f5-55dd-437d-aa3c-dfbf79d9d8b2 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:18 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:18.213 2 INFO neutron.agent.securitygroups_rpc [None req-44e28314-226f-4847-9798-44b59d6b4b35 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-8fde64c2f4f26a1b67f14ef8e4a12a2f4e8df76ebe74731da7aab55ef5c3b3ca-merged.mount: Deactivated successfully.
Dec 06 10:21:18 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa1dd6c75280b03c16cfa5e56d5221d4385caaf378ac05bc642fe95da7642589-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:18 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2db7a477f5\x2d928c\x2d45d9\x2da629\x2d53d1a6d2a99c.mount: Deactivated successfully.
Dec 06 10:21:18 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:18.337 2 INFO neutron.agent.securitygroups_rpc [None req-b55ae0b1-9a25-4f2d-ae6a-9ce8bc1e6fe6 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:18 np0005548788.localdomain ceph-mon[293643]: pgmap v375: 177 pgs: 177 active+clean; 193 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 7.0 KiB/s wr, 84 op/s
Dec 06 10:21:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:18.543 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:18 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:18.576 2 INFO neutron.agent.securitygroups_rpc [None req-b097e5fe-7591-4c88-9845-0ee25de4ff5d a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:18 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:18.598 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:19 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:19.023 2 INFO neutron.agent.securitygroups_rpc [None req-634a6650-1d18-4bf0-bb6f-8096dc9b484b a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:19 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:19.199 2 INFO neutron.agent.securitygroups_rpc [None req-8740ee6b-b668-48f2-97f8-ee50cd2a4f10 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['3e4cda00-96df-465b-a218-fdd9aa158162']
Dec 06 10:21:19 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:19.441 2 INFO neutron.agent.securitygroups_rpc [None req-44297d43-b394-4987-b7fa-1e1c5c65d7e5 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:19 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:19.481 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:19 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1072020705' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:19 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1072020705' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:21:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:21:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:21:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:21:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:21:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18738 "" "Go-http-client/1.1"
Dec 06 10:21:19 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:19.802 2 INFO neutron.agent.securitygroups_rpc [None req-ec21728d-7a8b-4886-a802-a9f2fd832d5f a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:20 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:20.073 2 INFO neutron.agent.securitygroups_rpc [None req-ad373519-aa70-4a24-9c68-f8b3d34f07d1 05cea3733946411abb747782f855ad13 e82deaff368b4feea9fec0f06459a6ca - - default default] Security group member updated ['49ffd6de-2ba3-48fb-87b6-b485622383ee']
Dec 06 10:21:20 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:20.107 2 INFO neutron.agent.securitygroups_rpc [None req-387cce7d-30a2-4fff-b469-9e054d53578e 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['2dafdbdc-1eca-4442-97f4-c504a138db8a']
Dec 06 10:21:20 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:20.154 262572 INFO neutron.agent.linux.ip_lib [None req-eb924aa0-16e1-4620-82a9-3831e972e682 - - - - - -] Device tapb9ac57c7-0a cannot be used as it has no MAC address
Dec 06 10:21:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:20.225 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:20 np0005548788.localdomain kernel: device tapb9ac57c7-0a entered promiscuous mode
Dec 06 10:21:20 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016480.2313] manager: (tapb9ac57c7-0a): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Dec 06 10:21:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:20Z|00236|binding|INFO|Claiming lport b9ac57c7-0a9b-4b9d-8429-fcba765c2166 for this chassis.
Dec 06 10:21:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:20Z|00237|binding|INFO|b9ac57c7-0a9b-4b9d-8429-fcba765c2166: Claiming unknown
Dec 06 10:21:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:20.233 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:20 np0005548788.localdomain systemd-udevd[323007]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:21:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:20.248 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-ab8b841e-5576-41c7-959b-523b68dc75b8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab8b841e-5576-41c7-959b-523b68dc75b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4db11bdf-bde4-4f32-85e6-f2aa55728a87, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=b9ac57c7-0a9b-4b9d-8429-fcba765c2166) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:20.251 159620 INFO neutron.agent.ovn.metadata.agent [-] Port b9ac57c7-0a9b-4b9d-8429-fcba765c2166 in datapath ab8b841e-5576-41c7-959b-523b68dc75b8 bound to our chassis
Dec 06 10:21:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:20.253 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 0bd04a06-e9f2-4e9d-93e8-a61efd455657 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:21:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:20.253 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ab8b841e-5576-41c7-959b-523b68dc75b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:21:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:20.254 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[05e9fb57-6d2a-4a40-8bb6-08e2357916ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb9ac57c7-0a: No such device
Dec 06 10:21:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb9ac57c7-0a: No such device
Dec 06 10:21:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:20.273 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb9ac57c7-0a: No such device
Dec 06 10:21:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:20Z|00238|binding|INFO|Setting lport b9ac57c7-0a9b-4b9d-8429-fcba765c2166 ovn-installed in OVS
Dec 06 10:21:20 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:20Z|00239|binding|INFO|Setting lport b9ac57c7-0a9b-4b9d-8429-fcba765c2166 up in Southbound
Dec 06 10:21:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:20.278 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb9ac57c7-0a: No such device
Dec 06 10:21:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb9ac57c7-0a: No such device
Dec 06 10:21:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb9ac57c7-0a: No such device
Dec 06 10:21:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb9ac57c7-0a: No such device
Dec 06 10:21:20 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb9ac57c7-0a: No such device
Dec 06 10:21:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:20.303 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:20.333 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e179 do_prune osdmap full prune enabled
Dec 06 10:21:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e180 e180: 6 total, 6 up, 6 in
Dec 06 10:21:20 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e180: 6 total, 6 up, 6 in
Dec 06 10:21:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "format": "json"}]: dispatch
Dec 06 10:21:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:20 np0005548788.localdomain ceph-mon[293643]: pgmap v376: 177 pgs: 177 active+clean; 193 MiB data, 909 MiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 14 KiB/s wr, 87 op/s
Dec 06 10:21:20 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:20.586 2 INFO neutron.agent.securitygroups_rpc [None req-d0a643e4-c9dc-4942-bc89-0b7141ebc4ce 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['2dafdbdc-1eca-4442-97f4-c504a138db8a']
Dec 06 10:21:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:20.787 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:20 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:20.879 2 INFO neutron.agent.securitygroups_rpc [None req-46e2ac08-cde4-4c43-95df-6267eb9b2508 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:20 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:20.909 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:21 np0005548788.localdomain podman[323078]: 
Dec 06 10:21:21 np0005548788.localdomain podman[323078]: 2025-12-06 10:21:21.237763025 +0000 UTC m=+0.096497891 container create 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:21 np0005548788.localdomain systemd[1]: Started libpod-conmon-9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82.scope.
Dec 06 10:21:21 np0005548788.localdomain podman[323078]: 2025-12-06 10:21:21.187380704 +0000 UTC m=+0.046115590 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:21 np0005548788.localdomain systemd[1]: tmp-crun.2Razbp.mount: Deactivated successfully.
Dec 06 10:21:21 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:21 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f28faa4d14b25f4efd1879622fad4a51cf0c5cdee34a8ac95a3345e970df499/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:21 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:21.310 2 INFO neutron.agent.securitygroups_rpc [None req-7161a7af-e2d1-4aef-85f9-991736510e62 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['9ec4be56-d6a3-492b-8ed5-b0e035114ef3']
Dec 06 10:21:21 np0005548788.localdomain podman[323078]: 2025-12-06 10:21:21.317638738 +0000 UTC m=+0.176373604 container init 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:21:21 np0005548788.localdomain podman[323078]: 2025-12-06 10:21:21.326356488 +0000 UTC m=+0.185091354 container start 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:21 np0005548788.localdomain dnsmasq[323096]: started, version 2.85 cachesize 150
Dec 06 10:21:21 np0005548788.localdomain dnsmasq[323096]: DNS service limited to local subnets
Dec 06 10:21:21 np0005548788.localdomain dnsmasq[323096]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:21 np0005548788.localdomain dnsmasq[323096]: warning: no upstream servers configured
Dec 06 10:21:21 np0005548788.localdomain dnsmasq-dhcp[323096]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:21:21 np0005548788.localdomain dnsmasq[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/addn_hosts - 0 addresses
Dec 06 10:21:21 np0005548788.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/host
Dec 06 10:21:21 np0005548788.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/opts
Dec 06 10:21:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:21.388 262572 INFO neutron.agent.dhcp.agent [None req-28c4ac7e-365f-46f0-bb34-7b1d03574651 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:19Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6776850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6776af0>], id=ef58cb89-4e32-47b7-af5e-b802a5e2a149, ip_allocation=immediate, mac_address=fa:16:3e:ed:3a:97, name=tempest-RoutersTest-1589124864, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:17Z, description=, dns_domain=, id=ab8b841e-5576-41c7-959b-523b68dc75b8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1527093647, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29322, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2730, status=ACTIVE, subnets=['b50d0368-a322-4422-98ab-06e974213c53'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:18Z, vlan_transparent=None, network_id=ab8b841e-5576-41c7-959b-523b68dc75b8, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['49ffd6de-2ba3-48fb-87b6-b485622383ee'], standard_attr_id=2755, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:19Z on network ab8b841e-5576-41c7-959b-523b68dc75b8
Dec 06 10:21:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:21.528 262572 INFO neutron.agent.dhcp.agent [None req-548249b6-05ec-4be5-87c6-35f8526cf8bb - - - - - -] DHCP configuration for ports {'5fdb567c-def7-40ca-a8af-dacfd58f0a40'} is completed
Dec 06 10:21:21 np0005548788.localdomain ceph-mon[293643]: osdmap e180: 6 total, 6 up, 6 in
Dec 06 10:21:21 np0005548788.localdomain dnsmasq[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/addn_hosts - 1 addresses
Dec 06 10:21:21 np0005548788.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/host
Dec 06 10:21:21 np0005548788.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/opts
Dec 06 10:21:21 np0005548788.localdomain podman[323114]: 2025-12-06 10:21:21.700407925 +0000 UTC m=+0.057514183 container kill 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:21:21 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:21.957 2 INFO neutron.agent.securitygroups_rpc [None req-5446130a-9f42-445f-9ada-e84f5ef55ebf 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['9ec4be56-d6a3-492b-8ed5-b0e035114ef3']
Dec 06 10:21:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:21.981 262572 INFO neutron.agent.dhcp.agent [None req-19e5e8e9-58db-49bd-918e-e9d7b1a14b85 - - - - - -] DHCP configuration for ports {'ef58cb89-4e32-47b7-af5e-b802a5e2a149'} is completed
Dec 06 10:21:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:22 np0005548788.localdomain ceph-mon[293643]: pgmap v378: 177 pgs: 177 active+clean; 193 MiB data, 909 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 14 KiB/s wr, 52 op/s
Dec 06 10:21:23 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:23.030 2 INFO neutron.agent.securitygroups_rpc [None req-027b5fbe-7eaf-4a47-82ae-b4c37a4f7304 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:23 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:23.030 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:19Z, description=, device_id=8b38719e-ca9e-4066-93be-bce325317d7d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c66377f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6637a00>], id=ef58cb89-4e32-47b7-af5e-b802a5e2a149, ip_allocation=immediate, mac_address=fa:16:3e:ed:3a:97, name=tempest-RoutersTest-1589124864, network_id=ab8b841e-5576-41c7-959b-523b68dc75b8, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['49ffd6de-2ba3-48fb-87b6-b485622383ee'], standard_attr_id=2755, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:20Z on network ab8b841e-5576-41c7-959b-523b68dc75b8
Dec 06 10:21:23 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:23.086 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:21:23 np0005548788.localdomain dnsmasq[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/addn_hosts - 1 addresses
Dec 06 10:21:23 np0005548788.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/host
Dec 06 10:21:23 np0005548788.localdomain podman[323152]: 2025-12-06 10:21:23.274850494 +0000 UTC m=+0.065544812 container kill 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:23 np0005548788.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/opts
Dec 06 10:21:23 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:23.385 2 INFO neutron.agent.securitygroups_rpc [None req-045a3cff-be52-4b3f-bacc-9f67cad8a71e 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:23.545 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:23 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:23.550 262572 INFO neutron.agent.dhcp.agent [None req-ce6b8a2f-9616-4e31-8d78-6592b39388f7 - - - - - -] DHCP configuration for ports {'ef58cb89-4e32-47b7-af5e-b802a5e2a149'} is completed
Dec 06 10:21:23 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1202839971' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:23 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1202839971' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:23 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:23.766 2 INFO neutron.agent.securitygroups_rpc [None req-7256d2df-f723-495d-9fc4-3babfe884545 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:24 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:24.018 2 INFO neutron.agent.securitygroups_rpc [None req-41802285-db31-4507-ae87-65b8788f6146 05cea3733946411abb747782f855ad13 e82deaff368b4feea9fec0f06459a6ca - - default default] Security group member updated ['49ffd6de-2ba3-48fb-87b6-b485622383ee']
Dec 06 10:21:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:21:24 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:24.195 2 INFO neutron.agent.securitygroups_rpc [None req-2554a227-dcf4-4ba9-820f-f0670d58a0fd 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:24 np0005548788.localdomain podman[323189]: 2025-12-06 10:21:24.263443976 +0000 UTC m=+0.091743633 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 06 10:21:24 np0005548788.localdomain podman[323189]: 2025-12-06 10:21:24.279807362 +0000 UTC m=+0.108107019 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:21:24 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:21:24 np0005548788.localdomain dnsmasq[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/addn_hosts - 0 addresses
Dec 06 10:21:24 np0005548788.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/host
Dec 06 10:21:24 np0005548788.localdomain podman[323197]: 2025-12-06 10:21:24.31459314 +0000 UTC m=+0.120845424 container kill 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:21:24 np0005548788.localdomain dnsmasq-dhcp[323096]: read /var/lib/neutron/dhcp/ab8b841e-5576-41c7-959b-523b68dc75b8/opts
Dec 06 10:21:24 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:24.513 2 INFO neutron.agent.securitygroups_rpc [None req-59f267c4-4d57-4206-9795-2fa0d0bba04b a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:24.565 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:24 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:24Z|00240|binding|INFO|Releasing lport b9ac57c7-0a9b-4b9d-8429-fcba765c2166 from this chassis (sb_readonly=0)
Dec 06 10:21:24 np0005548788.localdomain kernel: device tapb9ac57c7-0a left promiscuous mode
Dec 06 10:21:24 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:24Z|00241|binding|INFO|Setting lport b9ac57c7-0a9b-4b9d-8429-fcba765c2166 down in Southbound
Dec 06 10:21:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:24.591 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:24 np0005548788.localdomain ceph-mon[293643]: pgmap v379: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 16 KiB/s wr, 38 op/s
Dec 06 10:21:24 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:24.603 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-ab8b841e-5576-41c7-959b-523b68dc75b8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab8b841e-5576-41c7-959b-523b68dc75b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4db11bdf-bde4-4f32-85e6-f2aa55728a87, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=b9ac57c7-0a9b-4b9d-8429-fcba765c2166) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:24 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:24.605 159620 INFO neutron.agent.ovn.metadata.agent [-] Port b9ac57c7-0a9b-4b9d-8429-fcba765c2166 in datapath ab8b841e-5576-41c7-959b-523b68dc75b8 unbound from our chassis
Dec 06 10:21:24 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:24.607 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ab8b841e-5576-41c7-959b-523b68dc75b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:21:24 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:24.608 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[99cd61f0-ef62-4720-8c47-987d9b11ac28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:25 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:25.010 2 INFO neutron.agent.securitygroups_rpc [None req-e2ff1412-56d0-4e73-b6d0-7b3abe3cbdda 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:25 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:25.310 2 INFO neutron.agent.securitygroups_rpc [None req-502d7035-1806-4b5f-89df-e87b8d86a05e 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:25 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:25.524 2 INFO neutron.agent.securitygroups_rpc [None req-4635dd6b-d978-4577-967f-9e6194773640 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:25.826 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:25 np0005548788.localdomain dnsmasq[323096]: exiting on receipt of SIGTERM
Dec 06 10:21:25 np0005548788.localdomain podman[323249]: 2025-12-06 10:21:25.857544264 +0000 UTC m=+0.099028140 container kill 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:21:25 np0005548788.localdomain systemd[1]: libpod-9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82.scope: Deactivated successfully.
Dec 06 10:21:25 np0005548788.localdomain podman[323261]: 2025-12-06 10:21:25.929054398 +0000 UTC m=+0.059946937 container died 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:21:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:21:25 np0005548788.localdomain systemd[1]: tmp-crun.a3hGkx.mount: Deactivated successfully.
Dec 06 10:21:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:25 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-1f28faa4d14b25f4efd1879622fad4a51cf0c5cdee34a8ac95a3345e970df499-merged.mount: Deactivated successfully.
Dec 06 10:21:25 np0005548788.localdomain podman[323261]: 2025-12-06 10:21:25.982421401 +0000 UTC m=+0.113313890 container cleanup 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:25 np0005548788.localdomain systemd[1]: libpod-conmon-9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82.scope: Deactivated successfully.
Dec 06 10:21:26 np0005548788.localdomain podman[323269]: 2025-12-06 10:21:26.003397621 +0000 UTC m=+0.120031109 container remove 9de9694d3a70cf51d40432879f7b1cd03bb4d6cf551bf6587fc3b2b86fbe4c82 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ab8b841e-5576-41c7-959b-523b68dc75b8, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:26.028 2 INFO neutron.agent.securitygroups_rpc [None req-47fed9fe-9d6c-4fec-9c42-e4b99f91a654 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:26 np0005548788.localdomain podman[323289]: 2025-12-06 10:21:26.047354522 +0000 UTC m=+0.076336115 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:26 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:26.059 262572 INFO neutron.agent.dhcp.agent [None req-d7e50081-2626-494f-ae6f-3bd2c378091f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:26 np0005548788.localdomain podman[323289]: 2025-12-06 10:21:26.079721085 +0000 UTC m=+0.108702648 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 10:21:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:26.085 2 INFO neutron.agent.securitygroups_rpc [None req-23962b7d-779d-4700-97cd-a1c3604fb216 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['4227525c-3196-4e1b-83f0-62a3222dd04d']
Dec 06 10:21:26 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:21:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:21:26 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:26.135 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:26 np0005548788.localdomain podman[323310]: 2025-12-06 10:21:26.208793263 +0000 UTC m=+0.087623225 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:21:26 np0005548788.localdomain podman[323310]: 2025-12-06 10:21:26.220519966 +0000 UTC m=+0.099349958 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:21:26 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:21:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:26.430 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:26 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:26.542 2 INFO neutron.agent.securitygroups_rpc [None req-79d1c16c-d523-499a-9021-4bdb2b87f9e2 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:26 np0005548788.localdomain ceph-mon[293643]: pgmap v380: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 16 KiB/s wr, 38 op/s
Dec 06 10:21:26 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/921077461' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:26 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/921077461' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:26 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2dab8b841e\x2d5576\x2d41c7\x2d959b\x2d523b68dc75b8.mount: Deactivated successfully.
Dec 06 10:21:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e180 do_prune osdmap full prune enabled
Dec 06 10:21:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e181 e181: 6 total, 6 up, 6 in
Dec 06 10:21:27 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e181: 6 total, 6 up, 6 in
Dec 06 10:21:28 np0005548788.localdomain ceph-mon[293643]: pgmap v381: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 16 KiB/s wr, 38 op/s
Dec 06 10:21:28 np0005548788.localdomain ceph-mon[293643]: osdmap e181: 6 total, 6 up, 6 in
Dec 06 10:21:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:28.577 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:21:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "586fac81-a866-4e62-9321-4e3fa9e20434", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "586fac81-a866-4e62-9321-4e3fa9e20434", "format": "json"}]: dispatch
Dec 06 10:21:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:30 np0005548788.localdomain ceph-mon[293643]: pgmap v383: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 16 KiB/s wr, 61 op/s
Dec 06 10:21:30 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:30.492 262572 INFO neutron.agent.linux.ip_lib [None req-0f0f8eba-2991-4f1c-b57f-354f3bb66118 - - - - - -] Device tapa4757746-6f cannot be used as it has no MAC address
Dec 06 10:21:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:30.518 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:30 np0005548788.localdomain kernel: device tapa4757746-6f entered promiscuous mode
Dec 06 10:21:30 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:30Z|00242|binding|INFO|Claiming lport a4757746-6f4b-433b-990d-b4fde8a525cd for this chassis.
Dec 06 10:21:30 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016490.5300] manager: (tapa4757746-6f): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Dec 06 10:21:30 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:30Z|00243|binding|INFO|a4757746-6f4b-433b-990d-b4fde8a525cd: Claiming unknown
Dec 06 10:21:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:30.530 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:30 np0005548788.localdomain systemd-udevd[323344]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:21:30 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa4757746-6f: No such device
Dec 06 10:21:30 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa4757746-6f: No such device
Dec 06 10:21:30 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:30Z|00244|binding|INFO|Setting lport a4757746-6f4b-433b-990d-b4fde8a525cd ovn-installed in OVS
Dec 06 10:21:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:30.566 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:30 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa4757746-6f: No such device
Dec 06 10:21:30 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa4757746-6f: No such device
Dec 06 10:21:30 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa4757746-6f: No such device
Dec 06 10:21:30 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa4757746-6f: No such device
Dec 06 10:21:30 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa4757746-6f: No such device
Dec 06 10:21:30 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapa4757746-6f: No such device
Dec 06 10:21:30 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:30Z|00245|binding|INFO|Setting lport a4757746-6f4b-433b-990d-b4fde8a525cd up in Southbound
Dec 06 10:21:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:30.601 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-79f4b415-75a5-4046-90dd-c5480491214a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79f4b415-75a5-4046-90dd-c5480491214a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba99e4aa-7ce9-49b0-98c5-e5996fb977c1, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=a4757746-6f4b-433b-990d-b4fde8a525cd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:30.603 159620 INFO neutron.agent.ovn.metadata.agent [-] Port a4757746-6f4b-433b-990d-b4fde8a525cd in datapath 79f4b415-75a5-4046-90dd-c5480491214a bound to our chassis
Dec 06 10:21:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:30.604 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 79f4b415-75a5-4046-90dd-c5480491214a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:30.605 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec28962-e103-4338-91ed-bcd595abb4e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:30.610 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:30.644 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:30 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:30.647 2 INFO neutron.agent.securitygroups_rpc [None req-650d6f16-dff9-4e9f-a532-00c1d7cd9ac8 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:30.855 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:31 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3604592233' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:31 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3604592233' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:31 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:31.379 2 INFO neutron.agent.securitygroups_rpc [None req-1b25accf-a6fc-48da-91f7-6e8e14cb52bd a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:31 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:31.438 2 INFO neutron.agent.securitygroups_rpc [None req-253b8277-210a-4573-82a4-3ce2f38be71e cc9a0aebc5df40baa5d30408481c8824 5ea98fc77f0c4728a4c2d7a5429d8129 - - default default] Security group rule updated ['113d3ef2-1b05-41a6-846b-b981d95adda0']
Dec 06 10:21:31 np0005548788.localdomain podman[323415]: 
Dec 06 10:21:31 np0005548788.localdomain podman[323415]: 2025-12-06 10:21:31.623856737 +0000 UTC m=+0.093400074 container create c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:21:31 np0005548788.localdomain systemd[1]: Started libpod-conmon-c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac.scope.
Dec 06 10:21:31 np0005548788.localdomain podman[323415]: 2025-12-06 10:21:31.581245757 +0000 UTC m=+0.050789134 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:31 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:31 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48a5c4e1010302bf07bd30b5b3b894796291ee3387fc6979ce67fd48cf3bc08a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:31 np0005548788.localdomain podman[323415]: 2025-12-06 10:21:31.699083977 +0000 UTC m=+0.168627314 container init c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:31 np0005548788.localdomain podman[323415]: 2025-12-06 10:21:31.708580441 +0000 UTC m=+0.178123798 container start c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:31 np0005548788.localdomain dnsmasq[323434]: started, version 2.85 cachesize 150
Dec 06 10:21:31 np0005548788.localdomain dnsmasq[323434]: DNS service limited to local subnets
Dec 06 10:21:31 np0005548788.localdomain dnsmasq[323434]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:31 np0005548788.localdomain dnsmasq[323434]: warning: no upstream servers configured
Dec 06 10:21:31 np0005548788.localdomain dnsmasq-dhcp[323434]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:31 np0005548788.localdomain dnsmasq[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/addn_hosts - 0 addresses
Dec 06 10:21:31 np0005548788.localdomain dnsmasq-dhcp[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/host
Dec 06 10:21:31 np0005548788.localdomain dnsmasq-dhcp[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/opts
Dec 06 10:21:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:31.770 262572 INFO neutron.agent.dhcp.agent [None req-0f0f8eba-2991-4f1c-b57f-354f3bb66118 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:29Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c667c3a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c667c670>], id=70ebcf45-3259-4076-a3e5-ed5914f79cb0, ip_allocation=immediate, mac_address=fa:16:3e:96:8c:a4, name=tempest-PortsIpV6TestJSON-500481643, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:28Z, description=, dns_domain=, id=79f4b415-75a5-4046-90dd-c5480491214a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1324934681, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20853, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2835, status=ACTIVE, subnets=['78a25e7e-f2ab-47bd-ac2b-75049234b37b'], tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:29Z, vlan_transparent=None, network_id=79f4b415-75a5-4046-90dd-c5480491214a, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f89459b7-5955-49a9-980d-ccf671c641e2'], standard_attr_id=2846, status=DOWN, tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:30Z on network 79f4b415-75a5-4046-90dd-c5480491214a
Dec 06 10:21:31 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:31.867 262572 INFO neutron.agent.dhcp.agent [None req-78d7ebfa-f7b3-4610-82a1-c8bfd574bce3 - - - - - -] DHCP configuration for ports {'81f242c3-ac32-4156-adc4-8c2ef1dfc8cf'} is completed
Dec 06 10:21:31 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:31.888 2 INFO neutron.agent.securitygroups_rpc [None req-0a2b9713-3499-411d-9f50-733d3731fca5 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:31 np0005548788.localdomain dnsmasq[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/addn_hosts - 1 addresses
Dec 06 10:21:31 np0005548788.localdomain podman[323453]: 2025-12-06 10:21:31.984301702 +0000 UTC m=+0.063919311 container kill c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:21:31 np0005548788.localdomain dnsmasq-dhcp[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/host
Dec 06 10:21:31 np0005548788.localdomain dnsmasq-dhcp[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/opts
Dec 06 10:21:32 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:32.174 262572 INFO neutron.agent.dhcp.agent [None req-0f0f8eba-2991-4f1c-b57f-354f3bb66118 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:31Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c676fdf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c676fa90>], id=18ffa2c0-bd6e-4de7-87c5-b788ac670883, ip_allocation=immediate, mac_address=fa:16:3e:1e:47:56, name=tempest-PortsIpV6TestJSON-260128295, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:28Z, description=, dns_domain=, id=79f4b415-75a5-4046-90dd-c5480491214a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1324934681, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20853, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2835, status=ACTIVE, subnets=['78a25e7e-f2ab-47bd-ac2b-75049234b37b'], tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:29Z, vlan_transparent=None, network_id=79f4b415-75a5-4046-90dd-c5480491214a, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f89459b7-5955-49a9-980d-ccf671c641e2'], standard_attr_id=2855, status=DOWN, tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:31Z on network 79f4b415-75a5-4046-90dd-c5480491214a
Dec 06 10:21:32 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:32.212 262572 INFO neutron.agent.dhcp.agent [None req-ef36126a-63cd-4b47-a6ac-97e6cd1a72e8 - - - - - -] DHCP configuration for ports {'70ebcf45-3259-4076-a3e5-ed5914f79cb0'} is completed
Dec 06 10:21:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:32 np0005548788.localdomain dnsmasq[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/addn_hosts - 2 addresses
Dec 06 10:21:32 np0005548788.localdomain dnsmasq-dhcp[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/host
Dec 06 10:21:32 np0005548788.localdomain podman[323491]: 2025-12-06 10:21:32.388471911 +0000 UTC m=+0.062431184 container kill c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:21:32 np0005548788.localdomain dnsmasq-dhcp[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/opts
Dec 06 10:21:32 np0005548788.localdomain ceph-mon[293643]: pgmap v384: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 14 KiB/s wr, 53 op/s
Dec 06 10:21:32 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:32.510 2 INFO neutron.agent.securitygroups_rpc [None req-9523440b-4459-4eba-b7c1-0671f42e7aa4 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:32 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:32.629 262572 INFO neutron.agent.dhcp.agent [None req-2018d173-68a1-4eac-90d0-9fbac3504669 - - - - - -] DHCP configuration for ports {'18ffa2c0-bd6e-4de7-87c5-b788ac670883'} is completed
Dec 06 10:21:32 np0005548788.localdomain podman[323530]: 2025-12-06 10:21:32.788548184 +0000 UTC m=+0.058845544 container kill c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:21:32 np0005548788.localdomain dnsmasq[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/addn_hosts - 1 addresses
Dec 06 10:21:32 np0005548788.localdomain dnsmasq-dhcp[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/host
Dec 06 10:21:32 np0005548788.localdomain dnsmasq-dhcp[323434]: read /var/lib/neutron/dhcp/79f4b415-75a5-4046-90dd-c5480491214a/opts
Dec 06 10:21:33 np0005548788.localdomain systemd[1]: tmp-crun.pm78Gn.mount: Deactivated successfully.
Dec 06 10:21:33 np0005548788.localdomain dnsmasq[323434]: exiting on receipt of SIGTERM
Dec 06 10:21:33 np0005548788.localdomain podman[323569]: 2025-12-06 10:21:33.277298392 +0000 UTC m=+0.071751302 container kill c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:33 np0005548788.localdomain systemd[1]: libpod-c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac.scope: Deactivated successfully.
Dec 06 10:21:33 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:33Z|00246|binding|INFO|Removing iface tapa4757746-6f ovn-installed in OVS
Dec 06 10:21:33 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:33Z|00247|binding|INFO|Removing lport a4757746-6f4b-433b-990d-b4fde8a525cd ovn-installed in OVS
Dec 06 10:21:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:33.318 159620 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 6a60dbef-7c21-4480-afba-662da681d0a6 with type ""
Dec 06 10:21:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:33.322 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-79f4b415-75a5-4046-90dd-c5480491214a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79f4b415-75a5-4046-90dd-c5480491214a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba99e4aa-7ce9-49b0-98c5-e5996fb977c1, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=a4757746-6f4b-433b-990d-b4fde8a525cd) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:33.326 159620 INFO neutron.agent.ovn.metadata.agent [-] Port a4757746-6f4b-433b-990d-b4fde8a525cd in datapath 79f4b415-75a5-4046-90dd-c5480491214a unbound from our chassis
Dec 06 10:21:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:33.328 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 79f4b415-75a5-4046-90dd-c5480491214a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:33.329 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[30992b12-72dc-4dbd-9772-2a6e26bb8bb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:33 np0005548788.localdomain podman[323584]: 2025-12-06 10:21:33.354449803 +0000 UTC m=+0.059185204 container died c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:21:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:33.355 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "586fac81-a866-4e62-9321-4e3fa9e20434", "format": "json"}]: dispatch
Dec 06 10:21:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "586fac81-a866-4e62-9321-4e3fa9e20434", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:33 np0005548788.localdomain podman[323584]: 2025-12-06 10:21:33.440631422 +0000 UTC m=+0.145366833 container cleanup c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:21:33 np0005548788.localdomain systemd[1]: libpod-conmon-c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac.scope: Deactivated successfully.
Dec 06 10:21:33 np0005548788.localdomain podman[323585]: 2025-12-06 10:21:33.468094022 +0000 UTC m=+0.162123333 container remove c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79f4b415-75a5-4046-90dd-c5480491214a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:21:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:33.481 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:33 np0005548788.localdomain kernel: device tapa4757746-6f left promiscuous mode
Dec 06 10:21:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:33.502 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:33 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:33.529 262572 INFO neutron.agent.dhcp.agent [None req-e9350418-22e7-47ae-b1b3-b713d315bae2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:33 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:33.530 262572 INFO neutron.agent.dhcp.agent [None req-e9350418-22e7-47ae-b1b3-b713d315bae2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:33.579 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:33 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:33.602 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-48a5c4e1010302bf07bd30b5b3b894796291ee3387fc6979ce67fd48cf3bc08a-merged.mount: Deactivated successfully.
Dec 06 10:21:33 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c9939aecdbe3b5ed2fa155ab888423edba3b90b229c9bef4cbf033d100e092ac-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:33 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d79f4b415\x2d75a5\x2d4046\x2d90dd\x2dc5480491214a.mount: Deactivated successfully.
Dec 06 10:21:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:33.767 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:34 np0005548788.localdomain ceph-mon[293643]: pgmap v385: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 15 KiB/s wr, 42 op/s
Dec 06 10:21:34 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/620957375' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:34 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/620957375' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e181 do_prune osdmap full prune enabled
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e182 e182: 6 total, 6 up, 6 in
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2748713948' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e182: 6 total, 6 up, 6 in
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:35 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:35.707 2 INFO neutron.agent.securitygroups_rpc [None req-b9509f86-2a19-49fe-82cf-8c23b9e8fca9 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:35.897 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:21:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e182 do_prune osdmap full prune enabled
Dec 06 10:21:36 np0005548788.localdomain ceph-mon[293643]: pgmap v386: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 15 KiB/s wr, 42 op/s
Dec 06 10:21:36 np0005548788.localdomain ceph-mon[293643]: osdmap e182: 6 total, 6 up, 6 in
Dec 06 10:21:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:21:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e183 e183: 6 total, 6 up, 6 in
Dec 06 10:21:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e183: 6 total, 6 up, 6 in
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e183 do_prune osdmap full prune enabled
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4689547c-b07c-49f8-ab25-594d5c576c89", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4689547c-b07c-49f8-ab25-594d5c576c89", "format": "json"}]: dispatch
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: osdmap e183: 6 total, 6 up, 6 in
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2846497516' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1607464631' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e184 e184: 6 total, 6 up, 6 in
Dec 06 10:21:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e184: 6 total, 6 up, 6 in
Dec 06 10:21:38 np0005548788.localdomain sshd[323612]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:21:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e184 do_prune osdmap full prune enabled
Dec 06 10:21:38 np0005548788.localdomain ceph-mon[293643]: pgmap v389: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 29 op/s
Dec 06 10:21:38 np0005548788.localdomain ceph-mon[293643]: osdmap e184: 6 total, 6 up, 6 in
Dec 06 10:21:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e185 e185: 6 total, 6 up, 6 in
Dec 06 10:21:38 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e185: 6 total, 6 up, 6 in
Dec 06 10:21:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:38.580 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:21:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:21:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:21:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:21:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:21:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:21:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:21:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:21:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/929112309' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/929112309' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:39 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:21:39 np0005548788.localdomain podman[323614]: 2025-12-06 10:21:39.263980251 +0000 UTC m=+0.083728794 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:39 np0005548788.localdomain podman[323614]: 2025-12-06 10:21:39.318628524 +0000 UTC m=+0.138377027 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Dec 06 10:21:39 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:21:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:39.378 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:39.378 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:21:39 np0005548788.localdomain ceph-mon[293643]: osdmap e185: 6 total, 6 up, 6 in
Dec 06 10:21:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/929112309' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/929112309' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:39 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:39.903 2 INFO neutron.agent.securitygroups_rpc [None req-07bddad5-b128-4210-8a04-c1adeb45eb18 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:40.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e185 do_prune osdmap full prune enabled
Dec 06 10:21:40 np0005548788.localdomain ceph-mon[293643]: pgmap v392: 177 pgs: 177 active+clean; 193 MiB data, 931 MiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 25 KiB/s wr, 71 op/s
Dec 06 10:21:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:21:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4689547c-b07c-49f8-ab25-594d5c576c89", "format": "json"}]: dispatch
Dec 06 10:21:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4689547c-b07c-49f8-ab25-594d5c576c89", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e186 e186: 6 total, 6 up, 6 in
Dec 06 10:21:40 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e186: 6 total, 6 up, 6 in
Dec 06 10:21:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:40.900 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:41 np0005548788.localdomain ceph-mon[293643]: osdmap e186: 6 total, 6 up, 6 in
Dec 06 10:21:41 np0005548788.localdomain sshd[323612]: Received disconnect from 45.78.194.186 port 35138:11: Bye Bye [preauth]
Dec 06 10:21:41 np0005548788.localdomain sshd[323612]: Disconnected from authenticating user root 45.78.194.186 port 35138 [preauth]
Dec 06 10:21:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:42.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:42.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e186 do_prune osdmap full prune enabled
Dec 06 10:21:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e187 e187: 6 total, 6 up, 6 in
Dec 06 10:21:42 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e187: 6 total, 6 up, 6 in
Dec 06 10:21:42 np0005548788.localdomain ceph-mon[293643]: pgmap v394: 177 pgs: 177 active+clean; 193 MiB data, 931 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 22 KiB/s wr, 60 op/s
Dec 06 10:21:42 np0005548788.localdomain ceph-mon[293643]: osdmap e187: 6 total, 6 up, 6 in
Dec 06 10:21:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:43.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:21:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:21:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:21:43 np0005548788.localdomain systemd[1]: tmp-crun.PlS4kb.mount: Deactivated successfully.
Dec 06 10:21:43 np0005548788.localdomain podman[323640]: 2025-12-06 10:21:43.27744608 +0000 UTC m=+0.098074729 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:21:43 np0005548788.localdomain podman[323641]: 2025-12-06 10:21:43.310738011 +0000 UTC m=+0.127941554 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Dec 06 10:21:43 np0005548788.localdomain podman[323640]: 2025-12-06 10:21:43.31812481 +0000 UTC m=+0.138753499 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:21:43 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:21:43 np0005548788.localdomain podman[323641]: 2025-12-06 10:21:43.375106885 +0000 UTC m=+0.192310468 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, 
release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Dec 06 10:21:43 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:21:43 np0005548788.localdomain podman[323639]: 2025-12-06 10:21:43.376440726 +0000 UTC m=+0.199914583 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:43 np0005548788.localdomain podman[323639]: 2025-12-06 10:21:43.462854213 +0000 UTC m=+0.286328090 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125)
Dec 06 10:21:43 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:21:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:43.582 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:43.599 262572 INFO neutron.agent.linux.ip_lib [None req-ade9c51f-beb1-4eb4-95c0-afcdac080b45 - - - - - -] Device tapf92ead08-1c cannot be used as it has no MAC address
Dec 06 10:21:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:43.625 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:43 np0005548788.localdomain kernel: device tapf92ead08-1c entered promiscuous mode
Dec 06 10:21:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:43.635 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:43 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016503.6362] manager: (tapf92ead08-1c): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Dec 06 10:21:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:43Z|00248|binding|INFO|Claiming lport f92ead08-1ccb-4037-aa51-866ea98cc833 for this chassis.
Dec 06 10:21:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:43Z|00249|binding|INFO|f92ead08-1ccb-4037-aa51-866ea98cc833: Claiming unknown
Dec 06 10:21:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:43.651 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-8f6f0031-6aba-4497-bf55-cc7008fa0e7e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f6f0031-6aba-4497-bf55-cc7008fa0e7e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8baea40-88ff-45fd-b627-b6a55f59f6f6, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=f92ead08-1ccb-4037-aa51-866ea98cc833) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:43.653 159620 INFO neutron.agent.ovn.metadata.agent [-] Port f92ead08-1ccb-4037-aa51-866ea98cc833 in datapath 8f6f0031-6aba-4497-bf55-cc7008fa0e7e bound to our chassis
Dec 06 10:21:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:43.653 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f6f0031-6aba-4497-bf55-cc7008fa0e7e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:43.654 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[d3882216-075f-451c-97e3-d7cec94595be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:43.677 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:43Z|00250|binding|INFO|Setting lport f92ead08-1ccb-4037-aa51-866ea98cc833 ovn-installed in OVS
Dec 06 10:21:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:43Z|00251|binding|INFO|Setting lport f92ead08-1ccb-4037-aa51-866ea98cc833 up in Southbound
Dec 06 10:21:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:43.680 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e50: np0005548790.kvkfyr(active, since 10m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:21:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "snap_name": "802dfa0a-e026-43d4-b3ba-67c6242d0ddb_a3496f8c-73ba-4d3a-8401-faf81aff8654", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "snap_name": "802dfa0a-e026-43d4-b3ba-67c6242d0ddb", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/518841331' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/518841331' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:43.733 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:43.769 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:44 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:44.225 262572 INFO neutron.agent.linux.ip_lib [None req-19d0c1e4-67e0-452f-8b4b-8a33bbd9611e - - - - - -] Device tap11c65192-e3 cannot be used as it has no MAC address
Dec 06 10:21:44 np0005548788.localdomain systemd[1]: tmp-crun.mlOLBC.mount: Deactivated successfully.
Dec 06 10:21:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:44.275 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:44 np0005548788.localdomain kernel: device tap11c65192-e3 entered promiscuous mode
Dec 06 10:21:44 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016504.2814] manager: (tap11c65192-e3): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Dec 06 10:21:44 np0005548788.localdomain systemd-udevd[323709]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:21:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:44Z|00252|binding|INFO|Claiming lport 11c65192-e355-4348-916d-c405f800f3fc for this chassis.
Dec 06 10:21:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:44Z|00253|binding|INFO|11c65192-e355-4348-916d-c405f800f3fc: Claiming unknown
Dec 06 10:21:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:44.283 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:44.294 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-368e35d9-76eb-4980-95bb-4c79010f8e1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-368e35d9-76eb-4980-95bb-4c79010f8e1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd1c979900294beeb6f273c0e1a6333a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8955b85-2eef-4be8-98c9-809745805d25, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=11c65192-e355-4348-916d-c405f800f3fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:44.297 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 11c65192-e355-4348-916d-c405f800f3fc in datapath 368e35d9-76eb-4980-95bb-4c79010f8e1c bound to our chassis
Dec 06 10:21:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:44.300 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port f4e7f9fb-ff9f-4907-bbb8-e94f8f1d7861 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:21:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:44.300 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 368e35d9-76eb-4980-95bb-4c79010f8e1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:21:44 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:44.301 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[dffc11b8-f7b7-47bf-85a5-eb1b777315cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:44 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:44.322 2 INFO neutron.agent.securitygroups_rpc [None req-e0373cd9-ce28-4174-b7a6-2d0d511546d5 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['ecf618e7-df48-4fb6-89e3-d9952de70569']
Dec 06 10:21:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:44Z|00254|binding|INFO|Setting lport 11c65192-e355-4348-916d-c405f800f3fc ovn-installed in OVS
Dec 06 10:21:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:44Z|00255|binding|INFO|Setting lport 11c65192-e355-4348-916d-c405f800f3fc up in Southbound
Dec 06 10:21:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:44.329 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:44.383 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:44.418 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:44 np0005548788.localdomain ceph-mon[293643]: pgmap v396: 177 pgs: 177 active+clean; 193 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 52 KiB/s wr, 140 op/s
Dec 06 10:21:44 np0005548788.localdomain ceph-mon[293643]: mgrmap e50: np0005548790.kvkfyr(active, since 10m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:21:44 np0005548788.localdomain podman[323790]: 
Dec 06 10:21:44 np0005548788.localdomain podman[323790]: 2025-12-06 10:21:44.793953324 +0000 UTC m=+0.101680960 container create d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:21:44 np0005548788.localdomain systemd[1]: Started libpod-conmon-d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb.scope.
Dec 06 10:21:44 np0005548788.localdomain podman[323790]: 2025-12-06 10:21:44.747067173 +0000 UTC m=+0.054794829 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:44 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:44 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c85ab2c0ce24e5167b9eb5bd715235bb164f3317ca124684fb0c9906c51697d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:44 np0005548788.localdomain podman[323790]: 2025-12-06 10:21:44.888900795 +0000 UTC m=+0.196628421 container init d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:21:44 np0005548788.localdomain podman[323790]: 2025-12-06 10:21:44.898844274 +0000 UTC m=+0.206571900 container start d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:21:44 np0005548788.localdomain dnsmasq[323816]: started, version 2.85 cachesize 150
Dec 06 10:21:44 np0005548788.localdomain dnsmasq[323816]: DNS service limited to local subnets
Dec 06 10:21:44 np0005548788.localdomain dnsmasq[323816]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:44 np0005548788.localdomain dnsmasq[323816]: warning: no upstream servers configured
Dec 06 10:21:44 np0005548788.localdomain dnsmasq-dhcp[323816]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:44 np0005548788.localdomain dnsmasq[323816]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/addn_hosts - 0 addresses
Dec 06 10:21:44 np0005548788.localdomain dnsmasq-dhcp[323816]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/host
Dec 06 10:21:44 np0005548788.localdomain dnsmasq-dhcp[323816]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/opts
Dec 06 10:21:44 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:44.956 262572 INFO neutron.agent.dhcp.agent [None req-ade9c51f-beb1-4eb4-95c0-afcdac080b45 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:43Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c664f8b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c664fbe0>], id=6b2570c6-8494-4617-b77d-366d6b96ff7b, ip_allocation=immediate, mac_address=fa:16:3e:b9:f1:48, name=tempest-PortsIpV6TestJSON-251483394, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:02Z, description=, dns_domain=, id=8f6f0031-6aba-4497-bf55-cc7008fa0e7e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-112549492, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53883, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2632, status=ACTIVE, subnets=['ef701842-0546-4ce3-9ae9-7dda77a97278'], tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:42Z, vlan_transparent=None, network_id=8f6f0031-6aba-4497-bf55-cc7008fa0e7e, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ecf618e7-df48-4fb6-89e3-d9952de70569'], standard_attr_id=2947, status=DOWN, tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:43Z on network 8f6f0031-6aba-4497-bf55-cc7008fa0e7e
Dec 06 10:21:45 np0005548788.localdomain dnsmasq[323816]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/addn_hosts - 1 addresses
Dec 06 10:21:45 np0005548788.localdomain dnsmasq-dhcp[323816]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/host
Dec 06 10:21:45 np0005548788.localdomain podman[323839]: 2025-12-06 10:21:45.157237147 +0000 UTC m=+0.053173878 container kill d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:45 np0005548788.localdomain dnsmasq-dhcp[323816]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/opts
Dec 06 10:21:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:45.217 262572 INFO neutron.agent.dhcp.agent [None req-b5688e16-2afb-4b98-bfd5-418f16f677f5 - - - - - -] DHCP configuration for ports {'e3d26bb5-a627-44c1-951d-86a7aa4ab371', '377c93f2-4b77-4cc8-942d-efb146acede8'} is completed
Dec 06 10:21:45 np0005548788.localdomain podman[323882]: 
Dec 06 10:21:45 np0005548788.localdomain podman[323882]: 2025-12-06 10:21:45.421997467 +0000 UTC m=+0.086367535 container create 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:21:45 np0005548788.localdomain systemd[1]: Started libpod-conmon-2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9.scope.
Dec 06 10:21:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:45.465 262572 INFO neutron.agent.dhcp.agent [None req-22cf4d89-2fa4-4468-80a6-58392639fcf6 - - - - - -] DHCP configuration for ports {'6b2570c6-8494-4617-b77d-366d6b96ff7b'} is completed
Dec 06 10:21:45 np0005548788.localdomain podman[323882]: 2025-12-06 10:21:45.381255156 +0000 UTC m=+0.045625264 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:45 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:45 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/722f83c98c5f485a7cb5e9359c332465631e66d9aebdece877c0c1c999a34102/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:45 np0005548788.localdomain podman[323882]: 2025-12-06 10:21:45.501307394 +0000 UTC m=+0.165677462 container init 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:21:45 np0005548788.localdomain podman[323882]: 2025-12-06 10:21:45.515036939 +0000 UTC m=+0.179406997 container start 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:45 np0005548788.localdomain dnsmasq[323901]: started, version 2.85 cachesize 150
Dec 06 10:21:45 np0005548788.localdomain dnsmasq[323901]: DNS service limited to local subnets
Dec 06 10:21:45 np0005548788.localdomain dnsmasq[323901]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:45 np0005548788.localdomain dnsmasq[323901]: warning: no upstream servers configured
Dec 06 10:21:45 np0005548788.localdomain dnsmasq-dhcp[323901]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:21:45 np0005548788.localdomain dnsmasq[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/addn_hosts - 0 addresses
Dec 06 10:21:45 np0005548788.localdomain dnsmasq-dhcp[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/host
Dec 06 10:21:45 np0005548788.localdomain dnsmasq-dhcp[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/opts
Dec 06 10:21:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:45.726 262572 INFO neutron.agent.dhcp.agent [None req-a8279387-d089-4aea-beb5-1d62e4185f50 - - - - - -] DHCP configuration for ports {'f20cc93c-066b-414b-a765-678f86a93892'} is completed
Dec 06 10:21:45 np0005548788.localdomain dnsmasq[323816]: exiting on receipt of SIGTERM
Dec 06 10:21:45 np0005548788.localdomain podman[323919]: 2025-12-06 10:21:45.849264442 +0000 UTC m=+0.062136426 container kill d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:21:45 np0005548788.localdomain systemd[1]: libpod-d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb.scope: Deactivated successfully.
Dec 06 10:21:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:45.902 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:45 np0005548788.localdomain podman[323933]: 2025-12-06 10:21:45.910898091 +0000 UTC m=+0.042724304 container died d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:21:45 np0005548788.localdomain podman[323933]: 2025-12-06 10:21:45.950480597 +0000 UTC m=+0.082306810 container remove d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:21:46 np0005548788.localdomain systemd[1]: libpod-conmon-d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb.scope: Deactivated successfully.
Dec 06 10:21:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:46.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:46.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:21:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:46.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:21:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:46.022 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:21:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-c85ab2c0ce24e5167b9eb5bd715235bb164f3317ca124684fb0c9906c51697d2-merged.mount: Deactivated successfully.
Dec 06 10:21:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6f11ecaed87277123050cac2ad2beb668103513cef57689a6c4a033a12e2eeb-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:46 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:46.598 2 INFO neutron.agent.securitygroups_rpc [None req-f9e919cd-0c1a-4346-a33d-5b1944f06d7b a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['d5c7cc25-d8ea-40c9-b1b2-04cf074e64bb', 'ecf618e7-df48-4fb6-89e3-d9952de70569']
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.030 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.031 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.031 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.032 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.033 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:21:47 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:47.251 2 INFO neutron.agent.securitygroups_rpc [None req-cf0a2394-56b4-45a3-97ca-aedf09b47c4d a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['d5c7cc25-d8ea-40c9-b1b2-04cf074e64bb']
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e187 do_prune osdmap full prune enabled
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: pgmap v397: 177 pgs: 177 active+clean; 193 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 29 KiB/s wr, 77 op/s
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "format": "json"}]: dispatch
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3433747594' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e188 e188: 6 total, 6 up, 6 in
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e188: 6 total, 6 up, 6 in
Dec 06 10:21:47 np0005548788.localdomain podman[324028]: 
Dec 06 10:21:47 np0005548788.localdomain podman[324028]: 2025-12-06 10:21:47.398449698 +0000 UTC m=+0.087183791 container create ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:21:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:47.443 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:47.444 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:47.444 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:21:47 np0005548788.localdomain systemd[1]: Started libpod-conmon-ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035.scope.
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4095004814' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:47 np0005548788.localdomain podman[324028]: 2025-12-06 10:21:47.349686138 +0000 UTC m=+0.038420271 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:47 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:47 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81986b6175f1cffad3c1cfb430ac6cbf89daadfacde419093042e770dd8657b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.469 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:21:47 np0005548788.localdomain podman[324028]: 2025-12-06 10:21:47.478034703 +0000 UTC m=+0.166768796 container init ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:21:47 np0005548788.localdomain podman[324028]: 2025-12-06 10:21:47.484900596 +0000 UTC m=+0.173634689 container start ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:21:47 np0005548788.localdomain dnsmasq[324049]: started, version 2.85 cachesize 150
Dec 06 10:21:47 np0005548788.localdomain dnsmasq[324049]: DNS service limited to local subnets
Dec 06 10:21:47 np0005548788.localdomain dnsmasq[324049]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:47 np0005548788.localdomain dnsmasq[324049]: warning: no upstream servers configured
Dec 06 10:21:47 np0005548788.localdomain dnsmasq-dhcp[324049]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:47 np0005548788.localdomain dnsmasq-dhcp[324049]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:21:47 np0005548788.localdomain dnsmasq[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/addn_hosts - 1 addresses
Dec 06 10:21:47 np0005548788.localdomain dnsmasq-dhcp[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/host
Dec 06 10:21:47 np0005548788.localdomain dnsmasq-dhcp[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/opts
Dec 06 10:21:47 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:47.557 262572 INFO neutron.agent.dhcp.agent [None req-0f2f04fe-dfbc-47fa-9ef2-b96499bb5d47 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:43Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6650bb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6650850>], id=6b2570c6-8494-4617-b77d-366d6b96ff7b, ip_allocation=immediate, mac_address=fa:16:3e:b9:f1:48, name=tempest-PortsIpV6TestJSON-578453656, network_id=8f6f0031-6aba-4497-bf55-cc7008fa0e7e, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['d5c7cc25-d8ea-40c9-b1b2-04cf074e64bb'], standard_attr_id=2947, status=DOWN, tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:46Z on network 8f6f0031-6aba-4497-bf55-cc7008fa0e7e
Dec 06 10:21:47 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:47.589 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:47Z, description=, device_id=ce829857-8272-489e-9d8d-e074f2d58a5d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c663d430>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67652b0>], id=ce9477a7-98b7-4cc7-9a61-203aa99c9ae7, ip_allocation=immediate, mac_address=fa:16:3e:84:14:1e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:41Z, description=, dns_domain=, id=368e35d9-76eb-4980-95bb-4c79010f8e1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1371334496-network, port_security_enabled=True, project_id=fd1c979900294beeb6f273c0e1a6333a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61153, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2933, status=ACTIVE, subnets=['03d088e2-10aa-4810-b478-5bcd3b76c3e8'], tags=[], tenant_id=fd1c979900294beeb6f273c0e1a6333a, updated_at=2025-12-06T10:21:42Z, vlan_transparent=None, network_id=368e35d9-76eb-4980-95bb-4c79010f8e1c, port_security_enabled=False, project_id=fd1c979900294beeb6f273c0e1a6333a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2979, status=DOWN, tags=[], tenant_id=fd1c979900294beeb6f273c0e1a6333a, updated_at=2025-12-06T10:21:47Z on network 368e35d9-76eb-4980-95bb-4c79010f8e1c
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:21:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.689 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.691 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11424MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.691 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.692 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:47 np0005548788.localdomain dnsmasq[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/addn_hosts - 1 addresses
Dec 06 10:21:47 np0005548788.localdomain dnsmasq-dhcp[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/host
Dec 06 10:21:47 np0005548788.localdomain podman[324068]: 2025-12-06 10:21:47.772812475 +0000 UTC m=+0.058947387 container kill ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:21:47 np0005548788.localdomain dnsmasq-dhcp[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/opts
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.952 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.953 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:21:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:47.978 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:21:48 np0005548788.localdomain dnsmasq[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/addn_hosts - 1 addresses
Dec 06 10:21:48 np0005548788.localdomain podman[324118]: 2025-12-06 10:21:48.14617992 +0000 UTC m=+0.098412250 container kill 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:48 np0005548788.localdomain dnsmasq-dhcp[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/host
Dec 06 10:21:48 np0005548788.localdomain dnsmasq-dhcp[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/opts
Dec 06 10:21:48 np0005548788.localdomain dnsmasq[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/addn_hosts - 0 addresses
Dec 06 10:21:48 np0005548788.localdomain dnsmasq-dhcp[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/host
Dec 06 10:21:48 np0005548788.localdomain dnsmasq-dhcp[324049]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/opts
Dec 06 10:21:48 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:48.182 262572 INFO neutron.agent.dhcp.agent [None req-5d49b856-7450-4493-9cac-df5635ab96be - - - - - -] DHCP configuration for ports {'f92ead08-1ccb-4037-aa51-866ea98cc833', 'e3d26bb5-a627-44c1-951d-86a7aa4ab371', '377c93f2-4b77-4cc8-942d-efb146acede8', '6b2570c6-8494-4617-b77d-366d6b96ff7b'} is completed
Dec 06 10:21:48 np0005548788.localdomain podman[324134]: 2025-12-06 10:21:48.183262338 +0000 UTC m=+0.072946570 container kill ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:21:48 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:48.311 262572 INFO neutron.agent.dhcp.agent [None req-3ce7bddb-95f7-459f-8387-0594e9b4166c - - - - - -] DHCP configuration for ports {'6b2570c6-8494-4617-b77d-366d6b96ff7b'} is completed
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e188 do_prune osdmap full prune enabled
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: pgmap v398: 177 pgs: 177 active+clean; 193 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 46 KiB/s rd, 24 KiB/s wr, 64 op/s
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: osdmap e188: 6 total, 6 up, 6 in
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4095004814' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "format": "json"}]: dispatch
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3407317111' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e189 e189: 6 total, 6 up, 6 in
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e189: 6 total, 6 up, 6 in
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:21:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2519745187' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:48.413 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:21:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:48.419 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:21:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:48.437 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:21:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:48.440 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:21:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:48.441 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:21:48 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:48.457 262572 INFO neutron.agent.dhcp.agent [None req-c38eb170-526e-4039-b73f-f4ae62e3a784 - - - - - -] DHCP configuration for ports {'ce9477a7-98b7-4cc7-9a61-203aa99c9ae7'} is completed
Dec 06 10:21:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:48.620 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:48 np0005548788.localdomain dnsmasq[324049]: exiting on receipt of SIGTERM
Dec 06 10:21:48 np0005548788.localdomain podman[324201]: 2025-12-06 10:21:48.870726413 +0000 UTC m=+0.060819515 container kill ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:48 np0005548788.localdomain systemd[1]: libpod-ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035.scope: Deactivated successfully.
Dec 06 10:21:48 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:48.912 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:47Z, description=, device_id=ce829857-8272-489e-9d8d-e074f2d58a5d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6696550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6696520>], id=ce9477a7-98b7-4cc7-9a61-203aa99c9ae7, ip_allocation=immediate, mac_address=fa:16:3e:84:14:1e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:41Z, description=, dns_domain=, id=368e35d9-76eb-4980-95bb-4c79010f8e1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1371334496-network, port_security_enabled=True, project_id=fd1c979900294beeb6f273c0e1a6333a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61153, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2933, status=ACTIVE, subnets=['03d088e2-10aa-4810-b478-5bcd3b76c3e8'], tags=[], tenant_id=fd1c979900294beeb6f273c0e1a6333a, updated_at=2025-12-06T10:21:42Z, vlan_transparent=None, network_id=368e35d9-76eb-4980-95bb-4c79010f8e1c, port_security_enabled=False, project_id=fd1c979900294beeb6f273c0e1a6333a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2979, status=DOWN, tags=[], tenant_id=fd1c979900294beeb6f273c0e1a6333a, updated_at=2025-12-06T10:21:47Z on network 368e35d9-76eb-4980-95bb-4c79010f8e1c
Dec 06 10:21:48 np0005548788.localdomain podman[324215]: 2025-12-06 10:21:48.944085985 +0000 UTC m=+0.061091113 container died ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:21:48 np0005548788.localdomain podman[324215]: 2025-12-06 10:21:48.977408807 +0000 UTC m=+0.094413905 container cleanup ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:21:48 np0005548788.localdomain systemd[1]: libpod-conmon-ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035.scope: Deactivated successfully.
Dec 06 10:21:49 np0005548788.localdomain podman[324217]: 2025-12-06 10:21:49.031595476 +0000 UTC m=+0.140594996 container remove ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:21:49 np0005548788.localdomain dnsmasq[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/addn_hosts - 1 addresses
Dec 06 10:21:49 np0005548788.localdomain dnsmasq-dhcp[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/host
Dec 06 10:21:49 np0005548788.localdomain dnsmasq-dhcp[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/opts
Dec 06 10:21:49 np0005548788.localdomain podman[324261]: 2025-12-06 10:21:49.13182041 +0000 UTC m=+0.053225429 container kill 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:49 np0005548788.localdomain ceph-mon[293643]: osdmap e189: 6 total, 6 up, 6 in
Dec 06 10:21:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2519745187' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-81986b6175f1cffad3c1cfb430ac6cbf89daadfacde419093042e770dd8657b9-merged.mount: Deactivated successfully.
Dec 06 10:21:49 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ccecd953be7ee76bf9dedba8922f07c0ffb6330fb0ea2c7a4dcbaaf61c8c5035-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:49 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:49.513 262572 INFO neutron.agent.dhcp.agent [None req-580c0c98-159c-4a0f-90ec-120a65767b4b - - - - - -] DHCP configuration for ports {'ce9477a7-98b7-4cc7-9a61-203aa99c9ae7'} is completed
Dec 06 10:21:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:21:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:21:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:21:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:21:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:21:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19213 "" "Go-http-client/1.1"
Dec 06 10:21:49 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:49Z|00256|ovn_bfd|INFO|Enabled BFD on interface ovn-afa07b-0
Dec 06 10:21:49 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:49Z|00257|ovn_bfd|INFO|Enabled BFD on interface ovn-bd2a75-0
Dec 06 10:21:49 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:49Z|00258|ovn_bfd|INFO|Enabled BFD on interface ovn-ca3c1f-0
Dec 06 10:21:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:49.985 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:49.990 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:49.993 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:50.009 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:50 np0005548788.localdomain podman[324330]: 
Dec 06 10:21:50 np0005548788.localdomain podman[324330]: 2025-12-06 10:21:50.01993054 +0000 UTC m=+0.101927689 container create b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:21:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:50.033 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:50.044 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:50 np0005548788.localdomain systemd[1]: Started libpod-conmon-b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae.scope.
Dec 06 10:21:50 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:50 np0005548788.localdomain podman[324330]: 2025-12-06 10:21:49.973442079 +0000 UTC m=+0.055439278 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:50 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbca539b0b48c9d6aa612d6a6ae5206beffc5f851f856245cff0c1edc989e8d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:50 np0005548788.localdomain podman[324330]: 2025-12-06 10:21:50.090795095 +0000 UTC m=+0.172792224 container init b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:21:50 np0005548788.localdomain podman[324330]: 2025-12-06 10:21:50.104628693 +0000 UTC m=+0.186625852 container start b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:21:50 np0005548788.localdomain dnsmasq[324350]: started, version 2.85 cachesize 150
Dec 06 10:21:50 np0005548788.localdomain dnsmasq[324350]: DNS service limited to local subnets
Dec 06 10:21:50 np0005548788.localdomain dnsmasq[324350]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:50 np0005548788.localdomain dnsmasq[324350]: warning: no upstream servers configured
Dec 06 10:21:50 np0005548788.localdomain dnsmasq-dhcp[324350]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:21:50 np0005548788.localdomain dnsmasq[324350]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/addn_hosts - 0 addresses
Dec 06 10:21:50 np0005548788.localdomain dnsmasq-dhcp[324350]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/host
Dec 06 10:21:50 np0005548788.localdomain dnsmasq-dhcp[324350]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/opts
Dec 06 10:21:50 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:50.419 262572 INFO neutron.agent.dhcp.agent [None req-3523318e-3e0f-47dc-aac7-ad9c9d743a73 - - - - - -] DHCP configuration for ports {'f92ead08-1ccb-4037-aa51-866ea98cc833', 'e3d26bb5-a627-44c1-951d-86a7aa4ab371', '377c93f2-4b77-4cc8-942d-efb146acede8'} is completed
Dec 06 10:21:50 np0005548788.localdomain ceph-mon[293643]: pgmap v401: 177 pgs: 177 active+clean; 194 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 47 KiB/s wr, 78 op/s
Dec 06 10:21:50 np0005548788.localdomain podman[324367]: 2025-12-06 10:21:50.497051809 +0000 UTC m=+0.061057983 container kill b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:50 np0005548788.localdomain dnsmasq[324350]: exiting on receipt of SIGTERM
Dec 06 10:21:50 np0005548788.localdomain systemd[1]: libpod-b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae.scope: Deactivated successfully.
Dec 06 10:21:50 np0005548788.localdomain podman[324383]: 2025-12-06 10:21:50.560740092 +0000 UTC m=+0.046007237 container died b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:50 np0005548788.localdomain systemd[1]: tmp-crun.3UzQO1.mount: Deactivated successfully.
Dec 06 10:21:50 np0005548788.localdomain podman[324383]: 2025-12-06 10:21:50.60654513 +0000 UTC m=+0.091812235 container remove b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:21:50 np0005548788.localdomain systemd[1]: libpod-conmon-b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae.scope: Deactivated successfully.
Dec 06 10:21:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:50.906 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:51.058 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:51 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:51.080 2 INFO neutron.agent.securitygroups_rpc [None req-9aa7129f-1e5c-4b11-befb-2e4631e91223 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['2129cf94-a39f-4e4e-ab36-0d488acfdae6']
Dec 06 10:21:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-bbca539b0b48c9d6aa612d6a6ae5206beffc5f851f856245cff0c1edc989e8d4-merged.mount: Deactivated successfully.
Dec 06 10:21:51 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8b0073f4771e9ba0d1dfcbb7c0c4520e4e9b4370f4ebf88d34c3cdbbe28d9ae-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:51.639 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:51.819 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:52 np0005548788.localdomain podman[324455]: 
Dec 06 10:21:52 np0005548788.localdomain podman[324455]: 2025-12-06 10:21:52.019066073 +0000 UTC m=+0.084967362 container create 75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:21:52 np0005548788.localdomain systemd[1]: Started libpod-conmon-75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0.scope.
Dec 06 10:21:52 np0005548788.localdomain systemd[1]: tmp-crun.jycYrv.mount: Deactivated successfully.
Dec 06 10:21:52 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:52 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52e547fa5f3d6bb9032a781c393b7a000c05bacb0cae79cc2c74410ca876cd23/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:52 np0005548788.localdomain podman[324455]: 2025-12-06 10:21:51.979648582 +0000 UTC m=+0.045549921 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:52 np0005548788.localdomain podman[324455]: 2025-12-06 10:21:52.085900004 +0000 UTC m=+0.151801303 container init 75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:21:52 np0005548788.localdomain podman[324455]: 2025-12-06 10:21:52.091322062 +0000 UTC m=+0.157223351 container start 75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:52 np0005548788.localdomain dnsmasq[324474]: started, version 2.85 cachesize 150
Dec 06 10:21:52 np0005548788.localdomain dnsmasq[324474]: DNS service limited to local subnets
Dec 06 10:21:52 np0005548788.localdomain dnsmasq[324474]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:52 np0005548788.localdomain dnsmasq[324474]: warning: no upstream servers configured
Dec 06 10:21:52 np0005548788.localdomain dnsmasq-dhcp[324474]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:52 np0005548788.localdomain dnsmasq-dhcp[324474]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:21:52 np0005548788.localdomain dnsmasq[324474]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/addn_hosts - 0 addresses
Dec 06 10:21:52 np0005548788.localdomain dnsmasq-dhcp[324474]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/host
Dec 06 10:21:52 np0005548788.localdomain dnsmasq-dhcp[324474]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/opts
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e189 do_prune osdmap full prune enabled
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e190 e190: 6 total, 6 up, 6 in
Dec 06 10:21:52 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:52.443 262572 INFO neutron.agent.dhcp.agent [None req-488b729a-f0ba-4b50-829d-c6c2438997f5 - - - - - -] DHCP configuration for ports {'f92ead08-1ccb-4037-aa51-866ea98cc833', 'e3d26bb5-a627-44c1-951d-86a7aa4ab371', '377c93f2-4b77-4cc8-942d-efb146acede8'} is completed
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e190: 6 total, 6 up, 6 in
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "snap_name": "34ca82e6-47fe-4edf-98f0-b49a22c3b971", "format": "json"}]: dispatch
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: pgmap v402: 177 pgs: 177 active+clean; 194 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 17 KiB/s wr, 3 op/s
Dec 06 10:21:52 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:52 np0005548788.localdomain dnsmasq[324474]: exiting on receipt of SIGTERM
Dec 06 10:21:52 np0005548788.localdomain podman[324492]: 2025-12-06 10:21:52.45834032 +0000 UTC m=+0.055510440 container kill 75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:52 np0005548788.localdomain systemd[1]: libpod-75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0.scope: Deactivated successfully.
Dec 06 10:21:52 np0005548788.localdomain podman[324508]: 2025-12-06 10:21:52.527340498 +0000 UTC m=+0.046701608 container died 75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:52 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:52 np0005548788.localdomain podman[324508]: 2025-12-06 10:21:52.58033688 +0000 UTC m=+0.099697940 container remove 75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:21:52 np0005548788.localdomain systemd[1]: libpod-conmon-75ab250acae95e82186557adb543432e4c16e80c92513f94e148bc3e2593a0a0.scope: Deactivated successfully.
Dec 06 10:21:52 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:52.771 2 INFO neutron.agent.securitygroups_rpc [None req-1911ca26-173a-428d-a937-0c8a9ec17a18 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['2129cf94-a39f-4e4e-ab36-0d488acfdae6', '398ea86c-4672-439c-a6e6-0b07306b07fb', 'fb8e4f1d-f459-47af-ae16-0205f1ef2540']
Dec 06 10:21:53 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:53.252 2 INFO neutron.agent.securitygroups_rpc [None req-e01a41a9-2c25-4070-a80a-0e758300cea8 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['398ea86c-4672-439c-a6e6-0b07306b07fb', 'fb8e4f1d-f459-47af-ae16-0205f1ef2540']
Dec 06 10:21:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-52e547fa5f3d6bb9032a781c393b7a000c05bacb0cae79cc2c74410ca876cd23-merged.mount: Deactivated successfully.
Dec 06 10:21:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "format": "json"}]: dispatch
Dec 06 10:21:53 np0005548788.localdomain ceph-mon[293643]: osdmap e190: 6 total, 6 up, 6 in
Dec 06 10:21:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:53.622 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:54 np0005548788.localdomain podman[324583]: 
Dec 06 10:21:54 np0005548788.localdomain podman[324583]: 2025-12-06 10:21:54.498232667 +0000 UTC m=+0.094390404 container create 4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:21:54 np0005548788.localdomain ceph-mon[293643]: pgmap v404: 177 pgs: 177 active+clean; 194 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 37 KiB/s wr, 53 op/s
Dec 06 10:21:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:21:54 np0005548788.localdomain systemd[1]: Started libpod-conmon-4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd.scope.
Dec 06 10:21:54 np0005548788.localdomain podman[324583]: 2025-12-06 10:21:54.450280541 +0000 UTC m=+0.046438358 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:54 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:54 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a20159e9fb0880cc0971fe1e45b7c6de6dc5aa8a2e906976f3d64e4fbd0d9f28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:54 np0005548788.localdomain podman[324583]: 2025-12-06 10:21:54.584396766 +0000 UTC m=+0.180554513 container init 4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:54 np0005548788.localdomain dnsmasq[324612]: started, version 2.85 cachesize 150
Dec 06 10:21:54 np0005548788.localdomain dnsmasq[324612]: DNS service limited to local subnets
Dec 06 10:21:54 np0005548788.localdomain dnsmasq[324612]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:54 np0005548788.localdomain dnsmasq[324612]: warning: no upstream servers configured
Dec 06 10:21:54 np0005548788.localdomain dnsmasq-dhcp[324612]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:54 np0005548788.localdomain dnsmasq-dhcp[324612]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d
Dec 06 10:21:54 np0005548788.localdomain dnsmasq-dhcp[324612]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:21:54 np0005548788.localdomain dnsmasq[324612]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/addn_hosts - 1 addresses
Dec 06 10:21:54 np0005548788.localdomain dnsmasq-dhcp[324612]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/host
Dec 06 10:21:54 np0005548788.localdomain dnsmasq-dhcp[324612]: read /var/lib/neutron/dhcp/8f6f0031-6aba-4497-bf55-cc7008fa0e7e/opts
Dec 06 10:21:54 np0005548788.localdomain podman[324596]: 2025-12-06 10:21:54.626293774 +0000 UTC m=+0.087822881 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:21:54 np0005548788.localdomain podman[324596]: 2025-12-06 10:21:54.638844053 +0000 UTC m=+0.100373210 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:54 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:21:54 np0005548788.localdomain podman[324583]: 2025-12-06 10:21:54.694985542 +0000 UTC m=+0.291143279 container start 4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:21:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:54.980 262572 INFO neutron.agent.dhcp.agent [None req-66a1d1b9-d3c5-4fb3-a3f6-44e6c162a185 - - - - - -] DHCP configuration for ports {'f92ead08-1ccb-4037-aa51-866ea98cc833', 'e3d26bb5-a627-44c1-951d-86a7aa4ab371', '377c93f2-4b77-4cc8-942d-efb146acede8', '072a7b4d-e9ac-46d1-89e3-9b9adc978b9b'} is completed
Dec 06 10:21:55 np0005548788.localdomain dnsmasq[324612]: exiting on receipt of SIGTERM
Dec 06 10:21:55 np0005548788.localdomain podman[324637]: 2025-12-06 10:21:55.043055883 +0000 UTC m=+0.062227058 container kill 4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:55 np0005548788.localdomain systemd[1]: libpod-4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd.scope: Deactivated successfully.
Dec 06 10:21:55 np0005548788.localdomain podman[324652]: 2025-12-06 10:21:55.114616919 +0000 UTC m=+0.051579108 container died 4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:21:55 np0005548788.localdomain podman[324652]: 2025-12-06 10:21:55.161796031 +0000 UTC m=+0.098758170 container remove 4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f6f0031-6aba-4497-bf55-cc7008fa0e7e, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 10:21:55 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:55Z|00259|binding|INFO|Releasing lport f92ead08-1ccb-4037-aa51-866ea98cc833 from this chassis (sb_readonly=0)
Dec 06 10:21:55 np0005548788.localdomain kernel: device tapf92ead08-1c left promiscuous mode
Dec 06 10:21:55 np0005548788.localdomain systemd[1]: libpod-conmon-4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd.scope: Deactivated successfully.
Dec 06 10:21:55 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:21:55Z|00260|binding|INFO|Setting lport f92ead08-1ccb-4037-aa51-866ea98cc833 down in Southbound
Dec 06 10:21:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:55.221 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:55.231 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8:0:2::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-8f6f0031-6aba-4497-bf55-cc7008fa0e7e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f6f0031-6aba-4497-bf55-cc7008fa0e7e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8baea40-88ff-45fd-b627-b6a55f59f6f6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=f92ead08-1ccb-4037-aa51-866ea98cc833) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:55.233 159620 INFO neutron.agent.ovn.metadata.agent [-] Port f92ead08-1ccb-4037-aa51-866ea98cc833 in datapath 8f6f0031-6aba-4497-bf55-cc7008fa0e7e unbound from our chassis
Dec 06 10:21:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:55.235 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f6f0031-6aba-4497-bf55-cc7008fa0e7e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:21:55.237 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c45def-eafe-4759-9937-2295d3c5cf22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:55.246 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:55 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:21:55.378 2 INFO neutron.agent.securitygroups_rpc [None req-41969300-114c-4f41-96fa-3fc73bdf2a0b a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:55 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:55.441 262572 INFO neutron.agent.dhcp.agent [None req-3cb4fc51-9af0-4605-ad73-27dc0dd193fb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:55 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:55.442 262572 INFO neutron.agent.dhcp.agent [None req-3cb4fc51-9af0-4605-ad73-27dc0dd193fb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:55 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:55.442 262572 INFO neutron.agent.dhcp.agent [None req-3cb4fc51-9af0-4605-ad73-27dc0dd193fb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-a20159e9fb0880cc0971fe1e45b7c6de6dc5aa8a2e906976f3d64e4fbd0d9f28-merged.mount: Deactivated successfully.
Dec 06 10:21:55 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ab38909352a70aca01396e91b98ba1dd5a00a78a948f4be10058d2ff9918ddd-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:55 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d8f6f0031\x2d6aba\x2d4497\x2dbf55\x2dcc7008fa0e7e.mount: Deactivated successfully.
Dec 06 10:21:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1393965120' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1393965120' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:55 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "67d6cee7-d14d-4a25-bb5e-de16ccdb07ed", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:55 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:55 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:21:55.742 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:55.910 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:56.005 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:21:56 np0005548788.localdomain podman[324677]: 2025-12-06 10:21:56.228992558 +0000 UTC m=+0.061839387 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:21:56 np0005548788.localdomain podman[324677]: 2025-12-06 10:21:56.232460155 +0000 UTC m=+0.065307014 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:21:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:21:56 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:21:56 np0005548788.localdomain podman[324695]: 2025-12-06 10:21:56.345743294 +0000 UTC m=+0.075430197 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:21:56 np0005548788.localdomain podman[324695]: 2025-12-06 10:21:56.358632783 +0000 UTC m=+0.088319686 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:21:56 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:21:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "67d6cee7-d14d-4a25-bb5e-de16ccdb07ed", "format": "json"}]: dispatch
Dec 06 10:21:56 np0005548788.localdomain ceph-mon[293643]: pgmap v405: 177 pgs: 177 active+clean; 194 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 28 KiB/s wr, 40 op/s
Dec 06 10:21:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "snap_name": "80ddbc86-47b7-40f0-8588-929d2cc3feb6", "format": "json"}]: dispatch
Dec 06 10:21:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/880620488' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/880620488' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e190 do_prune osdmap full prune enabled
Dec 06 10:21:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e191 e191: 6 total, 6 up, 6 in
Dec 06 10:21:57 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e191: 6 total, 6 up, 6 in
Dec 06 10:21:58 np0005548788.localdomain ceph-mon[293643]: pgmap v406: 177 pgs: 177 active+clean; 194 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 10 KiB/s wr, 33 op/s
Dec 06 10:21:58 np0005548788.localdomain ceph-mon[293643]: osdmap e191: 6 total, 6 up, 6 in
Dec 06 10:21:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:21:58.626 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "67d6cee7-d14d-4a25-bb5e-de16ccdb07ed", "format": "json"}]: dispatch
Dec 06 10:21:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "67d6cee7-d14d-4a25-bb5e-de16ccdb07ed", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:21:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e191 do_prune osdmap full prune enabled
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "snap_name": "80ddbc86-47b7-40f0-8588-929d2cc3feb6_17d9a7bb-1ff4-4445-bd23-f0d930d9f87b", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "snap_name": "80ddbc86-47b7-40f0-8588-929d2cc3feb6", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: pgmap v408: 177 pgs: 177 active+clean; 282 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 11 MiB/s wr, 105 op/s
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0512b167-8d53-4c3b-a204-1d35afbfc3ad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0512b167-8d53-4c3b-a204-1d35afbfc3ad", "format": "json"}]: dispatch
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e192 e192: 6 total, 6 up, 6 in
Dec 06 10:22:00 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e192: 6 total, 6 up, 6 in
Dec 06 10:22:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:00.917 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:01 np0005548788.localdomain sshd[324718]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:22:01 np0005548788.localdomain sshd[324718]: error: kex_exchange_identification: client sent invalid protocol identifier ""
Dec 06 10:22:01 np0005548788.localdomain sshd[324718]: banner exchange: Connection from 3.131.215.38 port 34728: invalid format
Dec 06 10:22:01 np0005548788.localdomain sudo[324719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:22:01 np0005548788.localdomain sudo[324719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:01 np0005548788.localdomain sudo[324719]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:01 np0005548788.localdomain sudo[324737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:22:01 np0005548788.localdomain sudo[324737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:01 np0005548788.localdomain ceph-mon[293643]: osdmap e192: 6 total, 6 up, 6 in
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.377005) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522377079, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2187, "num_deletes": 261, "total_data_size": 2940617, "memory_usage": 3059000, "flush_reason": "Manual Compaction"}
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522392330, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2373234, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30439, "largest_seqno": 32625, "table_properties": {"data_size": 2365392, "index_size": 4347, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 21482, "raw_average_key_size": 22, "raw_value_size": 2347822, "raw_average_value_size": 2443, "num_data_blocks": 187, "num_entries": 961, "num_filter_entries": 961, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016395, "oldest_key_time": 1765016395, "file_creation_time": 1765016522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 15370 microseconds, and 7169 cpu microseconds.
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.392384) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2373234 bytes OK
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.392410) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.394298) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.394324) EVENT_LOG_v1 {"time_micros": 1765016522394316, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.394348) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 2930963, prev total WAL file size 2931287, number of live WAL files 2.
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.395595) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303034' seq:72057594037927935, type:22 .. '6D6772737461740034323536' seq:0, type:0; will stop at (end)
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2317KB)], [54(17MB)]
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522395666, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 21195694, "oldest_snapshot_seqno": -1}
Dec 06 10:22:02 np0005548788.localdomain podman[324828]: 2025-12-06 10:22:02.475284227 +0000 UTC m=+0.103808876 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=)
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 13318 keys, 19558153 bytes, temperature: kUnknown
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522512710, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 19558153, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19482884, "index_size": 40864, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33349, "raw_key_size": 355670, "raw_average_key_size": 26, "raw_value_size": 19257347, "raw_average_value_size": 1445, "num_data_blocks": 1544, "num_entries": 13318, "num_filter_entries": 13318, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.513326) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 19558153 bytes
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.515548) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.6 rd, 166.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 18.0 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(17.2) write-amplify(8.2) OK, records in: 13796, records dropped: 478 output_compression: NoCompression
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.515580) EVENT_LOG_v1 {"time_micros": 1765016522515565, "job": 32, "event": "compaction_finished", "compaction_time_micros": 117336, "compaction_time_cpu_micros": 60019, "output_level": 6, "num_output_files": 1, "total_output_size": 19558153, "num_input_records": 13796, "num_output_records": 13318, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522516582, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522519739, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.395438) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.519917) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.519926) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.519929) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.519933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:02.519936) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548788.localdomain podman[324828]: 2025-12-06 10:22:02.623911201 +0000 UTC m=+0.252435860 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-type=git, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: pgmap v410: 177 pgs: 177 active+clean; 282 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 11 MiB/s wr, 68 op/s
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1c1b8ab9-c6b5-4cb1-94fb-5001df83d93e", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1c1b8ab9-c6b5-4cb1-94fb-5001df83d93e", "format": "json"}]: dispatch
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "format": "json"}]: dispatch
Dec 06 10:22:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:03 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:03.202 262572 INFO neutron.agent.linux.ip_lib [None req-da7a394e-fa43-4659-b568-b6369f064c2d - - - - - -] Device tap995d2357-78 cannot be used as it has no MAC address
Dec 06 10:22:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:03.224 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:03 np0005548788.localdomain kernel: device tap995d2357-78 entered promiscuous mode
Dec 06 10:22:03 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016523.2317] manager: (tap995d2357-78): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Dec 06 10:22:03 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:03Z|00261|binding|INFO|Claiming lport 995d2357-7832-4bbb-8257-ec295217d70d for this chassis.
Dec 06 10:22:03 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:03Z|00262|binding|INFO|995d2357-7832-4bbb-8257-ec295217d70d: Claiming unknown
Dec 06 10:22:03 np0005548788.localdomain systemd-udevd[324954]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:22:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:03.236 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:03 np0005548788.localdomain sudo[324737]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:03 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap995d2357-78: No such device
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:22:03 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap995d2357-78: No such device
Dec 06 10:22:03 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:03Z|00263|binding|INFO|Setting lport 995d2357-7832-4bbb-8257-ec295217d70d ovn-installed in OVS
Dec 06 10:22:03 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap995d2357-78: No such device
Dec 06 10:22:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:03.276 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:03 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap995d2357-78: No such device
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:03 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap995d2357-78: No such device
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:22:03 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap995d2357-78: No such device
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:22:03 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap995d2357-78: No such device
Dec 06 10:22:03 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap995d2357-78: No such device
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:22:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:03.313 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:03 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:03Z|00264|binding|INFO|Setting lport 995d2357-7832-4bbb-8257-ec295217d70d up in Southbound
Dec 06 10:22:03 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:03.336 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e30123e-2eb0-49cc-904b-5f16b0ffade9, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=995d2357-7832-4bbb-8257-ec295217d70d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:03 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:03.338 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 995d2357-7832-4bbb-8257-ec295217d70d in datapath 14c1b0b6-2513-481e-b567-f9b0cfa9ae0d bound to our chassis
Dec 06 10:22:03 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:03.341 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port c6cb025d-fafb-4672-ae61-61453fa96d87 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:22:03 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:03.341 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:22:03 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:03.343 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f31646-31e5-4e42-b15b-905d6ff0b739]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:03.348 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:03 np0005548788.localdomain sudo[324978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:22:03 np0005548788.localdomain sudo[324978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:03 np0005548788.localdomain sudo[324978]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:03 np0005548788.localdomain sudo[325000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:22:03 np0005548788.localdomain sudo[325000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:03.628 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:22:04 np0005548788.localdomain sudo[325000]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: pgmap v411: 177 pgs: 177 active+clean; 394 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 59 KiB/s rd, 25 MiB/s wr, 92 op/s
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:22:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548788.localdomain podman[325093]: 
Dec 06 10:22:04 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:04.322 262572 INFO neutron.agent.linux.ip_lib [None req-940365a4-c358-43c8-ac78-af14234f0483 - - - - - -] Device tap13ac79d0-81 cannot be used as it has no MAC address
Dec 06 10:22:04 np0005548788.localdomain sudo[325103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:22:04 np0005548788.localdomain podman[325093]: 2025-12-06 10:22:04.33912817 +0000 UTC m=+0.096181260 container create 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:22:04 np0005548788.localdomain sudo[325103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:04 np0005548788.localdomain sudo[325103]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:04.365 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:04 np0005548788.localdomain kernel: device tap13ac79d0-81 entered promiscuous mode
Dec 06 10:22:04 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016524.3755] manager: (tap13ac79d0-81): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Dec 06 10:22:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:04.374 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:04 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:04Z|00265|binding|INFO|Claiming lport 13ac79d0-81a1-4e9d-a090-1dc4be832223 for this chassis.
Dec 06 10:22:04 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:04Z|00266|binding|INFO|13ac79d0-81a1-4e9d-a090-1dc4be832223: Claiming unknown
Dec 06 10:22:04 np0005548788.localdomain podman[325093]: 2025-12-06 10:22:04.282828026 +0000 UTC m=+0.039881136 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:22:04 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:04.393 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-1c1b5fa9-2c40-4a7f-9138-79f42b49a545', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c1b5fa9-2c40-4a7f-9138-79f42b49a545', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4daafaf0e264da6a728bdd60c5d6377', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50becbcf-4258-41c3-8a76-24f257b0d832, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=13ac79d0-81a1-4e9d-a090-1dc4be832223) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:04 np0005548788.localdomain systemd[1]: Started libpod-conmon-71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89.scope.
Dec 06 10:22:04 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:04.396 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 13ac79d0-81a1-4e9d-a090-1dc4be832223 in datapath 1c1b5fa9-2c40-4a7f-9138-79f42b49a545 bound to our chassis
Dec 06 10:22:04 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:04.398 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1c1b5fa9-2c40-4a7f-9138-79f42b49a545 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:22:04 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:04.399 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[e7acc6e7-a87d-4a6c-9301-729965996f30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:04 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap13ac79d0-81: No such device
Dec 06 10:22:04 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:22:04 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap13ac79d0-81: No such device
Dec 06 10:22:04 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa63aa6ead9adde18ed66105b1a7ccd465f50c4de5378c4bb0c2b989314f5b94/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:22:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:04.423 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:04 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:04Z|00267|binding|INFO|Setting lport 13ac79d0-81a1-4e9d-a090-1dc4be832223 ovn-installed in OVS
Dec 06 10:22:04 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:04Z|00268|binding|INFO|Setting lport 13ac79d0-81a1-4e9d-a090-1dc4be832223 up in Southbound
Dec 06 10:22:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:04.428 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:04 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap13ac79d0-81: No such device
Dec 06 10:22:04 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap13ac79d0-81: No such device
Dec 06 10:22:04 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap13ac79d0-81: No such device
Dec 06 10:22:04 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap13ac79d0-81: No such device
Dec 06 10:22:04 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap13ac79d0-81: No such device
Dec 06 10:22:04 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap13ac79d0-81: No such device
Dec 06 10:22:04 np0005548788.localdomain podman[325093]: 2025-12-06 10:22:04.450718907 +0000 UTC m=+0.207771987 container init 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:22:04 np0005548788.localdomain podman[325093]: 2025-12-06 10:22:04.460405847 +0000 UTC m=+0.217458927 container start 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:22:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:04.459 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:04 np0005548788.localdomain dnsmasq[325158]: started, version 2.85 cachesize 150
Dec 06 10:22:04 np0005548788.localdomain dnsmasq[325158]: DNS service limited to local subnets
Dec 06 10:22:04 np0005548788.localdomain dnsmasq[325158]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:22:04 np0005548788.localdomain dnsmasq[325158]: warning: no upstream servers configured
Dec 06 10:22:04 np0005548788.localdomain dnsmasq-dhcp[325158]: DHCP, static leases only on 10.102.0.0, lease time 1d
Dec 06 10:22:04 np0005548788.localdomain dnsmasq[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/addn_hosts - 0 addresses
Dec 06 10:22:04 np0005548788.localdomain dnsmasq-dhcp[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/host
Dec 06 10:22:04 np0005548788.localdomain dnsmasq-dhcp[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/opts
Dec 06 10:22:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:04.490 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:04 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:04.521 262572 INFO neutron.agent.dhcp.agent [None req-b56c6f7b-e803-4a4b-80ad-3f2a0753d645 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:02Z, description=, device_id=2b50982b-9c62-4665-9d93-c7bc12ad6e52, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6791cd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6791df0>], id=07396789-cd61-4b90-85b0-9158a9476843, ip_allocation=immediate, mac_address=fa:16:3e:99:fe:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:59Z, description=, dns_domain=, id=14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-318220047, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40704, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3051, status=ACTIVE, subnets=['65cdf3b0-98f1-4dce-97e0-e5fecee182c6'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:22:00Z, vlan_transparent=None, network_id=14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, port_security_enabled=False, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3076, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:22:02Z on network 14c1b0b6-2513-481e-b567-f9b0cfa9ae0d
Dec 06 10:22:04 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:04.597 262572 INFO neutron.agent.dhcp.agent [None req-f3e49443-0ed4-44e6-9a34-213ce6318aa0 - - - - - -] DHCP configuration for ports {'400b8cd1-2739-4f96-bb6b-6f22d996ea66'} is completed
Dec 06 10:22:04 np0005548788.localdomain dnsmasq[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/addn_hosts - 1 addresses
Dec 06 10:22:04 np0005548788.localdomain dnsmasq-dhcp[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/host
Dec 06 10:22:04 np0005548788.localdomain dnsmasq-dhcp[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/opts
Dec 06 10:22:04 np0005548788.localdomain podman[325184]: 2025-12-06 10:22:04.791922666 +0000 UTC m=+0.065626575 container kill 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:22:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:05.072 262572 INFO neutron.agent.dhcp.agent [None req-d9e5c31a-9ce6-4324-88e2-c232adb9f12b - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:02Z, description=, device_id=2b50982b-9c62-4665-9d93-c7bc12ad6e52, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67ca160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67ada30>], id=07396789-cd61-4b90-85b0-9158a9476843, ip_allocation=immediate, mac_address=fa:16:3e:99:fe:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:59Z, description=, dns_domain=, id=14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-318220047, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40704, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3051, status=ACTIVE, subnets=['65cdf3b0-98f1-4dce-97e0-e5fecee182c6'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:22:00Z, vlan_transparent=None, network_id=14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, port_security_enabled=False, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3076, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:22:02Z on network 14c1b0b6-2513-481e-b567-f9b0cfa9ae0d
Dec 06 10:22:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:05.181 262572 INFO neutron.agent.dhcp.agent [None req-0542caec-a2e9-4d20-a922-09d05864ddf0 - - - - - -] DHCP configuration for ports {'07396789-cd61-4b90-85b0-9158a9476843'} is completed
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0512b167-8d53-4c3b-a204-1d35afbfc3ad", "format": "json"}]: dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0512b167-8d53-4c3b-a204-1d35afbfc3ad", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain podman[325235]: 2025-12-06 10:22:05.319749685 +0000 UTC m=+0.057232953 container kill 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:22:05 np0005548788.localdomain dnsmasq[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/addn_hosts - 1 addresses
Dec 06 10:22:05 np0005548788.localdomain dnsmasq-dhcp[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/host
Dec 06 10:22:05 np0005548788.localdomain dnsmasq-dhcp[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/opts
Dec 06 10:22:05 np0005548788.localdomain podman[325273]: 
Dec 06 10:22:05 np0005548788.localdomain podman[325273]: 2025-12-06 10:22:05.512877337 +0000 UTC m=+0.093952750 container create 02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1c1b5fa9-2c40-4a7f-9138-79f42b49a545, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:22:05 np0005548788.localdomain systemd[1]: Started libpod-conmon-02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed.scope.
Dec 06 10:22:05 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:22:05 np0005548788.localdomain podman[325273]: 2025-12-06 10:22:05.469279207 +0000 UTC m=+0.050354670 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:22:05 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f7709b627f8d9985c22fdfa3dfda3c0c6a2d24ce1669476a5a54438984d0391/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:22:05 np0005548788.localdomain podman[325273]: 2025-12-06 10:22:05.579631655 +0000 UTC m=+0.160707098 container init 02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1c1b5fa9-2c40-4a7f-9138-79f42b49a545, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:05 np0005548788.localdomain podman[325273]: 2025-12-06 10:22:05.589141089 +0000 UTC m=+0.170216522 container start 02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1c1b5fa9-2c40-4a7f-9138-79f42b49a545, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:22:05 np0005548788.localdomain dnsmasq[325295]: started, version 2.85 cachesize 150
Dec 06 10:22:05 np0005548788.localdomain dnsmasq[325295]: DNS service limited to local subnets
Dec 06 10:22:05 np0005548788.localdomain dnsmasq[325295]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:22:05 np0005548788.localdomain dnsmasq[325295]: warning: no upstream servers configured
Dec 06 10:22:05 np0005548788.localdomain dnsmasq-dhcp[325295]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:22:05 np0005548788.localdomain dnsmasq[325295]: read /var/lib/neutron/dhcp/1c1b5fa9-2c40-4a7f-9138-79f42b49a545/addn_hosts - 0 addresses
Dec 06 10:22:05 np0005548788.localdomain dnsmasq-dhcp[325295]: read /var/lib/neutron/dhcp/1c1b5fa9-2c40-4a7f-9138-79f42b49a545/host
Dec 06 10:22:05 np0005548788.localdomain dnsmasq-dhcp[325295]: read /var/lib/neutron/dhcp/1c1b5fa9-2c40-4a7f-9138-79f42b49a545/opts
Dec 06 10:22:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:05.657 262572 INFO neutron.agent.dhcp.agent [None req-6f906edd-140f-4fbd-8e58-bbcf95019ec1 - - - - - -] DHCP configuration for ports {'07396789-cd61-4b90-85b0-9158a9476843'} is completed
Dec 06 10:22:05 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:05.815 262572 INFO neutron.agent.dhcp.agent [None req-0bfc826d-ab76-4ba9-87c5-e30f68af6bd7 - - - - - -] DHCP configuration for ports {'aa776f53-53cd-4d3b-82c3-bbd31efdb739'} is completed
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:05 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:05.946 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:06 np0005548788.localdomain ceph-mon[293643]: pgmap v412: 177 pgs: 177 active+clean; 394 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 59 KiB/s rd, 25 MiB/s wr, 92 op/s
Dec 06 10:22:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1c1b8ab9-c6b5-4cb1-94fb-5001df83d93e", "format": "json"}]: dispatch
Dec 06 10:22:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1c1b8ab9-c6b5-4cb1-94fb-5001df83d93e", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4a25353d-60c6-4e9f-96d0-37cd3eb987b5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4a25353d-60c6-4e9f-96d0-37cd3eb987b5", "format": "json"}]: dispatch
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e192 do_prune osdmap full prune enabled
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e193 e193: 6 total, 6 up, 6 in
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e193: 6 total, 6 up, 6 in
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.403781) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527404219, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 429, "num_deletes": 253, "total_data_size": 272250, "memory_usage": 281832, "flush_reason": "Manual Compaction"}
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527408859, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 270596, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 32626, "largest_seqno": 33054, "table_properties": {"data_size": 267814, "index_size": 765, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6684, "raw_average_key_size": 18, "raw_value_size": 261976, "raw_average_value_size": 727, "num_data_blocks": 29, "num_entries": 360, "num_filter_entries": 360, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016522, "oldest_key_time": 1765016522, "file_creation_time": 1765016527, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 4897 microseconds, and 1937 cpu microseconds.
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.408924) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 270596 bytes OK
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.408949) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.410744) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.410767) EVENT_LOG_v1 {"time_micros": 1765016527410760, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.410793) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 269450, prev total WAL file size 269450, number of live WAL files 2.
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.411735) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353336' seq:72057594037927935, type:22 .. '6B760031373930' seq:0, type:0; will stop at (end)
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(264KB)], [57(18MB)]
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527411801, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 19828749, "oldest_snapshot_seqno": -1}
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13146 keys, 18733152 bytes, temperature: kUnknown
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527512880, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 18733152, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18660090, "index_size": 39105, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32901, "raw_key_size": 353794, "raw_average_key_size": 26, "raw_value_size": 18438291, "raw_average_value_size": 1402, "num_data_blocks": 1451, "num_entries": 13146, "num_filter_entries": 13146, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016527, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.513244) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 18733152 bytes
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.516137) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 196.0 rd, 185.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 18.7 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(142.5) write-amplify(69.2) OK, records in: 13678, records dropped: 532 output_compression: NoCompression
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.516159) EVENT_LOG_v1 {"time_micros": 1765016527516148, "job": 34, "event": "compaction_finished", "compaction_time_micros": 101172, "compaction_time_cpu_micros": 54057, "output_level": 6, "num_output_files": 1, "total_output_size": 18733152, "num_input_records": 13678, "num_output_records": 13146, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527516367, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527518027, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.411609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.518127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.518138) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.518142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.518145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:07.518148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:08 np0005548788.localdomain ceph-mon[293643]: pgmap v413: 177 pgs: 177 active+clean; 394 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 48 KiB/s rd, 20 MiB/s wr, 74 op/s
Dec 06 10:22:08 np0005548788.localdomain ceph-mon[293643]: osdmap e193: 6 total, 6 up, 6 in
Dec 06 10:22:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:08.661 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:22:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:22:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:22:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:22:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:22:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:09 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:10.149 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:10 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:22:10 np0005548788.localdomain systemd[1]: tmp-crun.UkRmxk.mount: Deactivated successfully.
Dec 06 10:22:10 np0005548788.localdomain podman[325296]: 2025-12-06 10:22:10.275809031 +0000 UTC m=+0.094069355 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 06 10:22:10 np0005548788.localdomain podman[325296]: 2025-12-06 10:22:10.319746092 +0000 UTC m=+0.138006456 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:22:10 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:22:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4c1f4920-d346-409d-af49-370c9b85e205", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4c1f4920-d346-409d-af49-370c9b85e205", "format": "json"}]: dispatch
Dec 06 10:22:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4a25353d-60c6-4e9f-96d0-37cd3eb987b5", "format": "json"}]: dispatch
Dec 06 10:22:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4a25353d-60c6-4e9f-96d0-37cd3eb987b5", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:10 np0005548788.localdomain ceph-mon[293643]: pgmap v415: 177 pgs: 177 active+clean; 514 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 22 KiB/s rd, 27 MiB/s wr, 44 op/s
Dec 06 10:22:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:10.950 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:12 np0005548788.localdomain ceph-mon[293643]: pgmap v416: 177 pgs: 177 active+clean; 514 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 19 KiB/s rd, 23 MiB/s wr, 37 op/s
Dec 06 10:22:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:13 np0005548788.localdomain dnsmasq[325295]: exiting on receipt of SIGTERM
Dec 06 10:22:13 np0005548788.localdomain podman[325336]: 2025-12-06 10:22:13.188641847 +0000 UTC m=+0.071342770 container kill 02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1c1b5fa9-2c40-4a7f-9138-79f42b49a545, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: libpod-02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed.scope: Deactivated successfully.
Dec 06 10:22:13 np0005548788.localdomain podman[325348]: 2025-12-06 10:22:13.268979436 +0000 UTC m=+0.060390722 container died 02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1c1b5fa9-2c40-4a7f-9138-79f42b49a545, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed-userdata-shm.mount: Deactivated successfully.
Dec 06 10:22:13 np0005548788.localdomain podman[325348]: 2025-12-06 10:22:13.304082243 +0000 UTC m=+0.095493499 container cleanup 02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1c1b5fa9-2c40-4a7f-9138-79f42b49a545, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: libpod-conmon-02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed.scope: Deactivated successfully.
Dec 06 10:22:13 np0005548788.localdomain podman[325350]: 2025-12-06 10:22:13.352324768 +0000 UTC m=+0.139875214 container remove 02bea6808c7ec6053d1612774ca18f5c16df06a2cb08c31097f7d8f19be701ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1c1b5fa9-2c40-4a7f-9138-79f42b49a545, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:22:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:13.353 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:13.353 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:13.355 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:22:13 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:13Z|00269|binding|INFO|Releasing lport 13ac79d0-81a1-4e9d-a090-1dc4be832223 from this chassis (sb_readonly=1)
Dec 06 10:22:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:13.365 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:13 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:13Z|00270|if_status|INFO|Not setting lport 13ac79d0-81a1-4e9d-a090-1dc4be832223 down as sb is readonly
Dec 06 10:22:13 np0005548788.localdomain kernel: device tap13ac79d0-81 left promiscuous mode
Dec 06 10:22:13 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:13Z|00271|binding|INFO|Setting lport 13ac79d0-81a1-4e9d-a090-1dc4be832223 down in Southbound
Dec 06 10:22:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:13.379 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-1c1b5fa9-2c40-4a7f-9138-79f42b49a545', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c1b5fa9-2c40-4a7f-9138-79f42b49a545', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4daafaf0e264da6a728bdd60c5d6377', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50becbcf-4258-41c3-8a76-24f257b0d832, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=13ac79d0-81a1-4e9d-a090-1dc4be832223) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:13.381 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 13ac79d0-81a1-4e9d-a090-1dc4be832223 in datapath 1c1b5fa9-2c40-4a7f-9138-79f42b49a545 unbound from our chassis
Dec 06 10:22:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:13.383 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1c1b5fa9-2c40-4a7f-9138-79f42b49a545 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:22:13 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:13.384 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[e7526455-9f7b-4e71-a023-db7c9efbb18f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:13.391 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4c1f4920-d346-409d-af49-370c9b85e205", "format": "json"}]: dispatch
Dec 06 10:22:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4c1f4920-d346-409d-af49-370c9b85e205", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "format": "json"}]: dispatch
Dec 06 10:22:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:22:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:13.663 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:13 np0005548788.localdomain podman[325379]: 2025-12-06 10:22:13.69332359 +0000 UTC m=+0.087744179 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:13 np0005548788.localdomain podman[325379]: 2025-12-06 10:22:13.705820077 +0000 UTC m=+0.100240616 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:22:13 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:13.794 262572 INFO neutron.agent.dhcp.agent [None req-c93d6439-ec32-4074-b377-4f836b545d41 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:13 np0005548788.localdomain podman[325381]: 2025-12-06 10:22:13.801028206 +0000 UTC m=+0.188316844 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Dec 06 10:22:13 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:13.848 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:13 np0005548788.localdomain podman[325380]: 2025-12-06 10:22:13.855189024 +0000 UTC m=+0.246286580 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:22:13 np0005548788.localdomain podman[325381]: 2025-12-06 10:22:13.867268908 +0000 UTC m=+0.254557566 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350)
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:22:13 np0005548788.localdomain podman[325380]: 2025-12-06 10:22:13.920721524 +0000 UTC m=+0.311819110 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:22:13 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:22:14 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:14.064 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:14 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-1f7709b627f8d9985c22fdfa3dfda3c0c6a2d24ce1669476a5a54438984d0391-merged.mount: Deactivated successfully.
Dec 06 10:22:14 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d1c1b5fa9\x2d2c40\x2d4a7f\x2d9138\x2d79f42b49a545.mount: Deactivated successfully.
Dec 06 10:22:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:14.291 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:14 np0005548788.localdomain ceph-mon[293643]: pgmap v417: 177 pgs: 177 active+clean; 642 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 18 KiB/s rd, 25 MiB/s wr, 36 op/s
Dec 06 10:22:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:15.952 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: pgmap v418: 177 pgs: 177 active+clean; 642 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 18 KiB/s rd, 25 MiB/s wr, 36 op/s
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e4229277-1c8f-4f16-a3c9-ae90dd92f437", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e4229277-1c8f-4f16-a3c9-ae90dd92f437", "format": "json"}]: dispatch
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.572682) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536572959, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 362, "num_deletes": 251, "total_data_size": 128852, "memory_usage": 136552, "flush_reason": "Manual Compaction"}
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536577674, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 125501, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33055, "largest_seqno": 33416, "table_properties": {"data_size": 123249, "index_size": 363, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6109, "raw_average_key_size": 19, "raw_value_size": 118707, "raw_average_value_size": 384, "num_data_blocks": 16, "num_entries": 309, "num_filter_entries": 309, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016528, "oldest_key_time": 1765016528, "file_creation_time": 1765016536, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 5337 microseconds, and 2300 cpu microseconds.
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.578032) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 125501 bytes OK
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.578165) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.605493) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.605552) EVENT_LOG_v1 {"time_micros": 1765016536605540, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.605584) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 126415, prev total WAL file size 126415, number of live WAL files 2.
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.607220) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(122KB)], [60(17MB)]
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536607264, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 18858653, "oldest_snapshot_seqno": -1}
Dec 06 10:22:16 np0005548788.localdomain dnsmasq[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/addn_hosts - 0 addresses
Dec 06 10:22:16 np0005548788.localdomain dnsmasq-dhcp[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/host
Dec 06 10:22:16 np0005548788.localdomain podman[325455]: 2025-12-06 10:22:16.685484493 +0000 UTC m=+0.062165676 container kill 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:22:16 np0005548788.localdomain dnsmasq-dhcp[325158]: read /var/lib/neutron/dhcp/14c1b0b6-2513-481e-b567-f9b0cfa9ae0d/opts
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 12940 keys, 17664140 bytes, temperature: kUnknown
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536704777, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 17664140, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17593696, "index_size": 36999, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32389, "raw_key_size": 350035, "raw_average_key_size": 27, "raw_value_size": 17376773, "raw_average_value_size": 1342, "num_data_blocks": 1359, "num_entries": 12940, "num_filter_entries": 12940, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016536, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.705170) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 17664140 bytes
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.706853) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.2 rd, 181.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 17.9 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(291.0) write-amplify(140.7) OK, records in: 13455, records dropped: 515 output_compression: NoCompression
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.706885) EVENT_LOG_v1 {"time_micros": 1765016536706871, "job": 36, "event": "compaction_finished", "compaction_time_micros": 97617, "compaction_time_cpu_micros": 51284, "output_level": 6, "num_output_files": 1, "total_output_size": 17664140, "num_input_records": 13455, "num_output_records": 12940, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536707057, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536710033, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.607113) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.710174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.710184) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.710187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.710214) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:22:16.710218) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:16Z|00272|binding|INFO|Releasing lport 995d2357-7832-4bbb-8257-ec295217d70d from this chassis (sb_readonly=0)
Dec 06 10:22:16 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:16Z|00273|binding|INFO|Setting lport 995d2357-7832-4bbb-8257-ec295217d70d down in Southbound
Dec 06 10:22:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:16.984 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:16 np0005548788.localdomain kernel: device tap995d2357-78 left promiscuous mode
Dec 06 10:22:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:16.992 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e30123e-2eb0-49cc-904b-5f16b0ffade9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=995d2357-7832-4bbb-8257-ec295217d70d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:16.994 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 995d2357-7832-4bbb-8257-ec295217d70d in datapath 14c1b0b6-2513-481e-b567-f9b0cfa9ae0d unbound from our chassis
Dec 06 10:22:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:16.997 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:22:16 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:16.998 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[593a569d-7075-4b18-a5e1-26ccd8e950a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:17.009 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:17 np0005548788.localdomain dnsmasq[325158]: exiting on receipt of SIGTERM
Dec 06 10:22:17 np0005548788.localdomain podman[325497]: 2025-12-06 10:22:17.521594992 +0000 UTC m=+0.074709695 container kill 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:22:17 np0005548788.localdomain systemd[1]: libpod-71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89.scope: Deactivated successfully.
Dec 06 10:22:17 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "snap_name": "14d68a54-7a75-45f3-abbd-01dbc7ace567", "format": "json"}]: dispatch
Dec 06 10:22:17 np0005548788.localdomain podman[325511]: 2025-12-06 10:22:17.604407027 +0000 UTC m=+0.065322024 container died 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:22:17 np0005548788.localdomain systemd[1]: tmp-crun.ZvhNxo.mount: Deactivated successfully.
Dec 06 10:22:17 np0005548788.localdomain podman[325511]: 2025-12-06 10:22:17.667148111 +0000 UTC m=+0.128063048 container cleanup 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:22:17 np0005548788.localdomain systemd[1]: libpod-conmon-71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89.scope: Deactivated successfully.
Dec 06 10:22:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-aa63aa6ead9adde18ed66105b1a7ccd465f50c4de5378c4bb0c2b989314f5b94-merged.mount: Deactivated successfully.
Dec 06 10:22:17 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89-userdata-shm.mount: Deactivated successfully.
Dec 06 10:22:17 np0005548788.localdomain podman[325513]: 2025-12-06 10:22:17.751655588 +0000 UTC m=+0.208157978 container remove 71527c0cd759593bd3348bba8abb4d58f6d4d97b64d865e042c8b27272b4ef89 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-14c1b0b6-2513-481e-b567-f9b0cfa9ae0d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:22:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:17.785 262572 INFO neutron.agent.dhcp.agent [None req-179fa40b-6fed-434d-9df3-90c6dacea1d5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:17 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d14c1b0b6\x2d2513\x2d481e\x2db567\x2df9b0cfa9ae0d.mount: Deactivated successfully.
Dec 06 10:22:17 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:17.786 262572 INFO neutron.agent.dhcp.agent [None req-179fa40b-6fed-434d-9df3-90c6dacea1d5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:17.986 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:18 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:18.357 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:22:18 np0005548788.localdomain ceph-mon[293643]: pgmap v419: 177 pgs: 177 active+clean; 642 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 18 KiB/s rd, 25 MiB/s wr, 36 op/s
Dec 06 10:22:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:18 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:18.708 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:22:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:22:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:22:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:22:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:22:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19215 "" "Go-http-client/1.1"
Dec 06 10:22:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "731d0819-2292-4c89-bff7-ec72ce366121", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "731d0819-2292-4c89-bff7-ec72ce366121", "format": "json"}]: dispatch
Dec 06 10:22:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e4229277-1c8f-4f16-a3c9-ae90dd92f437", "format": "json"}]: dispatch
Dec 06 10:22:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e4229277-1c8f-4f16-a3c9-ae90dd92f437", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "snap_name": "14d68a54-7a75-45f3-abbd-01dbc7ace567", "target_sub_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:20 np0005548788.localdomain ceph-mon[293643]: pgmap v420: 177 pgs: 177 active+clean; 746 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 22 KiB/s rd, 30 MiB/s wr, 45 op/s
Dec 06 10:22:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:20.986 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:21 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e51: np0005548790.kvkfyr(active, since 10m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2693859626' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2693859626' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: pgmap v421: 177 pgs: 177 active+clean; 746 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 14 KiB/s rd, 19 MiB/s wr, 28 op/s
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: mgrmap e51: np0005548790.kvkfyr(active, since 10m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2693859626' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:22 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2693859626' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:23 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:23.746 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea385aa8-7125-4234-a610-ef3ee4890f37", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea385aa8-7125-4234-a610-ef3ee4890f37", "format": "json"}]: dispatch
Dec 06 10:22:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d56ed464-2af3-4efa-8437-4562ed59da6b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d56ed464-2af3-4efa-8437-4562ed59da6b", "format": "json"}]: dispatch
Dec 06 10:22:23 np0005548788.localdomain ceph-mon[293643]: pgmap v422: 177 pgs: 177 active+clean; 867 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 21 KiB/s rd, 29 MiB/s wr, 44 op/s
Dec 06 10:22:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:22:25 np0005548788.localdomain podman[325540]: 2025-12-06 10:22:25.261696003 +0000 UTC m=+0.086723617 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:22:25 np0005548788.localdomain podman[325540]: 2025-12-06 10:22:25.301797265 +0000 UTC m=+0.126824859 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec 06 10:22:25 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:22:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:26.023 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:26 np0005548788.localdomain ceph-mon[293643]: pgmap v423: 177 pgs: 177 active+clean; 867 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 14 KiB/s rd, 19 MiB/s wr, 30 op/s
Dec 06 10:22:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea385aa8-7125-4234-a610-ef3ee4890f37", "format": "json"}]: dispatch
Dec 06 10:22:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea385aa8-7125-4234-a610-ef3ee4890f37", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d56ed464-2af3-4efa-8437-4562ed59da6b", "format": "json"}]: dispatch
Dec 06 10:22:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d56ed464-2af3-4efa-8437-4562ed59da6b", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:22:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:22:27 np0005548788.localdomain podman[325560]: 2025-12-06 10:22:27.239938369 +0000 UTC m=+0.071089182 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:22:27 np0005548788.localdomain podman[325560]: 2025-12-06 10:22:27.251348113 +0000 UTC m=+0.082498896 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:22:27 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:22:27 np0005548788.localdomain podman[325561]: 2025-12-06 10:22:27.309191285 +0000 UTC m=+0.129393409 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:22:27 np0005548788.localdomain podman[325561]: 2025-12-06 10:22:27.346800529 +0000 UTC m=+0.167002603 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec 06 10:22:27 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:22:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:28 np0005548788.localdomain ceph-mon[293643]: pgmap v424: 177 pgs: 177 active+clean; 867 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 14 KiB/s rd, 19 MiB/s wr, 30 op/s
Dec 06 10:22:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:28.791 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "731d0819-2292-4c89-bff7-ec72ce366121", "format": "json"}]: dispatch
Dec 06 10:22:29 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3455244347' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e193 do_prune osdmap full prune enabled
Dec 06 10:22:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "731d0819-2292-4c89-bff7-ec72ce366121", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:30 np0005548788.localdomain ceph-mon[293643]: pgmap v425: 177 pgs: 177 active+clean; 987 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 21 KiB/s rd, 29 MiB/s wr, 45 op/s
Dec 06 10:22:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "snap_name": "34ca82e6-47fe-4edf-98f0-b49a22c3b971_3308a422-28d2-40bc-9817-d02064ebbe3c", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "snap_name": "34ca82e6-47fe-4edf-98f0-b49a22c3b971", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e194 e194: 6 total, 6 up, 6 in
Dec 06 10:22:30 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e194: 6 total, 6 up, 6 in
Dec 06 10:22:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:31.064 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e194 do_prune osdmap full prune enabled
Dec 06 10:22:31 np0005548788.localdomain ceph-mon[293643]: osdmap e194: 6 total, 6 up, 6 in
Dec 06 10:22:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e195 e195: 6 total, 6 up, 6 in
Dec 06 10:22:31 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e195: 6 total, 6 up, 6 in
Dec 06 10:22:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:32 np0005548788.localdomain ceph-mon[293643]: pgmap v427: 177 pgs: 177 active+clean; 987 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 16 KiB/s rd, 24 MiB/s wr, 37 op/s
Dec 06 10:22:32 np0005548788.localdomain ceph-mon[293643]: osdmap e195: 6 total, 6 up, 6 in
Dec 06 10:22:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "04a68994-1285-4b19-bd78-8daa43192107", "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "format": "json"}]: dispatch
Dec 06 10:22:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:33.824 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:35 np0005548788.localdomain ceph-mon[293643]: pgmap v429: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 2.6 MiB/s rd, 30 MiB/s wr, 76 op/s
Dec 06 10:22:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:36 np0005548788.localdomain ceph-mon[293643]: pgmap v430: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 2.6 MiB/s rd, 30 MiB/s wr, 76 op/s
Dec 06 10:22:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "04a68994-1285-4b19-bd78-8daa43192107", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:36.102 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:36 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1087206490' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1087206490' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e195 do_prune osdmap full prune enabled
Dec 06 10:22:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e196 e196: 6 total, 6 up, 6 in
Dec 06 10:22:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e196: 6 total, 6 up, 6 in
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/749219856' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/749219856' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: pgmap v431: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 2.6 MiB/s rd, 15 MiB/s wr, 54 op/s
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: osdmap e196: 6 total, 6 up, 6 in
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/4090728169' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/749219856' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/749219856' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3578013769' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e196 do_prune osdmap full prune enabled
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e197 e197: 6 total, 6 up, 6 in
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e197: 6 total, 6 up, 6 in
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/457341292' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/457341292' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:22:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:22:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:22:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:22:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:22:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:22:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:38.871 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:39 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:39 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:39 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "df8d11b3-b101-4628-bf7f-13330bfcfc51", "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:39 np0005548788.localdomain ceph-mon[293643]: osdmap e197: 6 total, 6 up, 6 in
Dec 06 10:22:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/457341292' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/457341292' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/819637897' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/819637897' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:40.439 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:40.440 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:40.440 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:22:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e197 do_prune osdmap full prune enabled
Dec 06 10:22:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e198 e198: 6 total, 6 up, 6 in
Dec 06 10:22:40 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e198: 6 total, 6 up, 6 in
Dec 06 10:22:40 np0005548788.localdomain ceph-mon[293643]: pgmap v434: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 2.8 MiB/s rd, 33 MiB/s wr, 183 op/s
Dec 06 10:22:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:41.143 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:41 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:22:41 np0005548788.localdomain podman[325600]: 2025-12-06 10:22:41.281232652 +0000 UTC m=+0.102460395 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:22:41 np0005548788.localdomain podman[325600]: 2025-12-06 10:22:41.347854245 +0000 UTC m=+0.169082018 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:22:41 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:22:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2929352609' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2929352609' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:41 np0005548788.localdomain ceph-mon[293643]: osdmap e198: 6 total, 6 up, 6 in
Dec 06 10:22:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2929352609' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2929352609' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:42.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1478917471' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1478917471' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e198 do_prune osdmap full prune enabled
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e199 e199: 6 total, 6 up, 6 in
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e199: 6 total, 6 up, 6 in
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: pgmap v436: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 103 KiB/s rd, 21 MiB/s wr, 158 op/s
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "snap_name": "14d68a54-7a75-45f3-abbd-01dbc7ace567_04b38652-f50e-477c-8a7a-6a8616208060", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "snap_name": "14d68a54-7a75-45f3-abbd-01dbc7ace567", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "df8d11b3-b101-4628-bf7f-13330bfcfc51", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2211323013' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2211323013' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1478917471' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1478917471' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:43Z|00274|ovn_bfd|INFO|Disabled BFD on interface ovn-afa07b-0
Dec 06 10:22:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:43Z|00275|ovn_bfd|INFO|Disabled BFD on interface ovn-bd2a75-0
Dec 06 10:22:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:43Z|00276|ovn_bfd|INFO|Disabled BFD on interface ovn-ca3c1f-0
Dec 06 10:22:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:43.359 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:43.373 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:43 np0005548788.localdomain dnsmasq[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/addn_hosts - 0 addresses
Dec 06 10:22:43 np0005548788.localdomain dnsmasq-dhcp[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/host
Dec 06 10:22:43 np0005548788.localdomain dnsmasq-dhcp[323901]: read /var/lib/neutron/dhcp/368e35d9-76eb-4980-95bb-4c79010f8e1c/opts
Dec 06 10:22:43 np0005548788.localdomain podman[325641]: 2025-12-06 10:22:43.529532704 +0000 UTC m=+0.071853737 container kill 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:22:43 np0005548788.localdomain ceph-mon[293643]: osdmap e199: 6 total, 6 up, 6 in
Dec 06 10:22:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:43.747 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:43Z|00277|binding|INFO|Releasing lport 11c65192-e355-4348-916d-c405f800f3fc from this chassis (sb_readonly=0)
Dec 06 10:22:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:22:43Z|00278|binding|INFO|Setting lport 11c65192-e355-4348-916d-c405f800f3fc down in Southbound
Dec 06 10:22:43 np0005548788.localdomain kernel: device tap11c65192-e3 left promiscuous mode
Dec 06 10:22:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:43.756 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-368e35d9-76eb-4980-95bb-4c79010f8e1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-368e35d9-76eb-4980-95bb-4c79010f8e1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd1c979900294beeb6f273c0e1a6333a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8955b85-2eef-4be8-98c9-809745805d25, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=11c65192-e355-4348-916d-c405f800f3fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:43.758 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 11c65192-e355-4348-916d-c405f800f3fc in datapath 368e35d9-76eb-4980-95bb-4c79010f8e1c unbound from our chassis
Dec 06 10:22:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:43.761 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 368e35d9-76eb-4980-95bb-4c79010f8e1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:22:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:43.762 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7683f2-7bde-446d-bd16-05153037226f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:43.821 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:43.872 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:44.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:44.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:22:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:22:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:22:44 np0005548788.localdomain podman[325664]: 2025-12-06 10:22:44.285422118 +0000 UTC m=+0.101556628 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Dec 06 10:22:44 np0005548788.localdomain podman[325664]: 2025-12-06 10:22:44.329806712 +0000 UTC m=+0.145941172 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:22:44 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:22:44 np0005548788.localdomain podman[325663]: 2025-12-06 10:22:44.337936364 +0000 UTC m=+0.156441476 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:22:44 np0005548788.localdomain podman[325662]: 2025-12-06 10:22:44.396281271 +0000 UTC m=+0.205501716 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125)
Dec 06 10:22:44 np0005548788.localdomain podman[325663]: 2025-12-06 10:22:44.423828224 +0000 UTC m=+0.242333386 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:22:44 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:22:44 np0005548788.localdomain podman[325662]: 2025-12-06 10:22:44.437857859 +0000 UTC m=+0.247078304 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:22:44 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:22:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e199 do_prune osdmap full prune enabled
Dec 06 10:22:44 np0005548788.localdomain ceph-mon[293643]: pgmap v438: 177 pgs: 177 active+clean; 195 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 290 KiB/s rd, 24 MiB/s wr, 436 op/s
Dec 06 10:22:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e200 e200: 6 total, 6 up, 6 in
Dec 06 10:22:44 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e200: 6 total, 6 up, 6 in
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:45 np0005548788.localdomain dnsmasq[323901]: exiting on receipt of SIGTERM
Dec 06 10:22:45 np0005548788.localdomain podman[325742]: 2025-12-06 10:22:45.613216446 +0000 UTC m=+0.064366955 container kill 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:22:45 np0005548788.localdomain systemd[1]: libpod-2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9.scope: Deactivated successfully.
Dec 06 10:22:45 np0005548788.localdomain podman[325755]: 2025-12-06 10:22:45.691085547 +0000 UTC m=+0.063508517 container died 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:22:45 np0005548788.localdomain systemd[1]: tmp-crun.JxDxUy.mount: Deactivated successfully.
Dec 06 10:22:45 np0005548788.localdomain podman[325755]: 2025-12-06 10:22:45.727411803 +0000 UTC m=+0.099834733 container cleanup 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:22:45 np0005548788.localdomain systemd[1]: libpod-conmon-2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9.scope: Deactivated successfully.
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e200 do_prune osdmap full prune enabled
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "format": "json"}]: dispatch
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: osdmap e200: 6 total, 6 up, 6 in
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1317054491' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1317054491' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e201 e201: 6 total, 6 up, 6 in
Dec 06 10:22:45 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e201: 6 total, 6 up, 6 in
Dec 06 10:22:45 np0005548788.localdomain podman[325757]: 2025-12-06 10:22:45.786519443 +0000 UTC m=+0.147492099 container remove 2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-368e35d9-76eb-4980-95bb-4c79010f8e1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:22:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:45.848 262572 INFO neutron.agent.dhcp.agent [None req-edb70ac5-071e-4b41-8972-8645237cc14e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:22:45.867 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:46.112 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:46.172 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-722f83c98c5f485a7cb5e9359c332465631e66d9aebdece877c0c1c999a34102-merged.mount: Deactivated successfully.
Dec 06 10:22:46 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2823387e898713651a9bf2506a65ad2207f1dbe9ef312801153dbfa7ea33f5e9-userdata-shm.mount: Deactivated successfully.
Dec 06 10:22:46 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d368e35d9\x2d76eb\x2d4980\x2d95bb\x2d4c79010f8e1c.mount: Deactivated successfully.
Dec 06 10:22:46 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "64212ab2-3c88-4fa8-92e9-e7786f748419", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:46 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "64212ab2-3c88-4fa8-92e9-e7786f748419", "format": "json"}]: dispatch
Dec 06 10:22:46 np0005548788.localdomain ceph-mon[293643]: pgmap v440: 177 pgs: 177 active+clean; 195 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 171 KiB/s rd, 1.4 MiB/s wr, 254 op/s
Dec 06 10:22:46 np0005548788.localdomain ceph-mon[293643]: osdmap e201: 6 total, 6 up, 6 in
Dec 06 10:22:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:47.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:47.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:22:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:47.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:22:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:47.019 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:22:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:47.019 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:47.444 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:47.444 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:47.444 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e201 do_prune osdmap full prune enabled
Dec 06 10:22:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e202 e202: 6 total, 6 up, 6 in
Dec 06 10:22:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e202: 6 total, 6 up, 6 in
Dec 06 10:22:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/869442583' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.028 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.028 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.029 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.029 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.029 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4009757146' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.483 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: pgmap v442: 177 pgs: 177 active+clean; 195 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 171 KiB/s rd, 1.4 MiB/s wr, 254 op/s
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: osdmap e202: 6 total, 6 up, 6 in
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3787706483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/869442583' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4009757146' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e202 do_prune osdmap full prune enabled
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e203 e203: 6 total, 6 up, 6 in
Dec 06 10:22:48 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e203: 6 total, 6 up, 6 in
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.693 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.694 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11487MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.695 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.695 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.771 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.772 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.790 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:22:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:48.918 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2088044712' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:49.244 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:22:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:49.251 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:22:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:49.270 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:22:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:49.272 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:22:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:49.273 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e203 do_prune osdmap full prune enabled
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "64212ab2-3c88-4fa8-92e9-e7786f748419", "format": "json"}]: dispatch
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "64212ab2-3c88-4fa8-92e9-e7786f748419", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: osdmap e203: 6 total, 6 up, 6 in
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1713529106' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2088044712' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e204 e204: 6 total, 6 up, 6 in
Dec 06 10:22:49 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e204: 6 total, 6 up, 6 in
Dec 06 10:22:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:22:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:22:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:22:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:22:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:22:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18745 "" "Go-http-client/1.1"
Dec 06 10:22:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:50.274 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:50 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e204 do_prune osdmap full prune enabled
Dec 06 10:22:50 np0005548788.localdomain ceph-mon[293643]: pgmap v445: 177 pgs: 177 active+clean; 196 MiB data, 984 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 51 KiB/s wr, 84 op/s
Dec 06 10:22:50 np0005548788.localdomain ceph-mon[293643]: osdmap e204: 6 total, 6 up, 6 in
Dec 06 10:22:50 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e205 e205: 6 total, 6 up, 6 in
Dec 06 10:22:50 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e205: 6 total, 6 up, 6 in
Dec 06 10:22:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:51.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:51.219 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e205 do_prune osdmap full prune enabled
Dec 06 10:22:51 np0005548788.localdomain ceph-mon[293643]: osdmap e205: 6 total, 6 up, 6 in
Dec 06 10:22:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e206 e206: 6 total, 6 up, 6 in
Dec 06 10:22:51 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e206: 6 total, 6 up, 6 in
Dec 06 10:22:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:22:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e206 do_prune osdmap full prune enabled
Dec 06 10:22:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e207 e207: 6 total, 6 up, 6 in
Dec 06 10:22:52 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e207: 6 total, 6 up, 6 in
Dec 06 10:22:52 np0005548788.localdomain ceph-mon[293643]: pgmap v448: 177 pgs: 177 active+clean; 196 MiB data, 984 MiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 58 KiB/s wr, 95 op/s
Dec 06 10:22:52 np0005548788.localdomain ceph-mon[293643]: osdmap e206: 6 total, 6 up, 6 in
Dec 06 10:22:52 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:52 np0005548788.localdomain ceph-mon[293643]: osdmap e207: 6 total, 6 up, 6 in
Dec 06 10:22:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea938ab5-0d5f-47e7-a093-25bbc5841b54", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea938ab5-0d5f-47e7-a093-25bbc5841b54", "format": "json"}]: dispatch
Dec 06 10:22:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:53.955 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2991255880' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2991255880' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e207 do_prune osdmap full prune enabled
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e208 e208: 6 total, 6 up, 6 in
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e208: 6 total, 6 up, 6 in
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: pgmap v451: 177 pgs: 177 active+clean; 196 MiB data, 969 MiB used, 41 GiB / 42 GiB avail; 202 KiB/s rd, 56 KiB/s wr, 286 op/s
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2991255880' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2991255880' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:55.298 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:55 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:55.299 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:22:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:55.327 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1020764155' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1020764155' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: osdmap e208: 6 total, 6 up, 6 in
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3220423124' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3220423124' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1020764155' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1020764155' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:56 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:22:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:56.221 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:56 np0005548788.localdomain podman[325829]: 2025-12-06 10:22:56.261926602 +0000 UTC m=+0.091541447 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 10:22:56 np0005548788.localdomain podman[325829]: 2025-12-06 10:22:56.275329228 +0000 UTC m=+0.104944053 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:22:56 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:22:56 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:22:56.301 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:22:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea938ab5-0d5f-47e7-a093-25bbc5841b54", "format": "json"}]: dispatch
Dec 06 10:22:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea938ab5-0d5f-47e7-a093-25bbc5841b54", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:56 np0005548788.localdomain ceph-mon[293643]: pgmap v453: 177 pgs: 177 active+clean; 196 MiB data, 969 MiB used, 41 GiB / 42 GiB avail; 174 KiB/s rd, 48 KiB/s wr, 246 op/s
Dec 06 10:22:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3555310940' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3555310940' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e208 do_prune osdmap full prune enabled
Dec 06 10:22:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e209 e209: 6 total, 6 up, 6 in
Dec 06 10:22:57 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e209: 6 total, 6 up, 6 in
Dec 06 10:22:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2409640309' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2409640309' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:22:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:22:58 np0005548788.localdomain podman[325848]: 2025-12-06 10:22:58.273374828 +0000 UTC m=+0.095713476 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:22:58 np0005548788.localdomain podman[325848]: 2025-12-06 10:22:58.313393997 +0000 UTC m=+0.135732675 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:22:58 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:22:58 np0005548788.localdomain podman[325849]: 2025-12-06 10:22:58.317594358 +0000 UTC m=+0.137070947 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:22:58 np0005548788.localdomain podman[325849]: 2025-12-06 10:22:58.398372059 +0000 UTC m=+0.217848698 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:22:58 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:22:58 np0005548788.localdomain ceph-mon[293643]: pgmap v454: 177 pgs: 177 active+clean; 196 MiB data, 969 MiB used, 41 GiB / 42 GiB avail; 135 KiB/s rd, 37 KiB/s wr, 190 op/s
Dec 06 10:22:58 np0005548788.localdomain ceph-mon[293643]: osdmap e209: 6 total, 6 up, 6 in
Dec 06 10:22:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2409640309' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2409640309' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:22:58.996 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:00 np0005548788.localdomain ceph-mon[293643]: pgmap v456: 177 pgs: 177 active+clean; 196 MiB data, 953 MiB used, 41 GiB / 42 GiB avail; 216 KiB/s rd, 56 KiB/s wr, 301 op/s
Dec 06 10:23:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:01.223 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e209 do_prune osdmap full prune enabled
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e210 e210: 6 total, 6 up, 6 in
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e210: 6 total, 6 up, 6 in
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: pgmap v457: 177 pgs: 177 active+clean; 196 MiB data, 953 MiB used, 41 GiB / 42 GiB avail; 81 KiB/s rd, 19 KiB/s wr, 110 op/s
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: osdmap e210: 6 total, 6 up, 6 in
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:23:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e210 do_prune osdmap full prune enabled
Dec 06 10:23:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e211 e211: 6 total, 6 up, 6 in
Dec 06 10:23:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bf11e39e-6643-46ed-983d-322b7205a5ae", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:23:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bf11e39e-6643-46ed-983d-322b7205a5ae", "format": "json"}]: dispatch
Dec 06 10:23:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:23:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "format": "json"}]: dispatch
Dec 06 10:23:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:03 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e211: 6 total, 6 up, 6 in
Dec 06 10:23:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:04.038 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:04 np0005548788.localdomain ceph-mon[293643]: pgmap v459: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 34 KiB/s wr, 118 op/s
Dec 06 10:23:04 np0005548788.localdomain ceph-mon[293643]: osdmap e211: 6 total, 6 up, 6 in
Dec 06 10:23:04 np0005548788.localdomain sudo[325888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:23:04 np0005548788.localdomain sudo[325888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:23:04 np0005548788.localdomain sudo[325888]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:04 np0005548788.localdomain sudo[325906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:23:04 np0005548788.localdomain sudo[325906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:23:05 np0005548788.localdomain sudo[325906]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:23:05 np0005548788.localdomain sudo[325956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:23:05 np0005548788.localdomain sudo[325956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:23:05 np0005548788.localdomain sudo[325956]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e211 do_prune osdmap full prune enabled
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e212 e212: 6 total, 6 up, 6 in
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e212: 6 total, 6 up, 6 in
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:23:05 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:23:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:06.226 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e212 do_prune osdmap full prune enabled
Dec 06 10:23:06 np0005548788.localdomain ceph-mon[293643]: pgmap v461: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 87 KiB/s rd, 35 KiB/s wr, 123 op/s
Dec 06 10:23:06 np0005548788.localdomain ceph-mon[293643]: osdmap e212: 6 total, 6 up, 6 in
Dec 06 10:23:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e213 e213: 6 total, 6 up, 6 in
Dec 06 10:23:06 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e213: 6 total, 6 up, 6 in
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.500 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:23:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e213 do_prune osdmap full prune enabled
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "4ddcde0f-43b5-4b51-a615-3fe828471f71", "format": "json"}]: dispatch
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: osdmap e213: 6 total, 6 up, 6 in
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e214 e214: 6 total, 6 up, 6 in
Dec 06 10:23:07 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e214: 6 total, 6 up, 6 in
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1802975927' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1802975927' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bf11e39e-6643-46ed-983d-322b7205a5ae", "format": "json"}]: dispatch
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bf11e39e-6643-46ed-983d-322b7205a5ae", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: pgmap v464: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 6.3 KiB/s rd, 25 KiB/s wr, 14 op/s
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: osdmap e214: 6 total, 6 up, 6 in
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1802975927' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1802975927' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:23:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:23:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:23:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:23:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:23:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:23:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:23:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:09.069 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:10 np0005548788.localdomain ceph-mon[293643]: pgmap v466: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 21 KiB/s wr, 72 op/s
Dec 06 10:23:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "2106c631-2fac-4a48-9eb5-105ae638e038", "format": "json"}]: dispatch
Dec 06 10:23:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:11.228 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e214 do_prune osdmap full prune enabled
Dec 06 10:23:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e215 e215: 6 total, 6 up, 6 in
Dec 06 10:23:11 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e215: 6 total, 6 up, 6 in
Dec 06 10:23:12 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:23:12 np0005548788.localdomain podman[325974]: 2025-12-06 10:23:12.277567141 +0000 UTC m=+0.094068115 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:23:12 np0005548788.localdomain podman[325974]: 2025-12-06 10:23:12.318737697 +0000 UTC m=+0.135238651 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:23:12 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:23:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e215 do_prune osdmap full prune enabled
Dec 06 10:23:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e216 e216: 6 total, 6 up, 6 in
Dec 06 10:23:12 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e216: 6 total, 6 up, 6 in
Dec 06 10:23:12 np0005548788.localdomain ceph-mon[293643]: pgmap v467: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 20 KiB/s wr, 68 op/s
Dec 06 10:23:12 np0005548788.localdomain ceph-mon[293643]: osdmap e215: 6 total, 6 up, 6 in
Dec 06 10:23:12 np0005548788.localdomain ceph-mon[293643]: osdmap e216: 6 total, 6 up, 6 in
Dec 06 10:23:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:14.105 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "2106c631-2fac-4a48-9eb5-105ae638e038_339c2891-c9dd-4dd5-bc06-64f8eef95887", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "2106c631-2fac-4a48-9eb5-105ae638e038", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:14 np0005548788.localdomain ceph-mon[293643]: pgmap v470: 177 pgs: 177 active+clean; 196 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 44 KiB/s wr, 108 op/s
Dec 06 10:23:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:23:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:23:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:23:15 np0005548788.localdomain podman[325999]: 2025-12-06 10:23:15.270056945 +0000 UTC m=+0.094785047 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:23:15 np0005548788.localdomain podman[326000]: 2025-12-06 10:23:15.320154797 +0000 UTC m=+0.140552044 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:23:15 np0005548788.localdomain podman[326000]: 2025-12-06 10:23:15.32897492 +0000 UTC m=+0.149372217 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:23:15 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:23:15 np0005548788.localdomain podman[326001]: 2025-12-06 10:23:15.381827558 +0000 UTC m=+0.198381467 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Dec 06 10:23:15 np0005548788.localdomain podman[325999]: 2025-12-06 10:23:15.387477453 +0000 UTC m=+0.212205545 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.build-date=20251125)
Dec 06 10:23:15 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:23:15 np0005548788.localdomain podman[326001]: 2025-12-06 10:23:15.423579911 +0000 UTC m=+0.240133780 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:23:15 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:23:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:16.231 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:16 np0005548788.localdomain ceph-mon[293643]: pgmap v471: 177 pgs: 177 active+clean; 196 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 35 KiB/s wr, 86 op/s
Dec 06 10:23:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:17 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "4ddcde0f-43b5-4b51-a615-3fe828471f71_bd1921c7-7a11-44e6-a718-d94ea8eab798", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:17 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "4ddcde0f-43b5-4b51-a615-3fe828471f71", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:18 np0005548788.localdomain ceph-mon[293643]: pgmap v472: 177 pgs: 177 active+clean; 196 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 18 KiB/s wr, 30 op/s
Dec 06 10:23:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:19.144 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:23:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:23:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:23:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:23:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:23:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18743 "" "Go-http-client/1.1"
Dec 06 10:23:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e216 do_prune osdmap full prune enabled
Dec 06 10:23:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e217 e217: 6 total, 6 up, 6 in
Dec 06 10:23:19 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e217: 6 total, 6 up, 6 in
Dec 06 10:23:20 np0005548788.localdomain sshd[326060]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:23:20 np0005548788.localdomain sshd[326060]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 10:23:20 np0005548788.localdomain sshd[326060]: Connection closed by 3.131.215.38 port 37240
Dec 06 10:23:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e217 do_prune osdmap full prune enabled
Dec 06 10:23:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e218 e218: 6 total, 6 up, 6 in
Dec 06 10:23:20 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e218: 6 total, 6 up, 6 in
Dec 06 10:23:20 np0005548788.localdomain ceph-mon[293643]: pgmap v473: 177 pgs: 177 active+clean; 196 MiB data, 981 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 41 KiB/s wr, 75 op/s
Dec 06 10:23:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "format": "json"}]: dispatch
Dec 06 10:23:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:20 np0005548788.localdomain ceph-mon[293643]: osdmap e217: 6 total, 6 up, 6 in
Dec 06 10:23:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:21.236 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:21 np0005548788.localdomain ceph-mon[293643]: osdmap e218: 6 total, 6 up, 6 in
Dec 06 10:23:21 np0005548788.localdomain ceph-mon[293643]: pgmap v476: 177 pgs: 177 active+clean; 196 MiB data, 981 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 23 KiB/s wr, 45 op/s
Dec 06 10:23:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1707344585' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1707344585' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:22 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4034920819' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:23 np0005548788.localdomain ceph-mon[293643]: pgmap v477: 177 pgs: 177 active+clean; 261 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 8.0 MiB/s wr, 153 op/s
Dec 06 10:23:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:24.178 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "format": "json"}]: dispatch
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4146455680' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:23:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4146455680' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:25 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4146455680' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:25 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4146455680' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:25 np0005548788.localdomain ceph-mon[293643]: pgmap v478: 177 pgs: 177 active+clean; 261 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 8.0 MiB/s wr, 153 op/s
Dec 06 10:23:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:26.239 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:27 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:23:27 np0005548788.localdomain podman[326063]: 2025-12-06 10:23:27.27403275 +0000 UTC m=+0.093814416 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:23:27 np0005548788.localdomain podman[326063]: 2025-12-06 10:23:27.29049879 +0000 UTC m=+0.110280436 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:23:27 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:23:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e218 do_prune osdmap full prune enabled
Dec 06 10:23:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e219 e219: 6 total, 6 up, 6 in
Dec 06 10:23:27 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e219: 6 total, 6 up, 6 in
Dec 06 10:23:28 np0005548788.localdomain ceph-mon[293643]: pgmap v479: 177 pgs: 177 active+clean; 261 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 75 KiB/s rd, 8.0 MiB/s wr, 108 op/s
Dec 06 10:23:28 np0005548788.localdomain ceph-mon[293643]: osdmap e219: 6 total, 6 up, 6 in
Dec 06 10:23:28 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "f95d5566-befb-4ef3-874d-5a8d6c4a6df5", "format": "json"}]: dispatch
Dec 06 10:23:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:23:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:23:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:23:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:29.180 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:29 np0005548788.localdomain podman[326083]: 2025-12-06 10:23:29.286171346 +0000 UTC m=+0.100317377 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 06 10:23:29 np0005548788.localdomain podman[326083]: 2025-12-06 10:23:29.321577643 +0000 UTC m=+0.135723674 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:23:29 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:23:29 np0005548788.localdomain systemd[1]: tmp-crun.uol69A.mount: Deactivated successfully.
Dec 06 10:23:29 np0005548788.localdomain podman[326082]: 2025-12-06 10:23:29.458604108 +0000 UTC m=+0.276354652 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:23:29 np0005548788.localdomain podman[326082]: 2025-12-06 10:23:29.501101284 +0000 UTC m=+0.318851778 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:23:29 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:23:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:23:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "format": "json"}]: dispatch
Dec 06 10:23:30 np0005548788.localdomain ceph-mon[293643]: pgmap v481: 177 pgs: 177 active+clean; 641 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 137 KiB/s rd, 53 MiB/s wr, 216 op/s
Dec 06 10:23:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:31.239 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:32 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "82ca32b7-c603-4087-ab30-5f8e76d96b29", "format": "json"}]: dispatch
Dec 06 10:23:32 np0005548788.localdomain ceph-mon[293643]: pgmap v482: 177 pgs: 177 active+clean; 641 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 116 KiB/s rd, 44 MiB/s wr, 182 op/s
Dec 06 10:23:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "snap_name": "1386d79b-64b7-4747-8e0b-aeb5f89fda0b", "format": "json"}]: dispatch
Dec 06 10:23:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:34.181 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:34 np0005548788.localdomain ceph-mon[293643]: pgmap v483: 177 pgs: 177 active+clean; 989 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 104 KiB/s rd, 73 MiB/s wr, 183 op/s
Dec 06 10:23:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/685384975' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:35 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/685384975' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "82ca32b7-c603-4087-ab30-5f8e76d96b29_d91f5f9e-cb0d-49f0-967d-dd0f7dcc891c", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "82ca32b7-c603-4087-ab30-5f8e76d96b29", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:35 np0005548788.localdomain ceph-mon[293643]: pgmap v484: 177 pgs: 177 active+clean; 989 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 104 KiB/s rd, 73 MiB/s wr, 183 op/s
Dec 06 10:23:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "snap_name": "1386d79b-64b7-4747-8e0b-aeb5f89fda0b_b3a51089-88de-4e95-b3d9-848c07c2978e", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "snap_name": "1386d79b-64b7-4747-8e0b-aeb5f89fda0b", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:36.243 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1133027623' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:37 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1133027623' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:38 np0005548788.localdomain ceph-mon[293643]: pgmap v485: 177 pgs: 177 active+clean; 989 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 104 KiB/s rd, 73 MiB/s wr, 183 op/s
Dec 06 10:23:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:23:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:23:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:23:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:23:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:23:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:23:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:39.183 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e219 do_prune osdmap full prune enabled
Dec 06 10:23:39 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5aa4551f-daf4-40ec-a318-c443e92f70d8", "format": "json"}]: dispatch
Dec 06 10:23:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3601030272' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3601030272' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e220 e220: 6 total, 6 up, 6 in
Dec 06 10:23:39 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e220: 6 total, 6 up, 6 in
Dec 06 10:23:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:40.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e220 do_prune osdmap full prune enabled
Dec 06 10:23:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "format": "json"}]: dispatch
Dec 06 10:23:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:40 np0005548788.localdomain ceph-mon[293643]: pgmap v486: 177 pgs: 177 active+clean; 337 MiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 155 KiB/s rd, 82 MiB/s wr, 276 op/s
Dec 06 10:23:40 np0005548788.localdomain ceph-mon[293643]: osdmap e220: 6 total, 6 up, 6 in
Dec 06 10:23:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/860284658' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e221 e221: 6 total, 6 up, 6 in
Dec 06 10:23:40 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e221: 6 total, 6 up, 6 in
Dec 06 10:23:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:41.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:41.004 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:23:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:41.244 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e221 do_prune osdmap full prune enabled
Dec 06 10:23:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e222 e222: 6 total, 6 up, 6 in
Dec 06 10:23:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e222: 6 total, 6 up, 6 in
Dec 06 10:23:41 np0005548788.localdomain ceph-mon[293643]: osdmap e221: 6 total, 6 up, 6 in
Dec 06 10:23:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2359987376' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:42 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:23:42Z|00279|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 06 10:23:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:42 np0005548788.localdomain ceph-mon[293643]: pgmap v489: 177 pgs: 177 active+clean; 337 MiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 96 KiB/s rd, 29 MiB/s wr, 175 op/s
Dec 06 10:23:42 np0005548788.localdomain ceph-mon[293643]: osdmap e222: 6 total, 6 up, 6 in
Dec 06 10:23:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:43.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:43.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:43.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:23:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:23:43 np0005548788.localdomain podman[326123]: 2025-12-06 10:23:43.257414517 +0000 UTC m=+0.081310220 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:23:43 np0005548788.localdomain podman[326123]: 2025-12-06 10:23:43.34116486 +0000 UTC m=+0.165060523 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:23:43 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:23:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e222 do_prune osdmap full prune enabled
Dec 06 10:23:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e223 e223: 6 total, 6 up, 6 in
Dec 06 10:23:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e223: 6 total, 6 up, 6 in
Dec 06 10:23:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5aa4551f-daf4-40ec-a318-c443e92f70d8_a730a057-9533-4839-a43f-0f73eb78bbd2", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5aa4551f-daf4-40ec-a318-c443e92f70d8", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e52: np0005548790.kvkfyr(active, since 12m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:23:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:23:43.926 262572 INFO neutron.agent.linux.ip_lib [None req-090c9f95-b289-4d27-9785-d7e5439bd8d1 - - - - - -] Device tap16ab0353-4c cannot be used as it has no MAC address
Dec 06 10:23:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:43.957 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:43 np0005548788.localdomain kernel: device tap16ab0353-4c entered promiscuous mode
Dec 06 10:23:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:23:43Z|00280|binding|INFO|Claiming lport 16ab0353-4ca5-40a7-a86b-3994fc8722ca for this chassis.
Dec 06 10:23:43 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:23:43Z|00281|binding|INFO|16ab0353-4ca5-40a7-a86b-3994fc8722ca: Claiming unknown
Dec 06 10:23:43 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016623.9684] manager: (tap16ab0353-4c): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Dec 06 10:23:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:43.969 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:43 np0005548788.localdomain systemd-udevd[326158]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:23:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:43.979 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-55ffc629-08a5-404f-87a7-26deb97840dc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ffc629-08a5-404f-87a7-26deb97840dc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b51f704fe6204487b0317c3332364cca', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a555e286-25fe-4028-bbdb-d66a3efae4d1, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=16ab0353-4ca5-40a7-a86b-3994fc8722ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:23:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:43.981 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 16ab0353-4ca5-40a7-a86b-3994fc8722ca in datapath 55ffc629-08a5-404f-87a7-26deb97840dc bound to our chassis
Dec 06 10:23:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:43.983 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 55ffc629-08a5-404f-87a7-26deb97840dc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:23:43 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:43.984 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[a408f25a-ec84-49dc-b0b8-d9ead3b86dc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:23:43 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap16ab0353-4c: No such device
Dec 06 10:23:44 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap16ab0353-4c: No such device
Dec 06 10:23:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:23:44Z|00282|binding|INFO|Setting lport 16ab0353-4ca5-40a7-a86b-3994fc8722ca ovn-installed in OVS
Dec 06 10:23:44 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:23:44Z|00283|binding|INFO|Setting lport 16ab0353-4ca5-40a7-a86b-3994fc8722ca up in Southbound
Dec 06 10:23:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:44.004 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:44.006 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:44 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap16ab0353-4c: No such device
Dec 06 10:23:44 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap16ab0353-4c: No such device
Dec 06 10:23:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:44.013 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:44 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap16ab0353-4c: No such device
Dec 06 10:23:44 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap16ab0353-4c: No such device
Dec 06 10:23:44 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap16ab0353-4c: No such device
Dec 06 10:23:44 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tap16ab0353-4c: No such device
Dec 06 10:23:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:44.049 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:44.080 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:44.184 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:44 np0005548788.localdomain ceph-mon[293643]: pgmap v491: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 154 KiB/s rd, 39 MiB/s wr, 281 op/s
Dec 06 10:23:44 np0005548788.localdomain ceph-mon[293643]: osdmap e223: 6 total, 6 up, 6 in
Dec 06 10:23:44 np0005548788.localdomain ceph-mon[293643]: mgrmap e52: np0005548790.kvkfyr(active, since 12m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:23:45 np0005548788.localdomain podman[326230]: 
Dec 06 10:23:45 np0005548788.localdomain podman[326230]: 2025-12-06 10:23:45.043103648 +0000 UTC m=+0.076775449 container create e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:23:45 np0005548788.localdomain systemd[1]: Started libpod-conmon-e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0.scope.
Dec 06 10:23:45 np0005548788.localdomain systemd[1]: tmp-crun.jUICKA.mount: Deactivated successfully.
Dec 06 10:23:45 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:23:45 np0005548788.localdomain podman[326230]: 2025-12-06 10:23:45.011566572 +0000 UTC m=+0.045238393 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:23:45 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dada672f5a9f70b28563c1bdf508933570d30989fd9dcd820b78ff73781ae180/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:23:45 np0005548788.localdomain podman[326230]: 2025-12-06 10:23:45.133835698 +0000 UTC m=+0.167507499 container init e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:23:45 np0005548788.localdomain podman[326230]: 2025-12-06 10:23:45.142399873 +0000 UTC m=+0.176071674 container start e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:23:45 np0005548788.localdomain dnsmasq[326247]: started, version 2.85 cachesize 150
Dec 06 10:23:45 np0005548788.localdomain dnsmasq[326247]: DNS service limited to local subnets
Dec 06 10:23:45 np0005548788.localdomain dnsmasq[326247]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:23:45 np0005548788.localdomain dnsmasq[326247]: warning: no upstream servers configured
Dec 06 10:23:45 np0005548788.localdomain dnsmasq-dhcp[326247]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:23:45 np0005548788.localdomain dnsmasq[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/addn_hosts - 0 addresses
Dec 06 10:23:45 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/host
Dec 06 10:23:45 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/opts
Dec 06 10:23:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:23:45.299 262572 INFO neutron.agent.dhcp.agent [None req-987bd531-9798-401f-97fd-323b4eabcbde - - - - - -] DHCP configuration for ports {'03e3daed-e0ad-41ef-b4d5-42d85bf912f3'} is completed
Dec 06 10:23:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:23:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:23:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:23:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:46.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:46 np0005548788.localdomain podman[326249]: 2025-12-06 10:23:46.014132467 +0000 UTC m=+0.081022261 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:23:46 np0005548788.localdomain podman[326250]: 2025-12-06 10:23:46.068105868 +0000 UTC m=+0.131944218 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:23:46 np0005548788.localdomain podman[326250]: 2025-12-06 10:23:46.086488797 +0000 UTC m=+0.150327147 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:23:46 np0005548788.localdomain podman[326249]: 2025-12-06 10:23:46.099664135 +0000 UTC m=+0.166553969 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:23:46 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:23:46 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:23:46 np0005548788.localdomain podman[326248]: 2025-12-06 10:23:46.177079204 +0000 UTC m=+0.246298651 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec 06 10:23:46 np0005548788.localdomain podman[326248]: 2025-12-06 10:23:46.190723776 +0000 UTC m=+0.259943223 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:23:46 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:23:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:46.247 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:47.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:47.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:23:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:47.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:23:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:47.021 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e223 do_prune osdmap full prune enabled
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e224 e224: 6 total, 6 up, 6 in
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e224: 6 total, 6 up, 6 in
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: pgmap v493: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 64 KiB/s wr, 49 op/s
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5755ebf1-140d-4767-bf58-cf9be5b35f79", "format": "json"}]: dispatch
Dec 06 10:23:47 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:23:47.089 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:23:46Z, description=, device_id=c9e741a0-1e78-4ba5-9ba8-789872d3aa4a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c69a1cd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c706e670>], id=a4b47aaa-3328-4f3c-8fe4-22a78ba05982, ip_allocation=immediate, mac_address=fa:16:3e:9f:84:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:23:42Z, description=, dns_domain=, id=55ffc629-08a5-404f-87a7-26deb97840dc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1845353867-network, port_security_enabled=True, project_id=b51f704fe6204487b0317c3332364cca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18201, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3476, status=ACTIVE, subnets=['04d859a0-16a3-4893-aad6-8d6b1a003e1d'], tags=[], tenant_id=b51f704fe6204487b0317c3332364cca, updated_at=2025-12-06T10:23:43Z, vlan_transparent=None, network_id=55ffc629-08a5-404f-87a7-26deb97840dc, port_security_enabled=False, project_id=b51f704fe6204487b0317c3332364cca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3498, status=DOWN, tags=[], tenant_id=b51f704fe6204487b0317c3332364cca, updated_at=2025-12-06T10:23:46Z on network 55ffc629-08a5-404f-87a7-26deb97840dc
Dec 06 10:23:47 np0005548788.localdomain dnsmasq[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/addn_hosts - 1 addresses
Dec 06 10:23:47 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/host
Dec 06 10:23:47 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/opts
Dec 06 10:23:47 np0005548788.localdomain podman[326321]: 2025-12-06 10:23:47.30431421 +0000 UTC m=+0.055635195 container kill e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:23:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:47.444 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:47.445 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:47.445 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e224 do_prune osdmap full prune enabled
Dec 06 10:23:47 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:23:47.571 262572 INFO neutron.agent.dhcp.agent [None req-99604c4b-9d97-45a3-beb8-cbbd741bdedd - - - - - -] DHCP configuration for ports {'a4b47aaa-3328-4f3c-8fe4-22a78ba05982'} is completed
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e225 e225: 6 total, 6 up, 6 in
Dec 06 10:23:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e225: 6 total, 6 up, 6 in
Dec 06 10:23:47 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:23:47.784 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:23:46Z, description=, device_id=c9e741a0-1e78-4ba5-9ba8-789872d3aa4a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c66db1c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c66db7c0>], id=a4b47aaa-3328-4f3c-8fe4-22a78ba05982, ip_allocation=immediate, mac_address=fa:16:3e:9f:84:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:23:42Z, description=, dns_domain=, id=55ffc629-08a5-404f-87a7-26deb97840dc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1845353867-network, port_security_enabled=True, project_id=b51f704fe6204487b0317c3332364cca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18201, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3476, status=ACTIVE, subnets=['04d859a0-16a3-4893-aad6-8d6b1a003e1d'], tags=[], tenant_id=b51f704fe6204487b0317c3332364cca, updated_at=2025-12-06T10:23:43Z, vlan_transparent=None, network_id=55ffc629-08a5-404f-87a7-26deb97840dc, port_security_enabled=False, project_id=b51f704fe6204487b0317c3332364cca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3498, status=DOWN, tags=[], tenant_id=b51f704fe6204487b0317c3332364cca, updated_at=2025-12-06T10:23:46Z on network 55ffc629-08a5-404f-87a7-26deb97840dc
Dec 06 10:23:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:48.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:48 np0005548788.localdomain dnsmasq[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/addn_hosts - 1 addresses
Dec 06 10:23:48 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/host
Dec 06 10:23:48 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/opts
Dec 06 10:23:48 np0005548788.localdomain podman[326359]: 2025-12-06 10:23:48.065416315 +0000 UTC m=+0.062644002 container kill e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:23:48 np0005548788.localdomain ceph-mon[293643]: osdmap e224: 6 total, 6 up, 6 in
Dec 06 10:23:48 np0005548788.localdomain ceph-mon[293643]: pgmap v495: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 60 KiB/s wr, 47 op/s
Dec 06 10:23:48 np0005548788.localdomain ceph-mon[293643]: osdmap e225: 6 total, 6 up, 6 in
Dec 06 10:23:48 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:23:48.460 262572 INFO neutron.agent.dhcp.agent [None req-f5d9426f-16f0-481e-a856-cbbb7a0dccb3 - - - - - -] DHCP configuration for ports {'a4b47aaa-3328-4f3c-8fe4-22a78ba05982'} is completed
Dec 06 10:23:48 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:23:48Z|00284|ovn_bfd|INFO|Enabled BFD on interface ovn-afa07b-0
Dec 06 10:23:48 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:23:48Z|00285|ovn_bfd|INFO|Enabled BFD on interface ovn-bd2a75-0
Dec 06 10:23:48 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:23:48Z|00286|ovn_bfd|INFO|Enabled BFD on interface ovn-ca3c1f-0
Dec 06 10:23:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:48.990 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:48.996 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:49.001 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:49.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:49.049 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2848039413' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:49.186 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:23:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3773085003' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:49 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:23:49 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3773085003' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:23:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:23:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:23:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:23:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:49.655 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:49.656 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:23:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:49.658 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:23:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:23:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19226 "" "Go-http-client/1.1"
Dec 06 10:23:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:49.815 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:49.980 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.026 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.027 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.028 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:23:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5755ebf1-140d-4767-bf58-cf9be5b35f79_20013db0-3906-4db4-b01b-002cf0a0c6bc", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5755ebf1-140d-4767-bf58-cf9be5b35f79", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:50 np0005548788.localdomain ceph-mon[293643]: pgmap v497: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 21 KiB/s wr, 35 op/s
Dec 06 10:23:50 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3773085003' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:50 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3773085003' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:50 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1887499854' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:50 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:23:50 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/585236573' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.546 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.755 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.805 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.807 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11455MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.807 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.808 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.808 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.978 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:23:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:50.978 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:23:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:51.005 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:23:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e225 do_prune osdmap full prune enabled
Dec 06 10:23:51 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/585236573' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e226 e226: 6 total, 6 up, 6 in
Dec 06 10:23:51 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e226: 6 total, 6 up, 6 in
Dec 06 10:23:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:51.250 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:23:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2689749222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:51.535 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:23:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:51.542 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:23:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:51.574 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:23:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:51.576 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:23:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:51.577 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e226 do_prune osdmap full prune enabled
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: osdmap e226: 6 total, 6 up, 6 in
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: pgmap v499: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 21 KiB/s wr, 35 op/s
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2689749222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e227 e227: 6 total, 6 up, 6 in
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e227: 6 total, 6 up, 6 in
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e227 do_prune osdmap full prune enabled
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e228 e228: 6 total, 6 up, 6 in
Dec 06 10:23:52 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e228: 6 total, 6 up, 6 in
Dec 06 10:23:53 np0005548788.localdomain ceph-mon[293643]: osdmap e227: 6 total, 6 up, 6 in
Dec 06 10:23:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "07253e42-ed74-4565-a93c-b3109a8bee58", "format": "json"}]: dispatch
Dec 06 10:23:53 np0005548788.localdomain ceph-mon[293643]: osdmap e228: 6 total, 6 up, 6 in
Dec 06 10:23:53 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3355742361' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:53 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3355742361' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:53 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:23:53.661 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:23:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:54.223 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:54 np0005548788.localdomain ceph-mon[293643]: pgmap v502: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 70 KiB/s wr, 179 op/s
Dec 06 10:23:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e228 do_prune osdmap full prune enabled
Dec 06 10:23:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e229 e229: 6 total, 6 up, 6 in
Dec 06 10:23:55 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e229: 6 total, 6 up, 6 in
Dec 06 10:23:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:56.256 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:56 np0005548788.localdomain ceph-mon[293643]: osdmap e229: 6 total, 6 up, 6 in
Dec 06 10:23:56 np0005548788.localdomain ceph-mon[293643]: pgmap v504: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 66 KiB/s wr, 197 op/s
Dec 06 10:23:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:23:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3696695018' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3696695018' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e229 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e229 do_prune osdmap full prune enabled
Dec 06 10:23:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e230 e230: 6 total, 6 up, 6 in
Dec 06 10:23:57 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e230: 6 total, 6 up, 6 in
Dec 06 10:23:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:58.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:58.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:23:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:58.032 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:23:58 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:23:58 np0005548788.localdomain podman[326426]: 2025-12-06 10:23:58.265744485 +0000 UTC m=+0.081281039 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:23:58 np0005548788.localdomain podman[326426]: 2025-12-06 10:23:58.308772688 +0000 UTC m=+0.124309252 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0)
Dec 06 10:23:58 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "07253e42-ed74-4565-a93c-b3109a8bee58_5f296b41-2976-4ca1-90d8-b8619d1ec6e3", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "07253e42-ed74-4565-a93c-b3109a8bee58", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: pgmap v505: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 45 KiB/s wr, 134 op/s
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: osdmap e230: 6 total, 6 up, 6 in
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2773610337' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2773610337' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e230 do_prune osdmap full prune enabled
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e231 e231: 6 total, 6 up, 6 in
Dec 06 10:23:58 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e231: 6 total, 6 up, 6 in
Dec 06 10:23:58 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 06 10:23:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:23:59.224 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e231 do_prune osdmap full prune enabled
Dec 06 10:23:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e232 e232: 6 total, 6 up, 6 in
Dec 06 10:23:59 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e232: 6 total, 6 up, 6 in
Dec 06 10:23:59 np0005548788.localdomain ceph-mon[293643]: osdmap e231: 6 total, 6 up, 6 in
Dec 06 10:24:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:24:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:24:00 np0005548788.localdomain sshd[326446]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:24:00 np0005548788.localdomain podman[326445]: 2025-12-06 10:24:00.270371659 +0000 UTC m=+0.094201309 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:24:00 np0005548788.localdomain podman[326445]: 2025-12-06 10:24:00.279021307 +0000 UTC m=+0.102850947 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:24:00 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:24:00 np0005548788.localdomain podman[326447]: 2025-12-06 10:24:00.376602719 +0000 UTC m=+0.196204409 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:24:00 np0005548788.localdomain podman[326447]: 2025-12-06 10:24:00.388762266 +0000 UTC m=+0.208363976 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:24:00 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:24:00 np0005548788.localdomain ceph-mon[293643]: pgmap v508: 177 pgs: 177 active+clean; 258 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 4.0 MiB/s wr, 179 op/s
Dec 06 10:24:00 np0005548788.localdomain ceph-mon[293643]: osdmap e232: 6 total, 6 up, 6 in
Dec 06 10:24:00 np0005548788.localdomain sshd[326446]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 10:24:00 np0005548788.localdomain sshd[326446]: Connection closed by 3.131.215.38 port 49836
Dec 06 10:24:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:01.283 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/573079347' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/402114055' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/402114055' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e232 do_prune osdmap full prune enabled
Dec 06 10:24:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e233 e233: 6 total, 6 up, 6 in
Dec 06 10:24:02 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e233: 6 total, 6 up, 6 in
Dec 06 10:24:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "76242542-0436-4e10-a0ca-932a5ea39e05", "format": "json"}]: dispatch
Dec 06 10:24:02 np0005548788.localdomain ceph-mon[293643]: pgmap v510: 177 pgs: 177 active+clean; 258 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 4.0 MiB/s wr, 179 op/s
Dec 06 10:24:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e233 do_prune osdmap full prune enabled
Dec 06 10:24:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e234 e234: 6 total, 6 up, 6 in
Dec 06 10:24:03 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e234: 6 total, 6 up, 6 in
Dec 06 10:24:03 np0005548788.localdomain ceph-mon[293643]: osdmap e233: 6 total, 6 up, 6 in
Dec 06 10:24:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1694106485' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1694106485' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:03 np0005548788.localdomain ceph-mon[293643]: osdmap e234: 6 total, 6 up, 6 in
Dec 06 10:24:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:04.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:04.226 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e234 do_prune osdmap full prune enabled
Dec 06 10:24:04 np0005548788.localdomain ceph-mon[293643]: pgmap v512: 177 pgs: 177 active+clean; 337 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 7.7 MiB/s rd, 11 MiB/s wr, 355 op/s
Dec 06 10:24:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e235 e235: 6 total, 6 up, 6 in
Dec 06 10:24:04 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e235: 6 total, 6 up, 6 in
Dec 06 10:24:05 np0005548788.localdomain sshd[326487]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:24:05 np0005548788.localdomain ceph-mon[293643]: osdmap e235: 6 total, 6 up, 6 in
Dec 06 10:24:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1159844726' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1159844726' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:05 np0005548788.localdomain sudo[326489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:24:05 np0005548788.localdomain sudo[326489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:05 np0005548788.localdomain sudo[326489]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:05 np0005548788.localdomain sudo[326507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 10:24:05 np0005548788.localdomain sudo[326507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:06.286 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain sudo[326507]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain sudo[326547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:24:06 np0005548788.localdomain sudo[326547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:06 np0005548788.localdomain sudo[326547]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:06 np0005548788.localdomain sudo[326565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:24:06 np0005548788.localdomain sudo[326565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: pgmap v515: 177 pgs: 177 active+clean; 337 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 4.1 MiB/s rd, 7.1 MiB/s wr, 167 op/s
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "76242542-0436-4e10-a0ca-932a5ea39e05_4c1bc32d-22e5-4d62-bff7-71f1039d4c95", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "76242542-0436-4e10-a0ca-932a5ea39e05", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e235 do_prune osdmap full prune enabled
Dec 06 10:24:07 np0005548788.localdomain sudo[326565]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e236 e236: 6 total, 6 up, 6 in
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e236: 6 total, 6 up, 6 in
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:07 np0005548788.localdomain sudo[326616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:24:07 np0005548788.localdomain sudo[326616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:07 np0005548788.localdomain sudo[326616]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e236 do_prune osdmap full prune enabled
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e237 e237: 6 total, 6 up, 6 in
Dec 06 10:24:07 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e237: 6 total, 6 up, 6 in
Dec 06 10:24:07 np0005548788.localdomain sshd[326487]: Received disconnect from 45.78.194.186 port 40132:11: Bye Bye [preauth]
Dec 06 10:24:07 np0005548788.localdomain sshd[326487]: Disconnected from authenticating user root 45.78.194.186 port 40132 [preauth]
Dec 06 10:24:08 np0005548788.localdomain ceph-mon[293643]: pgmap v516: 177 pgs: 177 active+clean; 337 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 6.7 MiB/s wr, 157 op/s
Dec 06 10:24:08 np0005548788.localdomain ceph-mon[293643]: osdmap e236: 6 total, 6 up, 6 in
Dec 06 10:24:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:24:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:24:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:24:08 np0005548788.localdomain ceph-mon[293643]: osdmap e237: 6 total, 6 up, 6 in
Dec 06 10:24:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:24:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:24:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:24:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:24:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:24:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:24:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:24:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:09.228 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e237 do_prune osdmap full prune enabled
Dec 06 10:24:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e238 e238: 6 total, 6 up, 6 in
Dec 06 10:24:09 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e238: 6 total, 6 up, 6 in
Dec 06 10:24:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:10 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:10 np0005548788.localdomain ceph-mon[293643]: pgmap v519: 177 pgs: 177 active+clean; 244 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 32 KiB/s wr, 104 op/s
Dec 06 10:24:10 np0005548788.localdomain ceph-mon[293643]: osdmap e238: 6 total, 6 up, 6 in
Dec 06 10:24:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3765798835' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3765798835' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:11.291 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e238 do_prune osdmap full prune enabled
Dec 06 10:24:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "format": "json"}]: dispatch
Dec 06 10:24:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e239 e239: 6 total, 6 up, 6 in
Dec 06 10:24:11 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e239: 6 total, 6 up, 6 in
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "f95d5566-befb-4ef3-874d-5a8d6c4a6df5_77a05e19-399e-4f52-950c-489f333205cf", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "f95d5566-befb-4ef3-874d-5a8d6c4a6df5", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: pgmap v521: 177 pgs: 177 active+clean; 244 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 29 KiB/s wr, 96 op/s
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: osdmap e239: 6 total, 6 up, 6 in
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e239 do_prune osdmap full prune enabled
Dec 06 10:24:12 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:24:12.601 2 INFO neutron.agent.securitygroups_rpc [None req-63e1cfe1-0be8-4c17-9453-907c82bfa210 b40d497af0834616a664e6909c0f6685 b51f704fe6204487b0317c3332364cca - - default default] Security group rule updated ['d407968b-b8de-45cd-a244-3bf62d3c0357']
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e240 e240: 6 total, 6 up, 6 in
Dec 06 10:24:12 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e240: 6 total, 6 up, 6 in
Dec 06 10:24:13 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:24:13.351 2 INFO neutron.agent.securitygroups_rpc [None req-9218de2a-a054-45fc-bd21-8f037be37a59 b40d497af0834616a664e6909c0f6685 b51f704fe6204487b0317c3332364cca - - default default] Security group rule updated ['d407968b-b8de-45cd-a244-3bf62d3c0357']
Dec 06 10:24:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:24:13 np0005548788.localdomain ceph-mon[293643]: osdmap e240: 6 total, 6 up, 6 in
Dec 06 10:24:13 np0005548788.localdomain podman[326634]: 2025-12-06 10:24:13.70697964 +0000 UTC m=+0.097026537 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:24:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:13 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:13 np0005548788.localdomain podman[326634]: 2025-12-06 10:24:13.746583166 +0000 UTC m=+0.136630023 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:24:13 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:24:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:14.229 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:14 np0005548788.localdomain ceph-mon[293643]: pgmap v524: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 KiB/s rd, 75 KiB/s wr, 152 op/s
Dec 06 10:24:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "077e73af-9063-4209-9319-e18e1a460598", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "077e73af-9063-4209-9319-e18e1a460598", "format": "json"}]: dispatch
Dec 06 10:24:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e240 do_prune osdmap full prune enabled
Dec 06 10:24:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "format": "json"}]: dispatch
Dec 06 10:24:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e241 e241: 6 total, 6 up, 6 in
Dec 06 10:24:15 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e241: 6 total, 6 up, 6 in
Dec 06 10:24:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:24:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:24:16 np0005548788.localdomain podman[326659]: 2025-12-06 10:24:16.27514798 +0000 UTC m=+0.089454432 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:24:16 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:24:16 np0005548788.localdomain podman[326659]: 2025-12-06 10:24:16.292056544 +0000 UTC m=+0.106363036 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:24:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:16.294 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:16 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:24:16 np0005548788.localdomain podman[326660]: 2025-12-06 10:24:16.378351367 +0000 UTC m=+0.185446036 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, release=1755695350)
Dec 06 10:24:16 np0005548788.localdomain podman[326660]: 2025-12-06 10:24:16.392590558 +0000 UTC m=+0.199685257 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Dec 06 10:24:16 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:24:16 np0005548788.localdomain podman[326693]: 2025-12-06 10:24:16.444384042 +0000 UTC m=+0.143957050 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:24:16 np0005548788.localdomain podman[326693]: 2025-12-06 10:24:16.482899605 +0000 UTC m=+0.182472593 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec 06 10:24:16 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:24:16 np0005548788.localdomain ceph-mon[293643]: pgmap v525: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 42 KiB/s wr, 47 op/s
Dec 06 10:24:16 np0005548788.localdomain ceph-mon[293643]: osdmap e241: 6 total, 6 up, 6 in
Dec 06 10:24:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e241 do_prune osdmap full prune enabled
Dec 06 10:24:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e242 e242: 6 total, 6 up, 6 in
Dec 06 10:24:17 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e242: 6 total, 6 up, 6 in
Dec 06 10:24:17 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "077e73af-9063-4209-9319-e18e1a460598", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 06 10:24:17 np0005548788.localdomain ceph-mon[293643]: osdmap e242: 6 total, 6 up, 6 in
Dec 06 10:24:18 np0005548788.localdomain ceph-mon[293643]: pgmap v527: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 42 KiB/s wr, 47 op/s
Dec 06 10:24:18 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1847092686' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:19 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:24:19.085 2 INFO neutron.agent.securitygroups_rpc [req-6cf9c8b9-f82f-4729-827a-87ee94dc739b req-d70b880e-b383-4ad3-91f4-06f3e667f577 b40d497af0834616a664e6909c0f6685 b51f704fe6204487b0317c3332364cca - - default default] Security group member updated ['d407968b-b8de-45cd-a244-3bf62d3c0357']
Dec 06 10:24:19 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:24:19.116 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:24:18Z, description=, device_id=b59377c8-c3d7-452b-8305-d2853ef47bb4, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c677f370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c677fac0>], id=c8391efe-eabf-46a0-94e6-c12eb660cfb2, ip_allocation=immediate, mac_address=fa:16:3e:ec:95:9c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:23:42Z, description=, dns_domain=, id=55ffc629-08a5-404f-87a7-26deb97840dc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1845353867-network, port_security_enabled=True, project_id=b51f704fe6204487b0317c3332364cca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18201, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3476, status=ACTIVE, subnets=['04d859a0-16a3-4893-aad6-8d6b1a003e1d'], tags=[], tenant_id=b51f704fe6204487b0317c3332364cca, updated_at=2025-12-06T10:23:43Z, vlan_transparent=None, network_id=55ffc629-08a5-404f-87a7-26deb97840dc, port_security_enabled=True, project_id=b51f704fe6204487b0317c3332364cca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d407968b-b8de-45cd-a244-3bf62d3c0357'], standard_attr_id=3530, status=DOWN, tags=[], tenant_id=b51f704fe6204487b0317c3332364cca, updated_at=2025-12-06T10:24:18Z on network 55ffc629-08a5-404f-87a7-26deb97840dc
Dec 06 10:24:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:19.231 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:19 np0005548788.localdomain dnsmasq[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/addn_hosts - 2 addresses
Dec 06 10:24:19 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/host
Dec 06 10:24:19 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/opts
Dec 06 10:24:19 np0005548788.localdomain podman[326739]: 2025-12-06 10:24:19.335450894 +0000 UTC m=+0.050204637 container kill e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:24:19 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:24:19.538 262572 INFO neutron.agent.dhcp.agent [None req-93e0989e-5806-4f0d-9adc-121e80c33bae - - - - - -] DHCP configuration for ports {'c8391efe-eabf-46a0-94e6-c12eb660cfb2'} is completed
Dec 06 10:24:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:24:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:24:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:24:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:24:19 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:24:19.665 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005548790.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:24:18Z, description=, device_id=b59377c8-c3d7-452b-8305-d2853ef47bb4, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c667abb0>], dns_domain=, dns_name=tempest-volumesbackupstest-instance-739598656, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c667ab20>], id=c8391efe-eabf-46a0-94e6-c12eb660cfb2, ip_allocation=immediate, mac_address=fa:16:3e:ec:95:9c, name=, network_id=55ffc629-08a5-404f-87a7-26deb97840dc, port_security_enabled=True, project_id=b51f704fe6204487b0317c3332364cca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['d407968b-b8de-45cd-a244-3bf62d3c0357'], standard_attr_id=3530, status=DOWN, tags=[], tenant_id=b51f704fe6204487b0317c3332364cca, updated_at=2025-12-06T10:24:19Z on network 55ffc629-08a5-404f-87a7-26deb97840dc
Dec 06 10:24:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:24:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19232 "" "Go-http-client/1.1"
Dec 06 10:24:19 np0005548788.localdomain dnsmasq[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/addn_hosts - 2 addresses
Dec 06 10:24:19 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/host
Dec 06 10:24:19 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/opts
Dec 06 10:24:19 np0005548788.localdomain podman[326777]: 2025-12-06 10:24:19.899905428 +0000 UTC m=+0.063288342 container kill e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:24:20 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:24:20.170 262572 INFO neutron.agent.dhcp.agent [None req-68bdb121-7a76-4cf7-82a9-7663232e8aa6 - - - - - -] DHCP configuration for ports {'c8391efe-eabf-46a0-94e6-c12eb660cfb2'} is completed
Dec 06 10:24:20 np0005548788.localdomain ceph-mon[293643]: pgmap v529: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 88 KiB/s wr, 46 op/s
Dec 06 10:24:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:21.330 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:21 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "077e73af-9063-4209-9319-e18e1a460598", "format": "json"}]: dispatch
Dec 06 10:24:21 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "077e73af-9063-4209-9319-e18e1a460598", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3448344129' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:21 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2893884725' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e242 do_prune osdmap full prune enabled
Dec 06 10:24:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e243 e243: 6 total, 6 up, 6 in
Dec 06 10:24:22 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e243: 6 total, 6 up, 6 in
Dec 06 10:24:22 np0005548788.localdomain ceph-mon[293643]: pgmap v530: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 43 KiB/s wr, 2 op/s
Dec 06 10:24:22 np0005548788.localdomain ceph-mon[293643]: osdmap e243: 6 total, 6 up, 6 in
Dec 06 10:24:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:23 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:24.043 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:24.087 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:24.136 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:24.235 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:24 np0005548788.localdomain ceph-mon[293643]: pgmap v532: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.9 MiB/s wr, 63 op/s
Dec 06 10:24:24 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:24 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "format": "json"}]: dispatch
Dec 06 10:24:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "format": "json"}]: dispatch
Dec 06 10:24:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:25 np0005548788.localdomain ceph-mon[293643]: pgmap v533: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.7 MiB/s wr, 60 op/s
Dec 06 10:24:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:26.333 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:26.574 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:28 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 06 10:24:28 np0005548788.localdomain ceph-mon[293643]: pgmap v534: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 2.3 MiB/s wr, 49 op/s
Dec 06 10:24:28 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:29 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:24:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:29.237 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:29 np0005548788.localdomain systemd[1]: tmp-crun.a07C5B.mount: Deactivated successfully.
Dec 06 10:24:29 np0005548788.localdomain podman[326800]: 2025-12-06 10:24:29.263820281 +0000 UTC m=+0.090800613 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:24:29 np0005548788.localdomain podman[326800]: 2025-12-06 10:24:29.280728444 +0000 UTC m=+0.107708736 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:24:29 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:24:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:29.590 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "format": "json"}]: dispatch
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: pgmap v535: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.2 MiB/s wr, 124 op/s
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.676095) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670676241, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2586, "num_deletes": 279, "total_data_size": 3592890, "memory_usage": 3655424, "flush_reason": "Manual Compaction"}
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670696408, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 3502146, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33418, "largest_seqno": 36002, "table_properties": {"data_size": 3490518, "index_size": 7557, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26999, "raw_average_key_size": 22, "raw_value_size": 3466567, "raw_average_value_size": 2917, "num_data_blocks": 316, "num_entries": 1188, "num_filter_entries": 1188, "num_deletions": 279, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016536, "oldest_key_time": 1765016536, "file_creation_time": 1765016670, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 20360 microseconds, and 8689 cpu microseconds.
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.696464) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 3502146 bytes OK
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.696497) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.698225) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.698247) EVENT_LOG_v1 {"time_micros": 1765016670698241, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.698272) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 3581445, prev total WAL file size 3581445, number of live WAL files 2.
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.699322) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(3420KB)], [63(16MB)]
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670699441, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 21166286, "oldest_snapshot_seqno": -1}
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 13564 keys, 19525128 bytes, temperature: kUnknown
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670811707, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 19525128, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19448334, "index_size": 41813, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33925, "raw_key_size": 364920, "raw_average_key_size": 26, "raw_value_size": 19218318, "raw_average_value_size": 1416, "num_data_blocks": 1551, "num_entries": 13564, "num_filter_entries": 13564, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016670, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.812060) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 19525128 bytes
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.814069) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.4 rd, 173.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.3, 16.8 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(11.6) write-amplify(5.6) OK, records in: 14128, records dropped: 564 output_compression: NoCompression
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.814101) EVENT_LOG_v1 {"time_micros": 1765016670814087, "job": 38, "event": "compaction_finished", "compaction_time_micros": 112353, "compaction_time_cpu_micros": 53829, "output_level": 6, "num_output_files": 1, "total_output_size": 19525128, "num_input_records": 14128, "num_output_records": 13564, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670814753, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670816926, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.699168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817099) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817102) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817105) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817108) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:24:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:24:31 np0005548788.localdomain podman[326820]: 2025-12-06 10:24:31.263634446 +0000 UTC m=+0.083059695 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:24:31 np0005548788.localdomain podman[326819]: 2025-12-06 10:24:31.329707913 +0000 UTC m=+0.152700012 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:24:31 np0005548788.localdomain podman[326820]: 2025-12-06 10:24:31.364626104 +0000 UTC m=+0.184051343 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:24:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:31.368 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:31 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:24:31 np0005548788.localdomain podman[326819]: 2025-12-06 10:24:31.417472721 +0000 UTC m=+0.240464880 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:24:31 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "format": "json"}]: dispatch
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:32.644 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:32 np0005548788.localdomain ceph-mon[293643]: pgmap v536: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.2 MiB/s wr, 124 op/s
Dec 06 10:24:32 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "auth_id": "Joe", "tenant_id": "14fcd30962314973b2c11b49f89b4cb4", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:33 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:34.238 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:34 np0005548788.localdomain ceph-mon[293643]: pgmap v537: 177 pgs: 177 active+clean; 246 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 119 op/s
Dec 06 10:24:34 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fb1967bf-2011-4ff5-8e79-30781feb0f35", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:34 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fb1967bf-2011-4ff5-8e79-30781feb0f35", "format": "json"}]: dispatch
Dec 06 10:24:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:36.370 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:36 np0005548788.localdomain ceph-mon[293643]: pgmap v538: 177 pgs: 177 active+clean; 246 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 62 KiB/s wr, 68 op/s
Dec 06 10:24:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "format": "json"}]: dispatch
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.662587) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677662630, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 354, "num_deletes": 257, "total_data_size": 93613, "memory_usage": 100424, "flush_reason": "Manual Compaction"}
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677666082, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 91458, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36003, "largest_seqno": 36356, "table_properties": {"data_size": 89299, "index_size": 270, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5639, "raw_average_key_size": 18, "raw_value_size": 84804, "raw_average_value_size": 272, "num_data_blocks": 12, "num_entries": 311, "num_filter_entries": 311, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016671, "oldest_key_time": 1765016671, "file_creation_time": 1765016677, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 3517 microseconds, and 827 cpu microseconds.
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.666109) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 91458 bytes OK
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.666125) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.668533) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.668547) EVENT_LOG_v1 {"time_micros": 1765016677668542, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.668558) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 91190, prev total WAL file size 91190, number of live WAL files 2.
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.668982) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323730' seq:72057594037927935, type:22 .. '6C6F676D0034353233' seq:0, type:0; will stop at (end)
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(89KB)], [66(18MB)]
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677669072, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 19616586, "oldest_snapshot_seqno": -1}
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 13345 keys, 19187676 bytes, temperature: kUnknown
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677776975, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 19187676, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19112747, "index_size": 40460, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33413, "raw_key_size": 361369, "raw_average_key_size": 27, "raw_value_size": 18886790, "raw_average_value_size": 1415, "num_data_blocks": 1487, "num_entries": 13345, "num_filter_entries": 13345, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016677, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.777473) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 19187676 bytes
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.779497) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.6 rd, 177.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 18.6 +0.0 blob) out(18.3 +0.0 blob), read-write-amplify(424.3) write-amplify(209.8) OK, records in: 13875, records dropped: 530 output_compression: NoCompression
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.779538) EVENT_LOG_v1 {"time_micros": 1765016677779520, "job": 40, "event": "compaction_finished", "compaction_time_micros": 108014, "compaction_time_cpu_micros": 52854, "output_level": 6, "num_output_files": 1, "total_output_size": 19187676, "num_input_records": 13875, "num_output_records": 13345, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677779725, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677782412, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.668841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.782569) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.782577) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.782581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.782584) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:24:37.782587) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:38 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fb1967bf-2011-4ff5-8e79-30781feb0f35", "format": "json"}]: dispatch
Dec 06 10:24:38 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fb1967bf-2011-4ff5-8e79-30781feb0f35", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:38 np0005548788.localdomain ceph-mon[293643]: pgmap v539: 177 pgs: 177 active+clean; 246 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 62 KiB/s wr, 68 op/s
Dec 06 10:24:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 06 10:24:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:24:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:24:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:24:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:24:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:24:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:24:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:24:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:24:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:24:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3543682375' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:24:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3543682375' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:39.240 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:39 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "Joe", "tenant_id": "438e893229f742e78fe8e62ef6ea17d5", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3543682375' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3543682375' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3756278959' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:40 np0005548788.localdomain ceph-mon[293643]: pgmap v540: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 2.2 MiB/s wr, 133 op/s
Dec 06 10:24:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/701765729' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:41.026 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:41.027 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:41.027 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:24:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:41.374 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6a347634-f593-49b5-a8b9-3ebf810ccb41", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6a347634-f593-49b5-a8b9-3ebf810ccb41", "format": "json"}]: dispatch
Dec 06 10:24:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:24:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:42 np0005548788.localdomain ceph-mon[293643]: pgmap v541: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 326 KiB/s rd, 2.2 MiB/s wr, 67 op/s
Dec 06 10:24:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "tempest-cephx-id-649576020", "tenant_id": "438e893229f742e78fe8e62ef6ea17d5", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-649576020", "format": "json"} : dispatch
Dec 06 10:24:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:42 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:43.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:43 np0005548788.localdomain ceph-mon[293643]: pgmap v542: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 327 KiB/s rd, 2.2 MiB/s wr, 70 op/s
Dec 06 10:24:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6a347634-f593-49b5-a8b9-3ebf810ccb41", "format": "json"}]: dispatch
Dec 06 10:24:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6a347634-f593-49b5-a8b9-3ebf810ccb41", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:44 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:24:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:44.242 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:44 np0005548788.localdomain podman[326858]: 2025-12-06 10:24:44.255641975 +0000 UTC m=+0.078013797 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:24:44 np0005548788.localdomain podman[326858]: 2025-12-06 10:24:44.298681958 +0000 UTC m=+0.121053720 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:24:44 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:24:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:45.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:46.378 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:47.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:47 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 06 10:24:47 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 06 10:24:47 np0005548788.localdomain ceph-mon[293643]: pgmap v543: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 327 KiB/s rd, 2.2 MiB/s wr, 67 op/s
Dec 06 10:24:47 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3643a05c-023e-4ee5-80c6-9c2596521f46", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:47 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3643a05c-023e-4ee5-80c6-9c2596521f46", "format": "json"}]: dispatch
Dec 06 10:24:47 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:24:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:24:47 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:24:47 np0005548788.localdomain podman[326884]: 2025-12-06 10:24:47.265029231 +0000 UTC m=+0.090079831 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:24:47 np0005548788.localdomain podman[326884]: 2025-12-06 10:24:47.294308038 +0000 UTC m=+0.119358598 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 06 10:24:47 np0005548788.localdomain podman[326891]: 2025-12-06 10:24:47.308526689 +0000 UTC m=+0.122925419 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, release=1755695350, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal)
Dec 06 10:24:47 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:24:47 np0005548788.localdomain podman[326891]: 2025-12-06 10:24:47.326184976 +0000 UTC m=+0.140583796 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, vcs-type=git, version=9.6, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Dec 06 10:24:47 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:24:47 np0005548788.localdomain podman[326885]: 2025-12-06 10:24:47.36505834 +0000 UTC m=+0.180581615 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:24:47 np0005548788.localdomain podman[326885]: 2025-12-06 10:24:47.376657509 +0000 UTC m=+0.192180874 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:24:47 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:24:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:24:47.445 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:24:47.445 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:24:47.446 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:48.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:48.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:24:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:48.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:24:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:48.032 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:24:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:48.032 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "aeb4e71b-e797-4ddb-8668-e9c13b45f222", "format": "json"}]: dispatch
Dec 06 10:24:48 np0005548788.localdomain ceph-mon[293643]: pgmap v544: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 327 KiB/s rd, 2.2 MiB/s wr, 67 op/s
Dec 06 10:24:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"} v 0)
Dec 06 10:24:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"} : dispatch
Dec 06 10:24:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"}]': finished
Dec 06 10:24:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:49.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:49.243 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:24:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:24:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:24:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:24:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:24:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19243 "" "Go-http-client/1.1"
Dec 06 10:24:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "tempest-cephx-id-649576020", "format": "json"}]: dispatch
Dec 06 10:24:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-649576020", "format": "json"} : dispatch
Dec 06 10:24:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"} : dispatch
Dec 06 10:24:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"} : dispatch
Dec 06 10:24:49 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"}]': finished
Dec 06 10:24:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "tempest-cephx-id-649576020", "format": "json"}]: dispatch
Dec 06 10:24:50 np0005548788.localdomain ceph-mon[293643]: pgmap v545: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 327 KiB/s rd, 2.2 MiB/s wr, 70 op/s
Dec 06 10:24:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3643a05c-023e-4ee5-80c6-9c2596521f46", "format": "json"}]: dispatch
Dec 06 10:24:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3643a05c-023e-4ee5-80c6-9c2596521f46", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.024 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.049 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.050 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.051 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.051 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.051 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.381 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:24:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/425768214' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.517 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.729 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.731 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11410MB free_disk=41.700347900390625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.732 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.732 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:51 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "aeb4e71b-e797-4ddb-8668-e9c13b45f222_44a54937-b38a-4109-92db-e338a6e6c4a4", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:51 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "aeb4e71b-e797-4ddb-8668-e9c13b45f222", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:51 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2358963560' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:51 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/425768214' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.803 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.804 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.879 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.957 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.957 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:24:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:51.978 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:24:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0)
Dec 06 10:24:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Dec 06 10:24:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:52.002 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE4A,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Dec 06 10:24:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:52.021 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1025362414' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:52.495 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:24:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:52.502 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:24:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:52.525 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:24:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:52.528 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:24:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:52.528 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: pgmap v546: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.7 KiB/s rd, 67 KiB/s wr, 5 op/s
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1474548798' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:52 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1025362414' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 06 10:24:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 06 10:24:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:54.246 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:54 np0005548788.localdomain ceph-mon[293643]: pgmap v547: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 2.0 KiB/s rd, 124 KiB/s wr, 11 op/s
Dec 06 10:24:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e243 do_prune osdmap full prune enabled
Dec 06 10:24:55 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Dec 06 10:24:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e244 e244: 6 total, 6 up, 6 in
Dec 06 10:24:55 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e244: 6 total, 6 up, 6 in
Dec 06 10:24:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:56.384 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:56 np0005548788.localdomain ceph-mon[293643]: pgmap v548: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 100 KiB/s wr, 8 op/s
Dec 06 10:24:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "admin", "tenant_id": "14fcd30962314973b2c11b49f89b4cb4", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:56 np0005548788.localdomain ceph-mon[293643]: osdmap e244: 6 total, 6 up, 6 in
Dec 06 10:24:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "cdf82d1a-cb71-4bc8-a483-dd9f0663d1ab", "format": "json"}]: dispatch
Dec 06 10:24:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2198419892' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:58 np0005548788.localdomain ceph-mon[293643]: pgmap v550: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 120 KiB/s wr, 9 op/s
Dec 06 10:24:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2198419892' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:24:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:24:59.249 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "david", "tenant_id": "14fcd30962314973b2c11b49f89b4cb4", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 06 10:24:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:59 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:59 np0005548788.localdomain ceph-mon[293643]: pgmap v551: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 99 KiB/s wr, 8 op/s
Dec 06 10:24:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "9ca8d539-a020-426e-a70b-3e1ee1b92584", "format": "json"}]: dispatch
Dec 06 10:25:00 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:25:00 np0005548788.localdomain systemd[1]: tmp-crun.iz9Az5.mount: Deactivated successfully.
Dec 06 10:25:00 np0005548788.localdomain podman[326991]: 2025-12-06 10:25:00.256258669 +0000 UTC m=+0.080982508 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd)
Dec 06 10:25:00 np0005548788.localdomain podman[326991]: 2025-12-06 10:25:00.272708009 +0000 UTC m=+0.097431848 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:25:00 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:25:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1792782973' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:01.394 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e244 do_prune osdmap full prune enabled
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e245 e245: 6 total, 6 up, 6 in
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: pgmap v552: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 99 KiB/s wr, 8 op/s
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e245: 6 total, 6 up, 6 in
Dec 06 10:25:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:25:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:25:02 np0005548788.localdomain podman[327011]: 2025-12-06 10:25:02.267179549 +0000 UTC m=+0.087884233 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 06 10:25:02 np0005548788.localdomain podman[327011]: 2025-12-06 10:25:02.272533324 +0000 UTC m=+0.093237998 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Dec 06 10:25:02 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:25:02 np0005548788.localdomain podman[327010]: 2025-12-06 10:25:02.323914036 +0000 UTC m=+0.148588834 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:25:02 np0005548788.localdomain podman[327010]: 2025-12-06 10:25:02.33564078 +0000 UTC m=+0.160315618 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:25:02 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e245 do_prune osdmap full prune enabled
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e246 e246: 6 total, 6 up, 6 in
Dec 06 10:25:02 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e246: 6 total, 6 up, 6 in
Dec 06 10:25:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "format": "json"}]: dispatch
Dec 06 10:25:03 np0005548788.localdomain ceph-mon[293643]: osdmap e245: 6 total, 6 up, 6 in
Dec 06 10:25:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "format": "json"}]: dispatch
Dec 06 10:25:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:03 np0005548788.localdomain ceph-mon[293643]: osdmap e246: 6 total, 6 up, 6 in
Dec 06 10:25:03 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:25:03Z|00287|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Dec 06 10:25:04 np0005548788.localdomain ceph-mon[293643]: pgmap v555: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 95 KiB/s wr, 34 op/s
Dec 06 10:25:04 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "bd3fee73-c2d0-4d8d-bc68-793e93881208", "format": "json"}]: dispatch
Dec 06 10:25:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4089585165' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:04.251 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e246 do_prune osdmap full prune enabled
Dec 06 10:25:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e247 e247: 6 total, 6 up, 6 in
Dec 06 10:25:05 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e247: 6 total, 6 up, 6 in
Dec 06 10:25:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e247 do_prune osdmap full prune enabled
Dec 06 10:25:06 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e248 e248: 6 total, 6 up, 6 in
Dec 06 10:25:06 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e248: 6 total, 6 up, 6 in
Dec 06 10:25:06 np0005548788.localdomain ceph-mon[293643]: osdmap e247: 6 total, 6 up, 6 in
Dec 06 10:25:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "snap_name": "f5e12b35-c54a-4c08-b88c-95eb83ef6404", "format": "json"}]: dispatch
Dec 06 10:25:06 np0005548788.localdomain ceph-mon[293643]: pgmap v557: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 67 KiB/s wr, 39 op/s
Dec 06 10:25:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "auth_id": "david", "tenant_id": "438e893229f742e78fe8e62ef6ea17d5", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:25:06 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 06 10:25:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:06.396 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:07 np0005548788.localdomain ceph-mon[293643]: osdmap e248: 6 total, 6 up, 6 in
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:25:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:07 np0005548788.localdomain sudo[327050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:25:07 np0005548788.localdomain sudo[327050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:25:07 np0005548788.localdomain sudo[327050]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:07 np0005548788.localdomain sudo[327068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:25:07 np0005548788.localdomain sudo[327068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:25:08 np0005548788.localdomain sudo[327068]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:08 np0005548788.localdomain ceph-mon[293643]: pgmap v559: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 76 KiB/s wr, 45 op/s
Dec 06 10:25:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:25:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:25:08 np0005548788.localdomain sudo[327118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:25:08 np0005548788.localdomain sudo[327118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:25:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:25:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:25:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:25:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:25:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:25:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:25:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:25:08 np0005548788.localdomain sudo[327118]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e248 do_prune osdmap full prune enabled
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e249 e249: 6 total, 6 up, 6 in
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e249: 6 total, 6 up, 6 in
Dec 06 10:25:09 np0005548788.localdomain sshd[327136]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:25:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:09.254 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:09 np0005548788.localdomain sshd[327136]: error: kex_exchange_identification: banner line contains invalid characters
Dec 06 10:25:09 np0005548788.localdomain sshd[327136]: error: send_error: write: Broken pipe
Dec 06 10:25:09 np0005548788.localdomain sshd[327136]: banner exchange: Connection from 3.131.215.38 port 49984: invalid format
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "541781d2-de06-47af-8a8c-92b3f31186b7", "format": "json"}]: dispatch
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "snap_name": "f5e12b35-c54a-4c08-b88c-95eb83ef6404", "target_sub_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:25:09 np0005548788.localdomain ceph-mon[293643]: osdmap e249: 6 total, 6 up, 6 in
Dec 06 10:25:10 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e53: np0005548790.kvkfyr(active, since 13m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:25:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "auth_id": "david", "format": "json"}]: dispatch
Dec 06 10:25:10 np0005548788.localdomain ceph-mon[293643]: pgmap v561: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 34 KiB/s wr, 44 op/s
Dec 06 10:25:10 np0005548788.localdomain ceph-mon[293643]: mgrmap e53: np0005548790.kvkfyr(active, since 13m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:25:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:11.402 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e249 do_prune osdmap full prune enabled
Dec 06 10:25:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "auth_id": "david", "format": "json"}]: dispatch
Dec 06 10:25:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/4192589746' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e250 e250: 6 total, 6 up, 6 in
Dec 06 10:25:11 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e250: 6 total, 6 up, 6 in
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0)
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e250 do_prune osdmap full prune enabled
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: pgmap v562: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 33 KiB/s wr, 43 op/s
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: osdmap e250: 6 total, 6 up, 6 in
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e251 e251: 6 total, 6 up, 6 in
Dec 06 10:25:12 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e251: 6 total, 6 up, 6 in
Dec 06 10:25:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "da9cb17d-a418-4d10-ae1e-46dd9b618283", "format": "json"}]: dispatch
Dec 06 10:25:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "david", "format": "json"}]: dispatch
Dec 06 10:25:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "david", "format": "json"}]: dispatch
Dec 06 10:25:13 np0005548788.localdomain ceph-mon[293643]: osdmap e251: 6 total, 6 up, 6 in
Dec 06 10:25:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:14.256 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:14 np0005548788.localdomain ceph-mon[293643]: pgmap v565: 177 pgs: 177 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 156 KiB/s wr, 116 op/s
Dec 06 10:25:15 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:25:15 np0005548788.localdomain podman[327138]: 2025-12-06 10:25:15.273909613 +0000 UTC m=+0.093590630 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:25:15 np0005548788.localdomain podman[327138]: 2025-12-06 10:25:15.351750274 +0000 UTC m=+0.171431321 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:25:15 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:25:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e251 do_prune osdmap full prune enabled
Dec 06 10:25:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e252 e252: 6 total, 6 up, 6 in
Dec 06 10:25:15 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e252: 6 total, 6 up, 6 in
Dec 06 10:25:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:16.424 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:16 np0005548788.localdomain ceph-mon[293643]: pgmap v566: 177 pgs: 177 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 119 KiB/s wr, 69 op/s
Dec 06 10:25:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "da9cb17d-a418-4d10-ae1e-46dd9b618283_59a2f9e2-3ade-456e-8a51-63c6c5f92484", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "da9cb17d-a418-4d10-ae1e-46dd9b618283", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:16 np0005548788.localdomain ceph-mon[293643]: osdmap e252: 6 total, 6 up, 6 in
Dec 06 10:25:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "format": "json"}]: dispatch
Dec 06 10:25:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e252 do_prune osdmap full prune enabled
Dec 06 10:25:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e253 e253: 6 total, 6 up, 6 in
Dec 06 10:25:17 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e253: 6 total, 6 up, 6 in
Dec 06 10:25:17 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:17 np0005548788.localdomain ceph-mon[293643]: osdmap e253: 6 total, 6 up, 6 in
Dec 06 10:25:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:25:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:25:18 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:25:18 np0005548788.localdomain podman[327163]: 2025-12-06 10:25:18.275387885 +0000 UTC m=+0.092656571 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:25:18 np0005548788.localdomain podman[327165]: 2025-12-06 10:25:18.285883289 +0000 UTC m=+0.092986021 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Dec 06 10:25:18 np0005548788.localdomain podman[327163]: 2025-12-06 10:25:18.290188843 +0000 UTC m=+0.107457539 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:25:18 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:25:18 np0005548788.localdomain podman[327165]: 2025-12-06 10:25:18.303933579 +0000 UTC m=+0.111036381 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Dec 06 10:25:18 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:25:18 np0005548788.localdomain podman[327164]: 2025-12-06 10:25:18.388085985 +0000 UTC m=+0.199422798 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:25:18 np0005548788.localdomain podman[327164]: 2025-12-06 10:25:18.424863084 +0000 UTC m=+0.236199957 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:25:18 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:25:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e253 do_prune osdmap full prune enabled
Dec 06 10:25:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e254 e254: 6 total, 6 up, 6 in
Dec 06 10:25:18 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e254: 6 total, 6 up, 6 in
Dec 06 10:25:18 np0005548788.localdomain ceph-mon[293643]: pgmap v568: 177 pgs: 177 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 122 KiB/s wr, 71 op/s
Dec 06 10:25:18 np0005548788.localdomain ceph-mon[293643]: osdmap e254: 6 total, 6 up, 6 in
Dec 06 10:25:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:19.257 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:25:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:25:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:25:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:25:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:25:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19244 "" "Go-http-client/1.1"
Dec 06 10:25:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "541781d2-de06-47af-8a8c-92b3f31186b7_1ac163ca-79ea-43b6-8028-d76d24ca4cd1", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "541781d2-de06-47af-8a8c-92b3f31186b7", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "format": "json"}]: dispatch
Dec 06 10:25:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:19 np0005548788.localdomain ceph-mon[293643]: pgmap v571: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 64 KiB/s wr, 56 op/s
Dec 06 10:25:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e254 do_prune osdmap full prune enabled
Dec 06 10:25:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e255 e255: 6 total, 6 up, 6 in
Dec 06 10:25:20 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e255: 6 total, 6 up, 6 in
Dec 06 10:25:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:21.431 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:21.834 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:21 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:21.833 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:25:21 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:21.834 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:25:21 np0005548788.localdomain ceph-mon[293643]: osdmap e255: 6 total, 6 up, 6 in
Dec 06 10:25:21 np0005548788.localdomain ceph-mon[293643]: pgmap v573: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 70 KiB/s wr, 61 op/s
Dec 06 10:25:22 np0005548788.localdomain neutron_sriov_agent[255571]: 2025-12-06 10:25:22.657 2 INFO neutron.agent.securitygroups_rpc [req-63e33143-79ba-4452-b217-6b4868995963 req-6d925882-c432-4b30-bcfa-4ea2e9401f50 b40d497af0834616a664e6909c0f6685 b51f704fe6204487b0317c3332364cca - - default default] Security group member updated ['d407968b-b8de-45cd-a244-3bf62d3c0357']
Dec 06 10:25:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e255 do_prune osdmap full prune enabled
Dec 06 10:25:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e256 e256: 6 total, 6 up, 6 in
Dec 06 10:25:22 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e256: 6 total, 6 up, 6 in
Dec 06 10:25:22 np0005548788.localdomain dnsmasq[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/addn_hosts - 1 addresses
Dec 06 10:25:22 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/host
Dec 06 10:25:22 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/opts
Dec 06 10:25:22 np0005548788.localdomain podman[327241]: 2025-12-06 10:25:22.887884377 +0000 UTC m=+0.053126837 container kill e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e256 do_prune osdmap full prune enabled
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "bd3fee73-c2d0-4d8d-bc68-793e93881208_9652976b-d5e6-4b47-ae83-6f26b1212f0e", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "bd3fee73-c2d0-4d8d-bc68-793e93881208", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "format": "json"}]: dispatch
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: osdmap e256: 6 total, 6 up, 6 in
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3694335426' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e257 e257: 6 total, 6 up, 6 in
Dec 06 10:25:23 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e257: 6 total, 6 up, 6 in
Dec 06 10:25:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:24.258 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:24 np0005548788.localdomain ceph-mon[293643]: pgmap v575: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 169 KiB/s wr, 130 op/s
Dec 06 10:25:24 np0005548788.localdomain ceph-mon[293643]: osdmap e257: 6 total, 6 up, 6 in
Dec 06 10:25:25 np0005548788.localdomain sshd[327262]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:25:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e257 do_prune osdmap full prune enabled
Dec 06 10:25:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e258 e258: 6 total, 6 up, 6 in
Dec 06 10:25:25 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e258: 6 total, 6 up, 6 in
Dec 06 10:25:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:26.426 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e258 do_prune osdmap full prune enabled
Dec 06 10:25:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "9ca8d539-a020-426e-a70b-3e1ee1b92584_844e7d16-6de7-405a-b7f6-e1408c2dd627", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:26 np0005548788.localdomain ceph-mon[293643]: pgmap v577: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 95 KiB/s wr, 65 op/s
Dec 06 10:25:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "9ca8d539-a020-426e-a70b-3e1ee1b92584", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "admin", "format": "json"}]: dispatch
Dec 06 10:25:26 np0005548788.localdomain ceph-mon[293643]: osdmap e258: 6 total, 6 up, 6 in
Dec 06 10:25:26 np0005548788.localdomain sshd[327262]: Invalid user ubuntu from 43.163.93.82 port 53756
Dec 06 10:25:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e259 e259: 6 total, 6 up, 6 in
Dec 06 10:25:26 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e259: 6 total, 6 up, 6 in
Dec 06 10:25:27 np0005548788.localdomain sshd[327262]: Received disconnect from 43.163.93.82 port 53756:11:  [preauth]
Dec 06 10:25:27 np0005548788.localdomain sshd[327262]: Disconnected from invalid user ubuntu 43.163.93.82 port 53756 [preauth]
Dec 06 10:25:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e259 do_prune osdmap full prune enabled
Dec 06 10:25:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e260 e260: 6 total, 6 up, 6 in
Dec 06 10:25:27 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e260: 6 total, 6 up, 6 in
Dec 06 10:25:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "format": "json"}]: dispatch
Dec 06 10:25:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:27 np0005548788.localdomain ceph-mon[293643]: osdmap e259: 6 total, 6 up, 6 in
Dec 06 10:25:27 np0005548788.localdomain ceph-mon[293643]: osdmap e260: 6 total, 6 up, 6 in
Dec 06 10:25:28 np0005548788.localdomain ceph-mon[293643]: pgmap v580: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 123 KiB/s wr, 85 op/s
Dec 06 10:25:28 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2541831024' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:25:28 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/2541831024' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:25:28 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:28.836 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:25:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:25:29Z|00288|ovn_bfd|INFO|Disabled BFD on interface ovn-afa07b-0
Dec 06 10:25:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:25:29Z|00289|ovn_bfd|INFO|Disabled BFD on interface ovn-bd2a75-0
Dec 06 10:25:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:25:29Z|00290|ovn_bfd|INFO|Disabled BFD on interface ovn-ca3c1f-0
Dec 06 10:25:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:29.054 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:29.056 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:29.060 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:29.122 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:29 np0005548788.localdomain dnsmasq[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/addn_hosts - 0 addresses
Dec 06 10:25:29 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/host
Dec 06 10:25:29 np0005548788.localdomain podman[327284]: 2025-12-06 10:25:29.225402474 +0000 UTC m=+0.069326918 container kill e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:25:29 np0005548788.localdomain dnsmasq-dhcp[326247]: read /var/lib/neutron/dhcp/55ffc629-08a5-404f-87a7-26deb97840dc/opts
Dec 06 10:25:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:29.260 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:29.427 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:29 np0005548788.localdomain kernel: device tap16ab0353-4c left promiscuous mode
Dec 06 10:25:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:25:29Z|00291|binding|INFO|Releasing lport 16ab0353-4ca5-40a7-a86b-3994fc8722ca from this chassis (sb_readonly=0)
Dec 06 10:25:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:25:29Z|00292|binding|INFO|Setting lport 16ab0353-4ca5-40a7-a86b-3994fc8722ca down in Southbound
Dec 06 10:25:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:29.445 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:29.461 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-55ffc629-08a5-404f-87a7-26deb97840dc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55ffc629-08a5-404f-87a7-26deb97840dc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b51f704fe6204487b0317c3332364cca', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a555e286-25fe-4028-bbdb-d66a3efae4d1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=16ab0353-4ca5-40a7-a86b-3994fc8722ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:25:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:29.463 159620 INFO neutron.agent.ovn.metadata.agent [-] Port 16ab0353-4ca5-40a7-a86b-3994fc8722ca in datapath 55ffc629-08a5-404f-87a7-26deb97840dc unbound from our chassis
Dec 06 10:25:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:29.465 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55ffc629-08a5-404f-87a7-26deb97840dc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:25:29 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:29.470 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[178a21e4-65a2-42bb-aee5-61719b9fda08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:25:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "cdf82d1a-cb71-4bc8-a483-dd9f0663d1ab_60347acc-96b8-4ecf-9b3d-01ad73eeabab", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "cdf82d1a-cb71-4bc8-a483-dd9f0663d1ab", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:30 np0005548788.localdomain ceph-mon[293643]: pgmap v582: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 76 KiB/s wr, 61 op/s
Dec 06 10:25:31 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:25:31 np0005548788.localdomain podman[327307]: 2025-12-06 10:25:31.255380114 +0000 UTC m=+0.082605830 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:25:31 np0005548788.localdomain podman[327307]: 2025-12-06 10:25:31.272628138 +0000 UTC m=+0.099853844 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:25:31 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:25:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:31.429 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:31 np0005548788.localdomain dnsmasq[326247]: exiting on receipt of SIGTERM
Dec 06 10:25:31 np0005548788.localdomain podman[327341]: 2025-12-06 10:25:31.966689867 +0000 UTC m=+0.065112118 container kill e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:25:31 np0005548788.localdomain systemd[1]: libpod-e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0.scope: Deactivated successfully.
Dec 06 10:25:32 np0005548788.localdomain podman[327354]: 2025-12-06 10:25:32.048633695 +0000 UTC m=+0.063153827 container died e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: tmp-crun.8A7Bpa.mount: Deactivated successfully.
Dec 06 10:25:32 np0005548788.localdomain podman[327354]: 2025-12-06 10:25:32.097186939 +0000 UTC m=+0.111707041 container cleanup e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: libpod-conmon-e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0.scope: Deactivated successfully.
Dec 06 10:25:32 np0005548788.localdomain podman[327356]: 2025-12-06 10:25:32.179318683 +0000 UTC m=+0.188354765 container remove e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-55ffc629-08a5-404f-87a7-26deb97840dc, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:25:32 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:25:32.217 262572 INFO neutron.agent.dhcp.agent [None req-71390ec2-01a2-47d7-820d-f1ca1a960426 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:25:32 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:25:32.218 262572 INFO neutron.agent.dhcp.agent [None req-71390ec2-01a2-47d7-820d-f1ca1a960426 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-dada672f5a9f70b28563c1bdf508933570d30989fd9dcd820b78ff73781ae180-merged.mount: Deactivated successfully.
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e3394f627866cee1388b856b2784294134448142672f5a72cc3722c9638433d0-userdata-shm.mount: Deactivated successfully.
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d55ffc629\x2d08a5\x2d404f\x2d87a7\x2d26deb97840dc.mount: Deactivated successfully.
Dec 06 10:25:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:32.295 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:25:32 np0005548788.localdomain podman[327383]: 2025-12-06 10:25:32.425627843 +0000 UTC m=+0.104399965 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:25:32 np0005548788.localdomain podman[327383]: 2025-12-06 10:25:32.456293152 +0000 UTC m=+0.135065294 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:25:32 np0005548788.localdomain podman[327401]: 2025-12-06 10:25:32.519756288 +0000 UTC m=+0.084164228 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:25:32 np0005548788.localdomain podman[327401]: 2025-12-06 10:25:32.530215492 +0000 UTC m=+0.094623462 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:25:32 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:25:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e260 do_prune osdmap full prune enabled
Dec 06 10:25:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e261 e261: 6 total, 6 up, 6 in
Dec 06 10:25:32 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e261: 6 total, 6 up, 6 in
Dec 06 10:25:32 np0005548788.localdomain ceph-mon[293643]: pgmap v583: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 71 KiB/s wr, 57 op/s
Dec 06 10:25:32 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "format": "json"}]: dispatch
Dec 06 10:25:32 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:32 np0005548788.localdomain ceph-mon[293643]: osdmap e261: 6 total, 6 up, 6 in
Dec 06 10:25:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e261 do_prune osdmap full prune enabled
Dec 06 10:25:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e262 e262: 6 total, 6 up, 6 in
Dec 06 10:25:33 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e262: 6 total, 6 up, 6 in
Dec 06 10:25:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:34.262 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:34 np0005548788.localdomain ceph-mon[293643]: pgmap v585: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 123 KiB/s wr, 60 op/s
Dec 06 10:25:34 np0005548788.localdomain ceph-mon[293643]: osdmap e262: 6 total, 6 up, 6 in
Dec 06 10:25:35 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:35 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:25:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:36.432 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 5429 writes, 37K keys, 5428 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s
                                                           Cumulative WAL: 5429 writes, 5428 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2336 writes, 11K keys, 2335 commit groups, 1.0 writes per commit group, ingest: 12.62 MB, 0.02 MB/s
                                                           Interval WAL: 2336 writes, 2335 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    137.8      0.35              0.14        20    0.017       0      0       0.0       0.0
                                                             L6      1/0   18.30 MB   0.0      0.4     0.0      0.3       0.3      0.0       0.0   6.9    180.0    165.2      2.00              0.95        19    0.105    237K   9826       0.0       0.0
                                                            Sum      1/0   18.30 MB   0.0      0.4     0.0      0.3       0.4      0.1       0.0   7.9    153.3    161.2      2.34              1.09        39    0.060    237K   9826       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0  13.9    163.5    163.4      0.95              0.48        16    0.059    108K   4227       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.4     0.0      0.3       0.3      0.0       0.0   0.0    180.0    165.2      2.00              0.95        19    0.105    237K   9826       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    138.8      0.34              0.14        19    0.018       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.047, interval 0.011
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.37 GB write, 0.31 MB/s write, 0.35 GB read, 0.30 MB/s read, 2.3 seconds
                                                           Interval compaction: 0.15 GB write, 0.26 MB/s write, 0.15 GB read, 0.26 MB/s read, 0.9 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x564f1a09b350#2 capacity: 304.00 MB usage: 60.22 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000442 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(3947,58.71 MB,19.3121%) FilterBlock(39,677.92 KB,0.217774%) IndexBlock(39,866.73 KB,0.278428%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e262 do_prune osdmap full prune enabled
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: pgmap v587: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 106 KiB/s wr, 52 op/s
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e263 e263: 6 total, 6 up, 6 in
Dec 06 10:25:36 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e263: 6 total, 6 up, 6 in
Dec 06 10:25:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e263 do_prune osdmap full prune enabled
Dec 06 10:25:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e264 e264: 6 total, 6 up, 6 in
Dec 06 10:25:37 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e264: 6 total, 6 up, 6 in
Dec 06 10:25:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "format": "json"}]: dispatch
Dec 06 10:25:37 np0005548788.localdomain ceph-mon[293643]: osdmap e263: 6 total, 6 up, 6 in
Dec 06 10:25:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e264 do_prune osdmap full prune enabled
Dec 06 10:25:38 np0005548788.localdomain ceph-mon[293643]: pgmap v589: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.7 KiB/s rd, 63 KiB/s wr, 9 op/s
Dec 06 10:25:38 np0005548788.localdomain ceph-mon[293643]: osdmap e264: 6 total, 6 up, 6 in
Dec 06 10:25:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e265 e265: 6 total, 6 up, 6 in
Dec 06 10:25:38 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e265: 6 total, 6 up, 6 in
Dec 06 10:25:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:25:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:25:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:25:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:25:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:25:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:25:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:25:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:25:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:39.264 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:39 np0005548788.localdomain ceph-mon[293643]: osdmap e265: 6 total, 6 up, 6 in
Dec 06 10:25:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3052221500' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:25:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3052221500' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:25:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e265 do_prune osdmap full prune enabled
Dec 06 10:25:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e266 e266: 6 total, 6 up, 6 in
Dec 06 10:25:40 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e266: 6 total, 6 up, 6 in
Dec 06 10:25:40 np0005548788.localdomain ceph-mon[293643]: pgmap v592: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 51 KiB/s wr, 36 op/s
Dec 06 10:25:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 06 10:25:40 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/302515343' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:41.436 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e266 do_prune osdmap full prune enabled
Dec 06 10:25:41 np0005548788.localdomain ceph-mon[293643]: osdmap e266: 6 total, 6 up, 6 in
Dec 06 10:25:41 np0005548788.localdomain ceph-mon[293643]: pgmap v594: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 63 KiB/s wr, 45 op/s
Dec 06 10:25:41 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2812440448' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e267 e267: 6 total, 6 up, 6 in
Dec 06 10:25:41 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e267: 6 total, 6 up, 6 in
Dec 06 10:25:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:42.511 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:42.512 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:42.512 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:25:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e267 do_prune osdmap full prune enabled
Dec 06 10:25:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e268 e268: 6 total, 6 up, 6 in
Dec 06 10:25:42 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e268: 6 total, 6 up, 6 in
Dec 06 10:25:42 np0005548788.localdomain ceph-mon[293643]: osdmap e267: 6 total, 6 up, 6 in
Dec 06 10:25:42 np0005548788.localdomain ceph-mon[293643]: osdmap e268: 6 total, 6 up, 6 in
Dec 06 10:25:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e268 do_prune osdmap full prune enabled
Dec 06 10:25:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e269 e269: 6 total, 6 up, 6 in
Dec 06 10:25:43 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e269: 6 total, 6 up, 6 in
Dec 06 10:25:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:44.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:44.266 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:44 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "format": "json"}]: dispatch
Dec 06 10:25:44 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:44 np0005548788.localdomain ceph-mon[293643]: pgmap v597: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 76 KiB/s rd, 57 KiB/s wr, 105 op/s
Dec 06 10:25:44 np0005548788.localdomain ceph-mon[293643]: osdmap e269: 6 total, 6 up, 6 in
Dec 06 10:25:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:45 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:45 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1501997510' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:25:45 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1501997510' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:25:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:46.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:46 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:25:46 np0005548788.localdomain podman[327424]: 2025-12-06 10:25:46.261909985 +0000 UTC m=+0.084769887 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller)
Dec 06 10:25:46 np0005548788.localdomain podman[327424]: 2025-12-06 10:25:46.324142912 +0000 UTC m=+0.147002774 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:25:46 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:25:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:46.437 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:46 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "82dd5c85-b26d-4a99-a866-de66484ffe5c", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:46 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "82dd5c85-b26d-4a99-a866-de66484ffe5c", "format": "json"}]: dispatch
Dec 06 10:25:46 np0005548788.localdomain ceph-mon[293643]: pgmap v599: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 58 KiB/s wr, 106 op/s
Dec 06 10:25:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:47.445 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:47.446 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:25:47.446 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:25:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e269 do_prune osdmap full prune enabled
Dec 06 10:25:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e270 e270: 6 total, 6 up, 6 in
Dec 06 10:25:47 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e270: 6 total, 6 up, 6 in
Dec 06 10:25:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:48.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:48 np0005548788.localdomain ceph-mon[293643]: pgmap v600: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 43 KiB/s wr, 78 op/s
Dec 06 10:25:48 np0005548788.localdomain ceph-mon[293643]: osdmap e270: 6 total, 6 up, 6 in
Dec 06 10:25:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:49.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:49.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:25:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:49.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:25:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:49.027 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:25:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:25:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:25:49 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:25:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:49.269 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:49 np0005548788.localdomain podman[327449]: 2025-12-06 10:25:49.281899789 +0000 UTC m=+0.104770078 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:25:49 np0005548788.localdomain podman[327450]: 2025-12-06 10:25:49.324472857 +0000 UTC m=+0.143650621 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:25:49 np0005548788.localdomain podman[327450]: 2025-12-06 10:25:49.338637386 +0000 UTC m=+0.157815190 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:25:49 np0005548788.localdomain podman[327449]: 2025-12-06 10:25:49.346969274 +0000 UTC m=+0.169839553 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:25:49 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:25:49 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:25:49 np0005548788.localdomain podman[327451]: 2025-12-06 10:25:49.428790088 +0000 UTC m=+0.244008569 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public)
Dec 06 10:25:49 np0005548788.localdomain podman[327451]: 2025-12-06 10:25:49.446640912 +0000 UTC m=+0.261859423 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:25:49 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:25:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:25:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:25:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:25:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:25:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:25:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1"
Dec 06 10:25:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "format": "json"}]: dispatch
Dec 06 10:25:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3f0a96b5-7080-41d0-b8de-4c6a54c262e3", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:50.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:50.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3f0a96b5-7080-41d0-b8de-4c6a54c262e3", "format": "json"}]: dispatch
Dec 06 10:25:50 np0005548788.localdomain ceph-mon[293643]: pgmap v602: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 72 KiB/s rd, 81 KiB/s wr, 101 op/s
Dec 06 10:25:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:51.441 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e270 do_prune osdmap full prune enabled
Dec 06 10:25:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e271 e271: 6 total, 6 up, 6 in
Dec 06 10:25:52 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e271: 6 total, 6 up, 6 in
Dec 06 10:25:52 np0005548788.localdomain ceph-mon[293643]: pgmap v603: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 35 KiB/s wr, 24 op/s
Dec 06 10:25:52 np0005548788.localdomain ceph-mon[293643]: osdmap e271: 6 total, 6 up, 6 in
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.034 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.034 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.035 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.035 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.036 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:25:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:25:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2061225661' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.512 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.725 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.727 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11423MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.727 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.728 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Dec 06 10:25:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "82dd5c85-b26d-4a99-a866-de66484ffe5c", "format": "json"}]: dispatch
Dec 06 10:25:53 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "82dd5c85-b26d-4a99-a866-de66484ffe5c", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:53 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4186351195' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:53 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2061225661' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:53 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3506710697' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.800 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.800 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:25:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:53.828 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:25:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:54.272 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:25:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3655531887' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:54.291 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:25:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:54.297 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:25:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:54.318 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:25:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:54.320 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:25:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:54.321 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:25:54 np0005548788.localdomain ceph-mon[293643]: pgmap v605: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 94 KiB/s wr, 27 op/s
Dec 06 10:25:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3655531887' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:56.443 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "format": "json"}]: dispatch
Dec 06 10:25:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:56 np0005548788.localdomain ceph-mon[293643]: pgmap v606: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 94 KiB/s wr, 27 op/s
Dec 06 10:25:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3f0a96b5-7080-41d0-b8de-4c6a54c262e3", "format": "json"}]: dispatch
Dec 06 10:25:56 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3f0a96b5-7080-41d0-b8de-4c6a54c262e3", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:58 np0005548788.localdomain ceph-mon[293643]: pgmap v607: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 78 KiB/s wr, 22 op/s
Dec 06 10:25:58 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:25:59.275 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a6b976c3-30be-4a3d-bf97-d12650362891", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a6b976c3-30be-4a3d-bf97-d12650362891", "format": "json"}]: dispatch
Dec 06 10:26:00 np0005548788.localdomain ceph-mon[293643]: pgmap v608: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 87 KiB/s wr, 4 op/s
Dec 06 10:26:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:01.445 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:01 np0005548788.localdomain ceph-mon[293643]: pgmap v609: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 87 KiB/s wr, 4 op/s
Dec 06 10:26:02 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:26:02 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:02Z|00293|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 06 10:26:02 np0005548788.localdomain podman[327554]: 2025-12-06 10:26:02.257895793 +0000 UTC m=+0.086121609 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:26:02 np0005548788.localdomain podman[327554]: 2025-12-06 10:26:02.295696183 +0000 UTC m=+0.123921989 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:26:02 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:26:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:26:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:26:03 np0005548788.localdomain podman[327572]: 2025-12-06 10:26:03.264383809 +0000 UTC m=+0.089444282 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:26:03 np0005548788.localdomain podman[327572]: 2025-12-06 10:26:03.275979198 +0000 UTC m=+0.101039631 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:26:03 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:26:03 np0005548788.localdomain podman[327573]: 2025-12-06 10:26:03.367730871 +0000 UTC m=+0.190031478 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:26:03 np0005548788.localdomain podman[327573]: 2025-12-06 10:26:03.377303187 +0000 UTC m=+0.199603754 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:26:03 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:26:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a6b976c3-30be-4a3d-bf97-d12650362891", "format": "json"}]: dispatch
Dec 06 10:26:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a6b976c3-30be-4a3d-bf97-d12650362891", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:26:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:04.277 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:04 np0005548788.localdomain ceph-mon[293643]: pgmap v610: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 386 B/s rd, 117 KiB/s wr, 6 op/s
Dec 06 10:26:04 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7f7d12c1-956a-4451-b950-3662d25c7591", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:04 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7f7d12c1-956a-4451-b950-3662d25c7591", "format": "json"}]: dispatch
Dec 06 10:26:04 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:06.447 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:06 np0005548788.localdomain ceph-mon[293643]: pgmap v611: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 64 KiB/s wr, 3 op/s
Dec 06 10:26:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:26:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:26:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:07 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:08 np0005548788.localdomain ceph-mon[293643]: pgmap v612: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 64 KiB/s wr, 3 op/s
Dec 06 10:26:08 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "84cf61e2-4e49-410b-a14b-54abbb5a4b95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:08 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "84cf61e2-4e49-410b-a14b-54abbb5a4b95", "format": "json"}]: dispatch
Dec 06 10:26:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:26:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:26:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:26:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:26:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:26:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:26:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:26:09 np0005548788.localdomain sudo[327611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:26:09 np0005548788.localdomain sudo[327611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:26:09 np0005548788.localdomain sudo[327611]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:09 np0005548788.localdomain sudo[327629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:26:09 np0005548788.localdomain sudo[327629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:26:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:09.279 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "snap_name": "f5e12b35-c54a-4c08-b88c-95eb83ef6404_42a3d50e-9358-4b7a-9bc8-ccb63c964302", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "snap_name": "f5e12b35-c54a-4c08-b88c-95eb83ef6404", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:09 np0005548788.localdomain sudo[327629]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:26:10 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:26:10 np0005548788.localdomain sudo[327678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:26:10 np0005548788.localdomain sudo[327678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:26:10 np0005548788.localdomain sudo[327678]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:10 np0005548788.localdomain ceph-mon[293643]: pgmap v613: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 103 KiB/s wr, 6 op/s
Dec 06 10:26:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:26:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:26:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:26:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:26:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:26:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:11.449 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "87f760d1-7d00-4cc5-99b2-a7d946785f7e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "87f760d1-7d00-4cc5-99b2-a7d946785f7e", "format": "json"}]: dispatch
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: pgmap v614: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 70 KiB/s wr, 4 op/s
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "format": "json"}]: dispatch
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:12 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:26:12 np0005548788.localdomain sshd[327696]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:12 np0005548788.localdomain sshd[327696]: error: kex_exchange_identification: banner line contains invalid characters
Dec 06 10:26:12 np0005548788.localdomain sshd[327696]: banner exchange: Connection from 3.131.215.38 port 38462: invalid format
Dec 06 10:26:13 np0005548788.localdomain ceph-mon[293643]: pgmap v615: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 113 KiB/s wr, 7 op/s
Dec 06 10:26:13 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e54: np0005548790.kvkfyr(active, since 14m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:26:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:14.281 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:26:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:14 np0005548788.localdomain ceph-mon[293643]: mgrmap e54: np0005548790.kvkfyr(active, since 14m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:26:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ab908986-633b-4dd3-a1cc-27834b2adc14", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e271 do_prune osdmap full prune enabled
Dec 06 10:26:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e272 e272: 6 total, 6 up, 6 in
Dec 06 10:26:15 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e272: 6 total, 6 up, 6 in
Dec 06 10:26:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ab908986-633b-4dd3-a1cc-27834b2adc14", "format": "json"}]: dispatch
Dec 06 10:26:15 np0005548788.localdomain ceph-mon[293643]: pgmap v616: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 82 KiB/s wr, 5 op/s
Dec 06 10:26:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:16.451 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:16 np0005548788.localdomain ceph-mon[293643]: osdmap e272: 6 total, 6 up, 6 in
Dec 06 10:26:17 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:26:17 np0005548788.localdomain podman[327697]: 2025-12-06 10:26:17.271618268 +0000 UTC m=+0.096344755 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:26:17 np0005548788.localdomain podman[327697]: 2025-12-06 10:26:17.328417318 +0000 UTC m=+0.153143815 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:26:17 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:26:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:18 np0005548788.localdomain ceph-mon[293643]: pgmap v618: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 99 KiB/s wr, 6 op/s
Dec 06 10:26:19 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:26:19 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:19.283 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:26:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:26:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:26:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:26:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:26:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1"
Dec 06 10:26:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:26:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:26:20 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:26:20 np0005548788.localdomain systemd[1]: tmp-crun.UWErUX.mount: Deactivated successfully.
Dec 06 10:26:20 np0005548788.localdomain podman[327724]: 2025-12-06 10:26:20.263317677 +0000 UTC m=+0.075066656 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 10:26:20 np0005548788.localdomain podman[327722]: 2025-12-06 10:26:20.329027703 +0000 UTC m=+0.150701759 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:26:20 np0005548788.localdomain podman[327723]: 2025-12-06 10:26:20.292798711 +0000 UTC m=+0.110269947 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:26:20 np0005548788.localdomain podman[327724]: 2025-12-06 10:26:20.348716463 +0000 UTC m=+0.160465492 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:26:20 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:26:20 np0005548788.localdomain podman[327722]: 2025-12-06 10:26:20.370127596 +0000 UTC m=+0.191801662 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:26:20 np0005548788.localdomain podman[327723]: 2025-12-06 10:26:20.377847315 +0000 UTC m=+0.195318501 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:26:20 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:26:20 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:26:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "format": "json"}]: dispatch
Dec 06 10:26:20 np0005548788.localdomain ceph-mon[293643]: pgmap v619: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 78 KiB/s wr, 5 op/s
Dec 06 10:26:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ab908986-633b-4dd3-a1cc-27834b2adc14", "format": "json"}]: dispatch
Dec 06 10:26:20 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ab908986-633b-4dd3-a1cc-27834b2adc14", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:21.477 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:22.245 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:26:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:22.246 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:26:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:22.247 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:22 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:26:22.707 262572 INFO neutron.agent.linux.ip_lib [None req-77ee448e-80b8-46f0-a77c-c6217d4634b7 - - - - - -] Device tapf1015743-48 cannot be used as it has no MAC address
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e272 do_prune osdmap full prune enabled
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e273 e273: 6 total, 6 up, 6 in
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e273: 6 total, 6 up, 6 in
Dec 06 10:26:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:22.770 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:22 np0005548788.localdomain kernel: device tapf1015743-48 entered promiscuous mode
Dec 06 10:26:22 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016782.7823] manager: (tapf1015743-48): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Dec 06 10:26:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:22.780 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:22 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:22Z|00294|binding|INFO|Claiming lport f1015743-4855-4add-ab03-88793d49dc10 for this chassis.
Dec 06 10:26:22 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:22Z|00295|binding|INFO|f1015743-4855-4add-ab03-88793d49dc10: Claiming unknown
Dec 06 10:26:22 np0005548788.localdomain systemd-udevd[327793]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:26:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:22.797 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3342399b00ae40b48123295a9604de67', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c9353f2-a356-48e8-b32e-df81baab7fff, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=f1015743-4855-4add-ab03-88793d49dc10) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:26:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:22.799 159620 INFO neutron.agent.ovn.metadata.agent [-] Port f1015743-4855-4add-ab03-88793d49dc10 in datapath 444cfc7e-454e-46f1-b7a7-cf9f7b6307a6 bound to our chassis
Dec 06 10:26:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:22.801 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port c30dc39e-c800-439e-af36-1fea02e78721 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:26:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:22.801 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:26:22 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:22.803 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[f467b356-d25a-4aba-a5be-10e68e15c741]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:26:22 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapf1015743-48: No such device
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: pgmap v620: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 78 KiB/s wr, 5 op/s
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:22 np0005548788.localdomain ceph-mon[293643]: osdmap e273: 6 total, 6 up, 6 in
Dec 06 10:26:22 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:22Z|00296|binding|INFO|Setting lport f1015743-4855-4add-ab03-88793d49dc10 ovn-installed in OVS
Dec 06 10:26:22 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:22Z|00297|binding|INFO|Setting lport f1015743-4855-4add-ab03-88793d49dc10 up in Southbound
Dec 06 10:26:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:22.817 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:22 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapf1015743-48: No such device
Dec 06 10:26:22 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapf1015743-48: No such device
Dec 06 10:26:22 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapf1015743-48: No such device
Dec 06 10:26:22 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapf1015743-48: No such device
Dec 06 10:26:22 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapf1015743-48: No such device
Dec 06 10:26:22 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapf1015743-48: No such device
Dec 06 10:26:22 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapf1015743-48: No such device
Dec 06 10:26:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:22.856 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:22.893 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "87f760d1-7d00-4cc5-99b2-a7d946785f7e", "format": "json"}]: dispatch
Dec 06 10:26:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "87f760d1-7d00-4cc5-99b2-a7d946785f7e", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve49", "tenant_id": "e0991b50d433489b9122b5c71fdb2883", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:23 np0005548788.localdomain podman[327865]: 
Dec 06 10:26:23 np0005548788.localdomain podman[327865]: 2025-12-06 10:26:23.859049037 +0000 UTC m=+0.093300172 container create 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 10:26:23 np0005548788.localdomain systemd[1]: Started libpod-conmon-764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4.scope.
Dec 06 10:26:23 np0005548788.localdomain podman[327865]: 2025-12-06 10:26:23.815395034 +0000 UTC m=+0.049646169 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:26:23 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:26:23 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f83954fde62e3e9d25e17050f638ec921ff90ab4c92ab0cb340b51112a6c64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:26:23 np0005548788.localdomain podman[327865]: 2025-12-06 10:26:23.94566982 +0000 UTC m=+0.179920945 container init 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:26:23 np0005548788.localdomain podman[327865]: 2025-12-06 10:26:23.960499529 +0000 UTC m=+0.194750654 container start 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:26:23 np0005548788.localdomain dnsmasq[327883]: started, version 2.85 cachesize 150
Dec 06 10:26:23 np0005548788.localdomain dnsmasq[327883]: DNS service limited to local subnets
Dec 06 10:26:23 np0005548788.localdomain dnsmasq[327883]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:26:23 np0005548788.localdomain dnsmasq[327883]: warning: no upstream servers configured
Dec 06 10:26:23 np0005548788.localdomain dnsmasq-dhcp[327883]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:26:23 np0005548788.localdomain dnsmasq[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/addn_hosts - 0 addresses
Dec 06 10:26:23 np0005548788.localdomain dnsmasq-dhcp[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/host
Dec 06 10:26:23 np0005548788.localdomain dnsmasq-dhcp[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/opts
Dec 06 10:26:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:24.286 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:24 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:26:24.381 262572 INFO neutron.agent.dhcp.agent [None req-b62a58b3-f1f5-4275-9ef6-1965499daf8b - - - - - -] DHCP configuration for ports {'282ddb29-b699-4edd-b7b3-472e997868f3'} is completed
Dec 06 10:26:24 np0005548788.localdomain ceph-mon[293643]: pgmap v622: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 130 KiB/s wr, 7 op/s
Dec 06 10:26:24 np0005548788.localdomain systemd[1]: tmp-crun.setWOp.mount: Deactivated successfully.
Dec 06 10:26:25 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:26:25.209 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:26:24Z, description=, device_id=b0de0aa3-0513-45a7-a160-43d6176211a5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6641580>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6641b50>], id=537706fe-436e-4436-ac8a-25a115a38c10, ip_allocation=immediate, mac_address=fa:16:3e:37:70:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:26:20Z, description=, dns_domain=, id=444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-217253886-network, port_security_enabled=True, project_id=3342399b00ae40b48123295a9604de67, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4935, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3746, status=ACTIVE, subnets=['d2e2b4c5-4a7f-4303-b7c0-223b603a33e1'], tags=[], tenant_id=3342399b00ae40b48123295a9604de67, updated_at=2025-12-06T10:26:21Z, vlan_transparent=None, network_id=444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, port_security_enabled=False, project_id=3342399b00ae40b48123295a9604de67, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3759, status=DOWN, tags=[], tenant_id=3342399b00ae40b48123295a9604de67, updated_at=2025-12-06T10:26:24Z on network 444cfc7e-454e-46f1-b7a7-cf9f7b6307a6
Dec 06 10:26:25 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:25.250 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:26:25 np0005548788.localdomain dnsmasq[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/addn_hosts - 1 addresses
Dec 06 10:26:25 np0005548788.localdomain dnsmasq-dhcp[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/host
Dec 06 10:26:25 np0005548788.localdomain dnsmasq-dhcp[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/opts
Dec 06 10:26:25 np0005548788.localdomain podman[327901]: 2025-12-06 10:26:25.481324907 +0000 UTC m=+0.065680426 container kill 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:26:25 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:26:25.739 262572 INFO neutron.agent.dhcp.agent [None req-03d33e6c-7fb3-4cd4-832f-3f6658c5e5b4 - - - - - -] DHCP configuration for ports {'537706fe-436e-4436-ac8a-25a115a38c10'} is completed
Dec 06 10:26:25 np0005548788.localdomain sshd[327922]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:25 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Dec 06 10:26:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:26:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:25 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:26:25.940 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:26:24Z, description=, device_id=b0de0aa3-0513-45a7-a160-43d6176211a5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c65ea340>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c67ad910>], id=537706fe-436e-4436-ac8a-25a115a38c10, ip_allocation=immediate, mac_address=fa:16:3e:37:70:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:26:20Z, description=, dns_domain=, id=444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-217253886-network, port_security_enabled=True, project_id=3342399b00ae40b48123295a9604de67, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4935, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3746, status=ACTIVE, subnets=['d2e2b4c5-4a7f-4303-b7c0-223b603a33e1'], tags=[], tenant_id=3342399b00ae40b48123295a9604de67, updated_at=2025-12-06T10:26:21Z, vlan_transparent=None, network_id=444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, port_security_enabled=False, project_id=3342399b00ae40b48123295a9604de67, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3759, status=DOWN, tags=[], tenant_id=3342399b00ae40b48123295a9604de67, updated_at=2025-12-06T10:26:24Z on network 444cfc7e-454e-46f1-b7a7-cf9f7b6307a6
Dec 06 10:26:26 np0005548788.localdomain podman[327942]: 2025-12-06 10:26:26.167014807 +0000 UTC m=+0.063954692 container kill 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:26:26 np0005548788.localdomain dnsmasq[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/addn_hosts - 1 addresses
Dec 06 10:26:26 np0005548788.localdomain dnsmasq-dhcp[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/host
Dec 06 10:26:26 np0005548788.localdomain dnsmasq-dhcp[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/opts
Dec 06 10:26:26 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:26:26.464 262572 INFO neutron.agent.dhcp.agent [None req-3be584fd-9b2e-415a-a0dc-d5fbf59ff7b6 - - - - - -] DHCP configuration for ports {'537706fe-436e-4436-ac8a-25a115a38c10'} is completed
Dec 06 10:26:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:26.478 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:26 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:26Z|00298|ovn_bfd|INFO|Enabled BFD on interface ovn-afa07b-0
Dec 06 10:26:26 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:26Z|00299|ovn_bfd|INFO|Enabled BFD on interface ovn-bd2a75-0
Dec 06 10:26:26 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:26Z|00300|ovn_bfd|INFO|Enabled BFD on interface ovn-ca3c1f-0
Dec 06 10:26:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:26.678 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:26.692 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:26.697 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:26.703 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:26.778 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:26.783 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:26 np0005548788.localdomain ceph-mon[293643]: pgmap v623: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 435 B/s rd, 111 KiB/s wr, 6 op/s
Dec 06 10:26:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve48", "tenant_id": "e0991b50d433489b9122b5c71fdb2883", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "84cf61e2-4e49-410b-a14b-54abbb5a4b95", "format": "json"}]: dispatch
Dec 06 10:26:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "84cf61e2-4e49-410b-a14b-54abbb5a4b95", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:27.723 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:27.756 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:27 np0005548788.localdomain sshd[327964]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:28.513 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:28 np0005548788.localdomain ceph-mon[293643]: pgmap v624: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 104 KiB/s wr, 5 op/s
Dec 06 10:26:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:29.287 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:29 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 06 10:26:29 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0)
Dec 06 10:26:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Dec 06 10:26:29 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Dec 06 10:26:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Dec 06 10:26:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Dec 06 10:26:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Dec 06 10:26:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Dec 06 10:26:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:26:30 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7f7d12c1-956a-4451-b950-3662d25c7591", "format": "json"}]: dispatch
Dec 06 10:26:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7f7d12c1-956a-4451-b950-3662d25c7591", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve48", "format": "json"}]: dispatch
Dec 06 10:26:30 np0005548788.localdomain ceph-mon[293643]: pgmap v625: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 126 KiB/s wr, 7 op/s
Dec 06 10:26:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve48", "format": "json"}]: dispatch
Dec 06 10:26:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:31.483 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "format": "json"}]: dispatch
Dec 06 10:26:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:26:32 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:32 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:26:32 np0005548788.localdomain ceph-mon[293643]: pgmap v626: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 126 KiB/s wr, 7 op/s
Dec 06 10:26:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Dec 06 10:26:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:32 np0005548788.localdomain podman[327966]: 2025-12-06 10:26:32.937337779 +0000 UTC m=+0.082917119 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:26:32 np0005548788.localdomain podman[327966]: 2025-12-06 10:26:32.95351552 +0000 UTC m=+0.099094850 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:26:32 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve47", "tenant_id": "e0991b50d433489b9122b5c71fdb2883", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:33 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:26:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:26:34 np0005548788.localdomain systemd[1]: tmp-crun.9OCruN.mount: Deactivated successfully.
Dec 06 10:26:34 np0005548788.localdomain podman[327988]: 2025-12-06 10:26:34.277685247 +0000 UTC m=+0.095335784 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:26:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:34.289 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:34 np0005548788.localdomain podman[327987]: 2025-12-06 10:26:34.324852088 +0000 UTC m=+0.146027584 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:26:34 np0005548788.localdomain podman[327988]: 2025-12-06 10:26:34.337269372 +0000 UTC m=+0.154919919 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:26:34 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:26:34 np0005548788.localdomain podman[327987]: 2025-12-06 10:26:34.363619388 +0000 UTC m=+0.184794874 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:26:34 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:26:34 np0005548788.localdomain ceph-mon[293643]: pgmap v627: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 870 B/s rd, 208 KiB/s wr, 13 op/s
Dec 06 10:26:34 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:35 np0005548788.localdomain ceph-mon[293643]: pgmap v628: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 119 KiB/s wr, 8 op/s
Dec 06 10:26:36 np0005548788.localdomain sshd[327922]: Connection closed by 3.131.215.38 port 46786 [preauth]
Dec 06 10:26:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:36Z|00301|ovn_bfd|INFO|Disabled BFD on interface ovn-afa07b-0
Dec 06 10:26:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:36Z|00302|ovn_bfd|INFO|Disabled BFD on interface ovn-bd2a75-0
Dec 06 10:26:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:36Z|00303|ovn_bfd|INFO|Disabled BFD on interface ovn-ca3c1f-0
Dec 06 10:26:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:36.482 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:36.484 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:36.499 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:36.505 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0)
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Dec 06 10:26:36 np0005548788.localdomain dnsmasq[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/addn_hosts - 0 addresses
Dec 06 10:26:36 np0005548788.localdomain dnsmasq-dhcp[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/host
Dec 06 10:26:36 np0005548788.localdomain dnsmasq-dhcp[327883]: read /var/lib/neutron/dhcp/444cfc7e-454e-46f1-b7a7-cf9f7b6307a6/opts
Dec 06 10:26:36 np0005548788.localdomain podman[328047]: 2025-12-06 10:26:36.644542562 +0000 UTC m=+0.064024155 container kill 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:26:36 np0005548788.localdomain kernel: device tapf1015743-48 left promiscuous mode
Dec 06 10:26:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:36.867 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:36Z|00304|binding|INFO|Releasing lport f1015743-4855-4add-ab03-88793d49dc10 from this chassis (sb_readonly=0)
Dec 06 10:26:36 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:26:36Z|00305|binding|INFO|Setting lport f1015743-4855-4add-ab03-88793d49dc10 down in Southbound
Dec 06 10:26:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:36.875 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3342399b00ae40b48123295a9604de67', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c9353f2-a356-48e8-b32e-df81baab7fff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=f1015743-4855-4add-ab03-88793d49dc10) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:26:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:36.877 159620 INFO neutron.agent.ovn.metadata.agent [-] Port f1015743-4855-4add-ab03-88793d49dc10 in datapath 444cfc7e-454e-46f1-b7a7-cf9f7b6307a6 unbound from our chassis
Dec 06 10:26:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:36.880 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:26:36 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:36.881 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[b52636a4-0432-4cff-abe9-ffa21a1c7740]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:26:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:36.900 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve47", "format": "json"}]: dispatch
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Dec 06 10:26:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve47", "format": "json"}]: dispatch
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: pgmap v629: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 119 KiB/s wr, 8 op/s
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:26:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:26:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:26:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:26:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:26:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:26:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:26:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:26:38 np0005548788.localdomain dnsmasq[327883]: exiting on receipt of SIGTERM
Dec 06 10:26:38 np0005548788.localdomain podman[328085]: 2025-12-06 10:26:38.908258261 +0000 UTC m=+0.086840571 container kill 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:26:38 np0005548788.localdomain systemd[1]: libpod-764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4.scope: Deactivated successfully.
Dec 06 10:26:39 np0005548788.localdomain podman[328099]: 2025-12-06 10:26:39.004881624 +0000 UTC m=+0.080930448 container died 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:26:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4-userdata-shm.mount: Deactivated successfully.
Dec 06 10:26:39 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-e6f83954fde62e3e9d25e17050f638ec921ff90ab4c92ab0cb340b51112a6c64-merged.mount: Deactivated successfully.
Dec 06 10:26:39 np0005548788.localdomain podman[328099]: 2025-12-06 10:26:39.036000277 +0000 UTC m=+0.112049061 container cleanup 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:26:39 np0005548788.localdomain systemd[1]: libpod-conmon-764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4.scope: Deactivated successfully.
Dec 06 10:26:39 np0005548788.localdomain podman[328106]: 2025-12-06 10:26:39.084117379 +0000 UTC m=+0.144360613 container remove 764c549df9091f945c194e379015650e767432ecb61cf8fdce45d84a693190f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-444cfc7e-454e-46f1-b7a7-cf9f7b6307a6, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:26:39 np0005548788.localdomain sshd[327964]: Received disconnect from 45.78.194.186 port 52188:11: Bye Bye [preauth]
Dec 06 10:26:39 np0005548788.localdomain sshd[327964]: Disconnected from authenticating user root 45.78.194.186 port 52188 [preauth]
Dec 06 10:26:39 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:26:39.245 262572 INFO neutron.agent.dhcp.agent [None req-02eb6a66-45f4-4bc1-8ace-f9b928a94d1c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:26:39 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:26:39.246 262572 INFO neutron.agent.dhcp.agent [None req-02eb6a66-45f4-4bc1-8ace-f9b928a94d1c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:26:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:39.290 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3136332362' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:26:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3136332362' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:26:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:39.423 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:39 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d444cfc7e\x2d454e\x2d46f1\x2db7a7\x2dcf9f7b6307a6.mount: Deactivated successfully.
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: pgmap v630: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 183 KiB/s wr, 12 op/s
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0)
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve49", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve49", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:41.504 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "format": "json"}]: dispatch
Dec 06 10:26:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:42 np0005548788.localdomain ceph-mon[293643]: pgmap v631: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 142 KiB/s wr, 9 op/s
Dec 06 10:26:42 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1615917791' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:43.323 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:43.323 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:43.324 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:26:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/4068335658' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:43 np0005548788.localdomain ceph-mon[293643]: pgmap v632: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 207 KiB/s wr, 14 op/s
Dec 06 10:26:43 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "snap_name": "1d0736b7-9f9a-4eab-ab98-cf736d1a3e1a", "format": "json"}]: dispatch
Dec 06 10:26:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Dec 06 10:26:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:43 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:26:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:44.291 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:44.919235) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016804919394, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2422, "num_deletes": 266, "total_data_size": 3101440, "memory_usage": 3150848, "flush_reason": "Manual Compaction"}
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016804938753, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3017632, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36357, "largest_seqno": 38778, "table_properties": {"data_size": 3007152, "index_size": 6537, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25415, "raw_average_key_size": 22, "raw_value_size": 2984935, "raw_average_value_size": 2613, "num_data_blocks": 280, "num_entries": 1142, "num_filter_entries": 1142, "num_deletions": 266, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016677, "oldest_key_time": 1765016677, "file_creation_time": 1765016804, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 19594 microseconds, and 7644 cpu microseconds.
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:44.938878) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3017632 bytes OK
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:44.938936) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:44.941094) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:44.941150) EVENT_LOG_v1 {"time_micros": 1765016804941142, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:44.941178) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 3090616, prev total WAL file size 3090616, number of live WAL files 2.
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:44.942385) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(2946KB)], [69(18MB)]
Dec 06 10:26:44 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016804942459, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 22205308, "oldest_snapshot_seqno": -1}
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 13940 keys, 20512793 bytes, temperature: kUnknown
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016805048923, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 20512793, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20432489, "index_size": 44363, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34885, "raw_key_size": 375669, "raw_average_key_size": 26, "raw_value_size": 20194859, "raw_average_value_size": 1448, "num_data_blocks": 1642, "num_entries": 13940, "num_filter_entries": 13940, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016804, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:45.049330) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 20512793 bytes
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:45.051406) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.8 rd, 192.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 18.3 +0.0 blob) out(19.6 +0.0 blob), read-write-amplify(14.2) write-amplify(6.8) OK, records in: 14487, records dropped: 547 output_compression: NoCompression
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:45.051438) EVENT_LOG_v1 {"time_micros": 1765016805051424, "job": 42, "event": "compaction_finished", "compaction_time_micros": 106352, "compaction_time_cpu_micros": 54457, "output_level": 6, "num_output_files": 1, "total_output_size": 20512793, "num_input_records": 14487, "num_output_records": 13940, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016805052117, "job": 42, "event": "table_file_deletion", "file_number": 71}
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016805055053, "job": 42, "event": "table_file_deletion", "file_number": 69}
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:44.942284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:45.055282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:45.055294) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:45.055298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:45.055301) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:26:45.055304) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:46.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:46.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:46.507 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: pgmap v633: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 129 KiB/s wr, 9 op/s
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:47.446 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:26:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:47.447 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:26:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:26:47.447 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:47 np0005548788.localdomain ceph-mgr[286998]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3354697053
Dec 06 10:26:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:48 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:26:48 np0005548788.localdomain podman[328128]: 2025-12-06 10:26:48.239796839 +0000 UTC m=+0.069127083 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:26:48 np0005548788.localdomain podman[328128]: 2025-12-06 10:26:48.284747941 +0000 UTC m=+0.114078145 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:26:48 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:26:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:48 np0005548788.localdomain ceph-mon[293643]: pgmap v634: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 129 KiB/s wr, 9 op/s
Dec 06 10:26:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:49.293 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:26:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:26:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:26:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:26:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:26:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18770 "" "Go-http-client/1.1"
Dec 06 10:26:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "snap_name": "1d0736b7-9f9a-4eab-ab98-cf736d1a3e1a_5759c99a-9a0c-4c9b-8de2-da85e4830d9b", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "snap_name": "1d0736b7-9f9a-4eab-ab98-cf736d1a3e1a", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:50.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:50.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:26:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:50.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:26:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:50.021 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:26:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:50.021 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:50 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 06 10:26:50 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:50 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:26:50 np0005548788.localdomain ceph-mon[293643]: pgmap v635: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 191 KiB/s wr, 13 op/s
Dec 06 10:26:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:26:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:50 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:26:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:51.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:51.027 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:51.027 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:26:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:26:51 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:26:51 np0005548788.localdomain podman[328154]: 2025-12-06 10:26:51.262412735 +0000 UTC m=+0.081940719 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:26:51 np0005548788.localdomain podman[328154]: 2025-12-06 10:26:51.274782118 +0000 UTC m=+0.094310112 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:26:51 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:26:51 np0005548788.localdomain podman[328153]: 2025-12-06 10:26:51.366015514 +0000 UTC m=+0.191180503 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:26:51 np0005548788.localdomain podman[328155]: 2025-12-06 10:26:51.329985648 +0000 UTC m=+0.146216450 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:26:51 np0005548788.localdomain podman[328153]: 2025-12-06 10:26:51.40688409 +0000 UTC m=+0.232049019 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:26:51 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:26:51 np0005548788.localdomain podman[328155]: 2025-12-06 10:26:51.463673669 +0000 UTC m=+0.279904471 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:26:51 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:26:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:51.508 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:51 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:26:51 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:26:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:52 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "format": "json"}]: dispatch
Dec 06 10:26:52 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:52 np0005548788.localdomain ceph-mon[293643]: pgmap v636: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 126 KiB/s wr, 9 op/s
Dec 06 10:26:52 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1038771449' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:26:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:53 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1752695086' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:26:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:53 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:54.295 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:54 np0005548788.localdomain ceph-mon[293643]: pgmap v637: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 177 KiB/s wr, 13 op/s
Dec 06 10:26:54 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.145 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.146 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.146 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.147 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.147 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:26:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:26:55 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1759336759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.612 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.810 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.812 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11410MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.813 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.813 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:26:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e273 do_prune osdmap full prune enabled
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.898 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.899 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:26:55 np0005548788.localdomain ceph-mon[293643]: pgmap v638: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 112 KiB/s wr, 7 op/s
Dec 06 10:26:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1759336759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e274 e274: 6 total, 6 up, 6 in
Dec 06 10:26:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:55.923 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:26:55 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e274: 6 total, 6 up, 6 in
Dec 06 10:26:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:26:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/792248445' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:56.381 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:26:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:56.389 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:26:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:56.410 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:26:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:56.412 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:26:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:56.412 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:56.509 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:56 np0005548788.localdomain ceph-mon[293643]: osdmap e274: 6 total, 6 up, 6 in
Dec 06 10:26:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/792248445' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: pgmap v640: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 134 KiB/s wr, 9 op/s
Dec 06 10:26:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:26:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:26:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:26:59.298 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1be4d0fc-02f4-4c42-ae8f-f48f51b28e7a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:00 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1be4d0fc-02f4-4c42-ae8f-f48f51b28e7a", "format": "json"}]: dispatch
Dec 06 10:27:00 np0005548788.localdomain ceph-mon[293643]: pgmap v641: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 117 KiB/s wr, 8 op/s
Dec 06 10:27:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:01.535 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:01 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:27:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e274 do_prune osdmap full prune enabled
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e275 e275: 6 total, 6 up, 6 in
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e275: 6 total, 6 up, 6 in
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: pgmap v642: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 117 KiB/s wr, 8 op/s
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:02 np0005548788.localdomain ceph-mon[293643]: osdmap e275: 6 total, 6 up, 6 in
Dec 06 10:27:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:27:03 np0005548788.localdomain podman[328261]: 2025-12-06 10:27:03.263910154 +0000 UTC m=+0.087192682 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 06 10:27:03 np0005548788.localdomain podman[328261]: 2025-12-06 10:27:03.278773564 +0000 UTC m=+0.102056092 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:27:03 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:27:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "format": "json"}]: dispatch
Dec 06 10:27:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c48080d3-3662-400d-9fd3-efdeb2cd8e3f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c48080d3-3662-400d-9fd3-efdeb2cd8e3f", "format": "json"}]: dispatch
Dec 06 10:27:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 06 10:27:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:04.303 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:04 np0005548788.localdomain ceph-mon[293643]: pgmap v644: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 172 KiB/s wr, 11 op/s
Dec 06 10:27:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:27:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:27:05 np0005548788.localdomain podman[328280]: 2025-12-06 10:27:05.27473852 +0000 UTC m=+0.093570660 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:27:05 np0005548788.localdomain podman[328281]: 2025-12-06 10:27:05.318309179 +0000 UTC m=+0.134155556 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 06 10:27:05 np0005548788.localdomain podman[328280]: 2025-12-06 10:27:05.337651429 +0000 UTC m=+0.156483539 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:27:05 np0005548788.localdomain podman[328281]: 2025-12-06 10:27:05.349387502 +0000 UTC m=+0.165233799 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 10:27:05 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:27:05 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:27:05 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:05 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:06.537 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "snap_name": "1034db86-0931-4bff-b9cd-e904ae3ce178", "format": "json"}]: dispatch
Dec 06 10:27:06 np0005548788.localdomain ceph-mon[293643]: pgmap v645: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 325 B/s rd, 146 KiB/s wr, 10 op/s
Dec 06 10:27:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c48080d3-3662-400d-9fd3-efdeb2cd8e3f", "format": "json"}]: dispatch
Dec 06 10:27:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c48080d3-3662-400d-9fd3-efdeb2cd8e3f", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.501 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:27:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: pgmap v646: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 138 KiB/s wr, 9 op/s
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:27:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:27:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:27:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:27:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:27:08 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "snap_name": "1034db86-0931-4bff-b9cd-e904ae3ce178_8e2229e7-2c10-4e6d-9970-a62905b25ae2", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:08 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "snap_name": "1034db86-0931-4bff-b9cd-e904ae3ce178", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:09.304 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1be4d0fc-02f4-4c42-ae8f-f48f51b28e7a", "format": "json"}]: dispatch
Dec 06 10:27:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1be4d0fc-02f4-4c42-ae8f-f48f51b28e7a", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:09 np0005548788.localdomain ceph-mon[293643]: pgmap v647: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 168 KiB/s wr, 9 op/s
Dec 06 10:27:10 np0005548788.localdomain sudo[328319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:27:10 np0005548788.localdomain sudo[328319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:27:10 np0005548788.localdomain sudo[328319]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:10 np0005548788.localdomain sudo[328337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:27:10 np0005548788.localdomain sudo[328337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:27:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e275 do_prune osdmap full prune enabled
Dec 06 10:27:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e276 e276: 6 total, 6 up, 6 in
Dec 06 10:27:10 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e276: 6 total, 6 up, 6 in
Dec 06 10:27:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 06 10:27:10 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:11 np0005548788.localdomain sudo[328337]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:27:11 np0005548788.localdomain sudo[328386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:27:11 np0005548788.localdomain sudo[328386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:27:11 np0005548788.localdomain sudo[328386]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:11.538 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: osdmap e276: 6 total, 6 up, 6 in
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: pgmap v649: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 239 B/s rd, 197 KiB/s wr, 11 op/s
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "format": "json"}]: dispatch
Dec 06 10:27:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:27:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:27:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:27:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:12 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:27:12Z|00306|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Dec 06 10:27:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:27:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a57baf93-1000-4372-9325-859e73a86488", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a57baf93-1000-4372-9325-859e73a86488", "format": "json"}]: dispatch
Dec 06 10:27:13 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:14.307 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:14 np0005548788.localdomain ceph-mon[293643]: pgmap v650: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 173 KiB/s wr, 10 op/s
Dec 06 10:27:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:27:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "format": "json"}]: dispatch
Dec 06 10:27:16 np0005548788.localdomain ceph-mon[293643]: pgmap v651: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 173 KiB/s wr, 10 op/s
Dec 06 10:27:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a57baf93-1000-4372-9325-859e73a86488", "format": "json"}]: dispatch
Dec 06 10:27:16 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a57baf93-1000-4372-9325-859e73a86488", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:16.542 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e276 do_prune osdmap full prune enabled
Dec 06 10:27:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e277 e277: 6 total, 6 up, 6 in
Dec 06 10:27:17 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e277: 6 total, 6 up, 6 in
Dec 06 10:27:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Dec 06 10:27:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:18 np0005548788.localdomain ceph-mon[293643]: pgmap v652: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 173 KiB/s wr, 10 op/s
Dec 06 10:27:18 np0005548788.localdomain ceph-mon[293643]: osdmap e277: 6 total, 6 up, 6 in
Dec 06 10:27:18 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:19 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:27:19 np0005548788.localdomain podman[328404]: 2025-12-06 10:27:19.275006222 +0000 UTC m=+0.089690369 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:27:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:19.309 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:19 np0005548788.localdomain podman[328404]: 2025-12-06 10:27:19.315719533 +0000 UTC m=+0.130403640 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:27:19 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:27:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:27:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:27:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:27:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:27:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:27:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18773 "" "Go-http-client/1.1"
Dec 06 10:27:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "snap_name": "7884d7aa-ac96-42e0-bc99-f42bebf4c1fc", "format": "json"}]: dispatch
Dec 06 10:27:20 np0005548788.localdomain ceph-mon[293643]: pgmap v654: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 731 B/s rd, 197 KiB/s wr, 12 op/s
Dec 06 10:27:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:21.542 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:27:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:27:22 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:27:22 np0005548788.localdomain systemd[1]: tmp-crun.ABV4Vx.mount: Deactivated successfully.
Dec 06 10:27:22 np0005548788.localdomain podman[328428]: 2025-12-06 10:27:22.310444496 +0000 UTC m=+0.129133891 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:27:22 np0005548788.localdomain podman[328427]: 2025-12-06 10:27:22.258965151 +0000 UTC m=+0.081562358 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 06 10:27:22 np0005548788.localdomain podman[328429]: 2025-12-06 10:27:22.281240331 +0000 UTC m=+0.095660004 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:27:22 np0005548788.localdomain podman[328428]: 2025-12-06 10:27:22.347640918 +0000 UTC m=+0.166330293 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:27:22 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:27:22 np0005548788.localdomain podman[328427]: 2025-12-06 10:27:22.389869456 +0000 UTC m=+0.212466603 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 06 10:27:22 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:27:22 np0005548788.localdomain podman[328429]: 2025-12-06 10:27:22.413326163 +0000 UTC m=+0.227745826 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Dec 06 10:27:22 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:27:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:22 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:27:22 np0005548788.localdomain ceph-mon[293643]: pgmap v655: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 165 KiB/s wr, 10 op/s
Dec 06 10:27:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "snap_name": "7884d7aa-ac96-42e0-bc99-f42bebf4c1fc_a26a2613-79e3-4fad-87ed-7fada9c58a20", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "snap_name": "7884d7aa-ac96-42e0-bc99-f42bebf4c1fc", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:23 np0005548788.localdomain ceph-mon[293643]: pgmap v656: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 157 KiB/s wr, 10 op/s
Dec 06 10:27:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:24.310 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:24 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e277 do_prune osdmap full prune enabled
Dec 06 10:27:25 np0005548788.localdomain ceph-mon[293643]: pgmap v657: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 157 KiB/s wr, 10 op/s
Dec 06 10:27:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "format": "json"}]: dispatch
Dec 06 10:27:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e278 e278: 6 total, 6 up, 6 in
Dec 06 10:27:25 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e278: 6 total, 6 up, 6 in
Dec 06 10:27:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:26.547 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:26 np0005548788.localdomain ceph-mon[293643]: osdmap e278: 6 total, 6 up, 6 in
Dec 06 10:27:26 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:27:26 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "format": "json"}]: dispatch
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: pgmap v659: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 536 B/s rd, 165 KiB/s wr, 10 op/s
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:27 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:29.314 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:29 np0005548788.localdomain ceph-mon[293643]: pgmap v660: 177 pgs: 177 active+clean; 217 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 146 KiB/s wr, 9 op/s
Dec 06 10:27:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:27:30.519 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:27:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:27:30.521 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:27:30 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:27:30.521 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:27:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:30.561 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:30 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:27:30 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "format": "json"}]: dispatch
Dec 06 10:27:31 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 06 10:27:31 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:31 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:27:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:31.550 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: pgmap v661: 177 pgs: 177 active+clean; 217 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 146 KiB/s wr, 9 op/s
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e278 do_prune osdmap full prune enabled
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 e279: 6 total, 6 up, 6 in
Dec 06 10:27:32 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e279: 6 total, 6 up, 6 in
Dec 06 10:27:33 np0005548788.localdomain ceph-mon[293643]: osdmap e279: 6 total, 6 up, 6 in
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:34 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:27:34 np0005548788.localdomain podman[328487]: 2025-12-06 10:27:34.258591002 +0000 UTC m=+0.083158007 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:27:34 np0005548788.localdomain podman[328487]: 2025-12-06 10:27:34.27372672 +0000 UTC m=+0.098293745 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:27:34 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:27:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:34.318 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: pgmap v663: 177 pgs: 177 active+clean; 218 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 191 KiB/s wr, 11 op/s
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:27:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:27:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:27:36 np0005548788.localdomain systemd[1]: tmp-crun.MNYj1j.mount: Deactivated successfully.
Dec 06 10:27:36 np0005548788.localdomain podman[328508]: 2025-12-06 10:27:36.405797673 +0000 UTC m=+0.230693887 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Dec 06 10:27:36 np0005548788.localdomain systemd[1]: tmp-crun.B0o0JI.mount: Deactivated successfully.
Dec 06 10:27:36 np0005548788.localdomain podman[328507]: 2025-12-06 10:27:36.379510059 +0000 UTC m=+0.208555831 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:27:36 np0005548788.localdomain podman[328508]: 2025-12-06 10:27:36.440607361 +0000 UTC m=+0.265503605 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 06 10:27:36 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:27:36 np0005548788.localdomain podman[328507]: 2025-12-06 10:27:36.463509931 +0000 UTC m=+0.292555703 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:27:36 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:27:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:36.594 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:36 np0005548788.localdomain ceph-mon[293643]: pgmap v664: 177 pgs: 177 active+clean; 218 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 434 B/s rd, 162 KiB/s wr, 10 op/s
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} v 0)
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: pgmap v665: 177 pgs: 177 active+clean; 218 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 153 KiB/s wr, 9 op/s
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:37 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:27:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:27:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:27:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:27:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:27:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:27:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "format": "json"}]: dispatch
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:27:38 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:27:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:39.321 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1777930601' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:27:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1777930601' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:27:39 np0005548788.localdomain ceph-mon[293643]: pgmap v666: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 184 KiB/s wr, 10 op/s
Dec 06 10:27:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:27:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "format": "json"}]: dispatch
Dec 06 10:27:40 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:41.597 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:41 np0005548788.localdomain ceph-mon[293643]: pgmap v667: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 184 KiB/s wr, 10 op/s
Dec 06 10:27:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/585070944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3710319401' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:44 np0005548788.localdomain ceph-mon[293643]: pgmap v668: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 485 B/s rd, 272 KiB/s wr, 16 op/s
Dec 06 10:27:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:44.322 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:44.413 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:44.414 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:44.414 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:27:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 06 10:27:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:46.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:46 np0005548788.localdomain ceph-mon[293643]: pgmap v669: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 168 KiB/s wr, 9 op/s
Dec 06 10:27:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:46.623 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:27:47.447 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:27:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:27:47.448 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:27:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:27:47.448 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} v 0)
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:48.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: pgmap v670: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 168 KiB/s wr, 9 op/s
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "format": "json"}]: dispatch
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:49.325 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:27:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:27:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:27:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:27:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:27:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18784 "" "Go-http-client/1.1"
Dec 06 10:27:49 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:27:50 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:27:50 np0005548788.localdomain podman[328549]: 2025-12-06 10:27:50.255559202 +0000 UTC m=+0.078205564 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 06 10:27:50 np0005548788.localdomain podman[328549]: 2025-12-06 10:27:50.342714331 +0000 UTC m=+0.165360713 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:27:50 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:27:50 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:27:50 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:50 np0005548788.localdomain ceph-mon[293643]: pgmap v671: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 249 KiB/s wr, 14 op/s
Dec 06 10:27:50 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:51.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:51.004 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:27:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:51.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:27:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:51.023 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:51.639 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "format": "json"}]: dispatch
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:52.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:52 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:52 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:52 np0005548788.localdomain ceph-mon[293643]: pgmap v672: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 166 KiB/s wr, 10 op/s
Dec 06 10:27:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:53.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:53.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:27:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:27:53 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:27:53 np0005548788.localdomain podman[328576]: 2025-12-06 10:27:53.276049292 +0000 UTC m=+0.093482967 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:27:53 np0005548788.localdomain podman[328576]: 2025-12-06 10:27:53.292590104 +0000 UTC m=+0.110023739 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 06 10:27:53 np0005548788.localdomain podman[328577]: 2025-12-06 10:27:53.335757472 +0000 UTC m=+0.149237675 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:27:53 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:27:53 np0005548788.localdomain podman[328577]: 2025-12-06 10:27:53.373740978 +0000 UTC m=+0.187221181 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:27:53 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:27:53 np0005548788.localdomain podman[328578]: 2025-12-06 10:27:53.39576228 +0000 UTC m=+0.205606379 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, com.redhat.component=ubi9-minimal-container)
Dec 06 10:27:53 np0005548788.localdomain podman[328578]: 2025-12-06 10:27:53.436637307 +0000 UTC m=+0.246481416 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64)
Dec 06 10:27:53 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:27:53 np0005548788.localdomain ceph-mon[293643]: pgmap v673: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 235 KiB/s wr, 14 op/s
Dec 06 10:27:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:54.329 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3684322741' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:55 np0005548788.localdomain ceph-mon[293643]: pgmap v674: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 149 KiB/s wr, 9 op/s
Dec 06 10:27:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1957927647' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.026 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.026 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.027 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.027 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:27:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:27:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1864458173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.479 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.661 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.717 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.720 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11397MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.721 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.721 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.821 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.822 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:27:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:56.849 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:27:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1864458173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2284770246' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:57.324 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:27:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:57.331 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} v 0)
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:57.350 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:27:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:57.353 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:27:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:57.353 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2284770246' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: pgmap v675: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 149 KiB/s wr, 9 op/s
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:27:59.331 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:00 np0005548788.localdomain ceph-mon[293643]: pgmap v676: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 252 KiB/s wr, 15 op/s
Dec 06 10:28:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:01.704 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:02 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:02 np0005548788.localdomain ceph-mon[293643]: pgmap v677: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 171 KiB/s wr, 10 op/s
Dec 06 10:28:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} v 0)
Dec 06 10:28:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:04.335 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: pgmap v678: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 235 KiB/s wr, 15 op/s
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:05 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:28:05 np0005548788.localdomain podman[328683]: 2025-12-06 10:28:05.253751458 +0000 UTC m=+0.080592466 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:28:05 np0005548788.localdomain podman[328683]: 2025-12-06 10:28:05.294906737 +0000 UTC m=+0.121747785 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec 06 10:28:05 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:28:05 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:05 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:05 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:05 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:06.735 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:06 np0005548788.localdomain ceph-mon[293643]: pgmap v679: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 167 KiB/s wr, 10 op/s
Dec 06 10:28:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:28:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:28:07 np0005548788.localdomain podman[328700]: 2025-12-06 10:28:07.247615313 +0000 UTC m=+0.074570691 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:28:07 np0005548788.localdomain systemd[1]: tmp-crun.RqCUTw.mount: Deactivated successfully.
Dec 06 10:28:07 np0005548788.localdomain systemd-journald[47853]: Data hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.0 (53727 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Dec 06 10:28:07 np0005548788.localdomain systemd-journald[47853]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 10:28:07 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:28:07 np0005548788.localdomain podman[328701]: 2025-12-06 10:28:07.277923937 +0000 UTC m=+0.098528539 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:28:07 np0005548788.localdomain podman[328701]: 2025-12-06 10:28:07.286559054 +0000 UTC m=+0.107163636 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true)
Dec 06 10:28:07 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:28:07 np0005548788.localdomain podman[328700]: 2025-12-06 10:28:07.330700794 +0000 UTC m=+0.157656132 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:28:07 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:28:07 np0005548788.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: pgmap v680: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 167 KiB/s wr, 10 op/s
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:28:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:28:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:28:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:28:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:28:08 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:09.337 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:09 np0005548788.localdomain ceph-mon[293643]: pgmap v681: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 263 KiB/s wr, 17 op/s
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} v 0)
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:10 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:11 np0005548788.localdomain sudo[328740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:28:11 np0005548788.localdomain sudo[328740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:28:11 np0005548788.localdomain sudo[328740]: pam_unix(sudo:session): session closed for user root
Dec 06 10:28:11 np0005548788.localdomain sudo[328758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:28:11 np0005548788.localdomain sudo[328758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:28:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:11.735 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:11 np0005548788.localdomain ceph-mon[293643]: pgmap v682: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 160 KiB/s wr, 11 op/s
Dec 06 10:28:12 np0005548788.localdomain sudo[328758]: pam_unix(sudo:session): session closed for user root
Dec 06 10:28:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:28:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:28:12 np0005548788.localdomain sudo[328809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:28:12 np0005548788.localdomain sudo[328809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:28:12 np0005548788.localdomain sudo[328809]: pam_unix(sudo:session): session closed for user root
Dec 06 10:28:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:28:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:28:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:28:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: pgmap v683: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 219 KiB/s wr, 14 op/s
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:14.339 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:15 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:16 np0005548788.localdomain ceph-mon[293643]: pgmap v684: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 155 KiB/s wr, 9 op/s
Dec 06 10:28:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:16.850 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} v 0)
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: pgmap v685: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 155 KiB/s wr, 9 op/s
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:19.343 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:28:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:28:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:28:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:28:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:28:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18785 "" "Go-http-client/1.1"
Dec 06 10:28:20 np0005548788.localdomain ceph-mon[293643]: pgmap v686: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 254 KiB/s wr, 16 op/s
Dec 06 10:28:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:20 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:20 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:21 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:28:21 np0005548788.localdomain podman[328827]: 2025-12-06 10:28:21.25929407 +0000 UTC m=+0.085614860 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller)
Dec 06 10:28:21 np0005548788.localdomain podman[328827]: 2025-12-06 10:28:21.339782382 +0000 UTC m=+0.166103132 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:28:21 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:21.850 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:22 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:22 np0005548788.localdomain ceph-mon[293643]: pgmap v687: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 158 KiB/s wr, 10 op/s
Dec 06 10:28:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:28:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:28:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:28:24 np0005548788.localdomain podman[328853]: 2025-12-06 10:28:24.261919196 +0000 UTC m=+0.085230188 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:28:24 np0005548788.localdomain podman[328853]: 2025-12-06 10:28:24.281531521 +0000 UTC m=+0.104842483 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:28:24 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} v 0)
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:24.346 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:24 np0005548788.localdomain podman[328854]: 2025-12-06 10:28:24.378414599 +0000 UTC m=+0.193323462 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:28:24 np0005548788.localdomain podman[328854]: 2025-12-06 10:28:24.392000928 +0000 UTC m=+0.206909751 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:28:24 np0005548788.localdomain systemd[1]: tmp-crun.NKDkQx.mount: Deactivated successfully.
Dec 06 10:28:24 np0005548788.localdomain podman[328860]: 2025-12-06 10:28:24.436143789 +0000 UTC m=+0.244980095 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Dec 06 10:28:24 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:28:24 np0005548788.localdomain podman[328860]: 2025-12-06 10:28:24.476679728 +0000 UTC m=+0.285516024 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git)
Dec 06 10:28:24 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: pgmap v688: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 217 KiB/s wr, 14 op/s
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:24 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:28:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:28:25 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:28:25 np0005548788.localdomain ceph-mon[293643]: pgmap v689: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 158 KiB/s wr, 11 op/s
Dec 06 10:28:26 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:26.853 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:27 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:27 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:28 np0005548788.localdomain ceph-mon[293643]: pgmap v690: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 158 KiB/s wr, 11 op/s
Dec 06 10:28:28 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:28:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:28 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:29.350 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "format": "json"}]: dispatch
Dec 06 10:28:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "force": true, "format": "json"}]: dispatch
Dec 06 10:28:30 np0005548788.localdomain ceph-mon[293643]: pgmap v691: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 240 KiB/s wr, 16 op/s
Dec 06 10:28:31 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 06 10:28:31 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:31 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:28:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:28:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:31 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:28:31 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:31.857 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:32 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:32.511 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:28:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:32.512 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:32 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:32.513 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:28:32 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:28:32 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:28:32 np0005548788.localdomain ceph-mon[293643]: pgmap v692: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 141 KiB/s wr, 9 op/s
Dec 06 10:28:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:33 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:33.516 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:28:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:34.354 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:34 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 06 10:28:34 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:34 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:34 np0005548788.localdomain ceph-mon[293643]: pgmap v693: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 168 KiB/s wr, 12 op/s
Dec 06 10:28:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:34 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:35 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:35 np0005548788.localdomain ceph-mon[293643]: pgmap v694: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:28:36 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:28:36 np0005548788.localdomain podman[328915]: 2025-12-06 10:28:36.257144655 +0000 UTC m=+0.081238596 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 06 10:28:36 np0005548788.localdomain podman[328915]: 2025-12-06 10:28:36.26963818 +0000 UTC m=+0.093732111 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:28:36 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:28:36 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:36.857 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Dec 06 10:28:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:37 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:28:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:28:38 np0005548788.localdomain podman[328935]: 2025-12-06 10:28:38.258438229 +0000 UTC m=+0.079964657 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:28:38 np0005548788.localdomain podman[328935]: 2025-12-06 10:28:38.265884948 +0000 UTC m=+0.087411316 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:28:38 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:28:38 np0005548788.localdomain systemd[1]: tmp-crun.EuVKaF.mount: Deactivated successfully.
Dec 06 10:28:38 np0005548788.localdomain podman[328936]: 2025-12-06 10:28:38.319875523 +0000 UTC m=+0.135724566 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 06 10:28:38 np0005548788.localdomain podman[328936]: 2025-12-06 10:28:38.328788018 +0000 UTC m=+0.144637061 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:28:38 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:28:38 np0005548788.localdomain ceph-mon[293643]: pgmap v695: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:28:38 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:38 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:38 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:28:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:28:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:28:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:28:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:28:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:28:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:28:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:28:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:28:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/12719951' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:28:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:28:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/12719951' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:28:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:39.356 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:39 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:28:39.415 262572 INFO neutron.agent.linux.ip_lib [None req-56d80406-9a8b-4de0-8e00-432101417ab7 - - - - - -] Device tapb3b2538c-a0 cannot be used as it has no MAC address
Dec 06 10:28:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:39.438 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:39 np0005548788.localdomain kernel: device tapb3b2538c-a0 entered promiscuous mode
Dec 06 10:28:39 np0005548788.localdomain NetworkManager[5968]: <info>  [1765016919.4470] manager: (tapb3b2538c-a0): new Generic device (/org/freedesktop/NetworkManager/Devices/55)
Dec 06 10:28:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:39Z|00307|binding|INFO|Claiming lport b3b2538c-a022-4385-8d06-7ab984abd6e9 for this chassis.
Dec 06 10:28:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:39Z|00308|binding|INFO|b3b2538c-a022-4385-8d06-7ab984abd6e9: Claiming unknown
Dec 06 10:28:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:39.447 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:39 np0005548788.localdomain systemd-udevd[328985]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:28:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:39.463 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-dfc9804c-d9cd-4069-8a50-08732c020210', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfc9804c-d9cd-4069-8a50-08732c020210', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0130cf6296f34bac9ca26d4953c9e4d0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c7e1586-e9e7-4b08-b4cc-c678a826f070, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=b3b2538c-a022-4385-8d06-7ab984abd6e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:28:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:39.465 159620 INFO neutron.agent.ovn.metadata.agent [-] Port b3b2538c-a022-4385-8d06-7ab984abd6e9 in datapath dfc9804c-d9cd-4069-8a50-08732c020210 bound to our chassis
Dec 06 10:28:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:39.467 159620 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network dfc9804c-d9cd-4069-8a50-08732c020210 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:28:39 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:39.471 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[543ae85f-6cbb-4e6d-97a7-d1673cecc576]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:28:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb3b2538c-a0: No such device
Dec 06 10:28:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/12719951' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:28:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/12719951' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:28:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:39Z|00309|binding|INFO|Setting lport b3b2538c-a022-4385-8d06-7ab984abd6e9 ovn-installed in OVS
Dec 06 10:28:39 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:39Z|00310|binding|INFO|Setting lport b3b2538c-a022-4385-8d06-7ab984abd6e9 up in Southbound
Dec 06 10:28:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb3b2538c-a0: No such device
Dec 06 10:28:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:39.483 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb3b2538c-a0: No such device
Dec 06 10:28:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb3b2538c-a0: No such device
Dec 06 10:28:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb3b2538c-a0: No such device
Dec 06 10:28:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb3b2538c-a0: No such device
Dec 06 10:28:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb3b2538c-a0: No such device
Dec 06 10:28:39 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapb3b2538c-a0: No such device
Dec 06 10:28:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:39.518 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:39.549 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:40 np0005548788.localdomain ceph-mon[293643]: pgmap v696: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 172 KiB/s wr, 11 op/s
Dec 06 10:28:40 np0005548788.localdomain podman[329056]: 
Dec 06 10:28:40 np0005548788.localdomain podman[329056]: 2025-12-06 10:28:40.898840678 +0000 UTC m=+0.100102248 container create e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:28:40 np0005548788.localdomain systemd[1]: Started libpod-conmon-e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce.scope.
Dec 06 10:28:40 np0005548788.localdomain podman[329056]: 2025-12-06 10:28:40.852069005 +0000 UTC m=+0.053330615 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:28:40 np0005548788.localdomain systemd[1]: tmp-crun.IiluPH.mount: Deactivated successfully.
Dec 06 10:28:40 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:28:40 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22c6c00389c058d207c0e4faaf9cac5419b7498d0fa5355fe1d691383a788d68/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:28:40 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:40 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:40 np0005548788.localdomain podman[329056]: 2025-12-06 10:28:40.998594224 +0000 UTC m=+0.199855794 container init e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:28:41 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:41 np0005548788.localdomain podman[329056]: 2025-12-06 10:28:41.00629526 +0000 UTC m=+0.207556830 container start e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:28:41 np0005548788.localdomain dnsmasq[329074]: started, version 2.85 cachesize 150
Dec 06 10:28:41 np0005548788.localdomain dnsmasq[329074]: DNS service limited to local subnets
Dec 06 10:28:41 np0005548788.localdomain dnsmasq[329074]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:28:41 np0005548788.localdomain dnsmasq[329074]: warning: no upstream servers configured
Dec 06 10:28:41 np0005548788.localdomain dnsmasq-dhcp[329074]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:28:41 np0005548788.localdomain dnsmasq[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/addn_hosts - 0 addresses
Dec 06 10:28:41 np0005548788.localdomain dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/host
Dec 06 10:28:41 np0005548788.localdomain dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/opts
Dec 06 10:28:41 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:28:41.205 262572 INFO neutron.agent.dhcp.agent [None req-93d56dcc-a2af-4e60-b5d4-471b15ef498b - - - - - -] DHCP configuration for ports {'e3c3ab46-f7c8-4911-b29c-506c62fa3201'} is completed
Dec 06 10:28:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:41 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:41 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:41.860 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:42 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:42 np0005548788.localdomain ceph-mon[293643]: pgmap v697: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 90 KiB/s wr, 6 op/s
Dec 06 10:28:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:43 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1883521674' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:43 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:28:43.530 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:28:43Z, description=, device_id=2b4c1404-e334-461c-914c-573c510f7280, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6617100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c6617880>], id=38c49637-9511-44ef-a494-a5a423a527fd, ip_allocation=immediate, mac_address=fa:16:3e:64:0b:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:28:37Z, description=, dns_domain=, id=dfc9804c-d9cd-4069-8a50-08732c020210, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-1595585081-network, port_security_enabled=True, project_id=0130cf6296f34bac9ca26d4953c9e4d0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56402, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3879, status=ACTIVE, subnets=['cc461326-a9f0-47dc-9060-0bdbb0e658df'], tags=[], tenant_id=0130cf6296f34bac9ca26d4953c9e4d0, updated_at=2025-12-06T10:28:38Z, vlan_transparent=None, network_id=dfc9804c-d9cd-4069-8a50-08732c020210, port_security_enabled=False, project_id=0130cf6296f34bac9ca26d4953c9e4d0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3886, status=DOWN, tags=[], tenant_id=0130cf6296f34bac9ca26d4953c9e4d0, updated_at=2025-12-06T10:28:43Z on network dfc9804c-d9cd-4069-8a50-08732c020210
Dec 06 10:28:43 np0005548788.localdomain dnsmasq[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/addn_hosts - 1 addresses
Dec 06 10:28:43 np0005548788.localdomain dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/host
Dec 06 10:28:43 np0005548788.localdomain podman[329092]: 2025-12-06 10:28:43.739527782 +0000 UTC m=+0.051407476 container kill e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:28:43 np0005548788.localdomain dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/opts
Dec 06 10:28:44 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:28:44.105 262572 INFO neutron.agent.dhcp.agent [None req-9c2cb31e-9a3c-44ca-8f99-cf0f4669b1fb - - - - - -] DHCP configuration for ports {'38c49637-9511-44ef-a494-a5a423a527fd'} is completed
Dec 06 10:28:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:44.355 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:44.357 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:44.358 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:28:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:44.360 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: pgmap v698: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 123 KiB/s wr, 8 op/s
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3637666271' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:44 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:44 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:28:44.717 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:28:43Z, description=, device_id=2b4c1404-e334-461c-914c-573c510f7280, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c66974f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c676f400>], id=38c49637-9511-44ef-a494-a5a423a527fd, ip_allocation=immediate, mac_address=fa:16:3e:64:0b:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:28:37Z, description=, dns_domain=, id=dfc9804c-d9cd-4069-8a50-08732c020210, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-1595585081-network, port_security_enabled=True, project_id=0130cf6296f34bac9ca26d4953c9e4d0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56402, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3879, status=ACTIVE, subnets=['cc461326-a9f0-47dc-9060-0bdbb0e658df'], tags=[], tenant_id=0130cf6296f34bac9ca26d4953c9e4d0, updated_at=2025-12-06T10:28:38Z, vlan_transparent=None, network_id=dfc9804c-d9cd-4069-8a50-08732c020210, port_security_enabled=False, project_id=0130cf6296f34bac9ca26d4953c9e4d0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3886, status=DOWN, tags=[], tenant_id=0130cf6296f34bac9ca26d4953c9e4d0, updated_at=2025-12-06T10:28:43Z on network dfc9804c-d9cd-4069-8a50-08732c020210
Dec 06 10:28:44 np0005548788.localdomain dnsmasq[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/addn_hosts - 1 addresses
Dec 06 10:28:44 np0005548788.localdomain dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/host
Dec 06 10:28:44 np0005548788.localdomain dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/opts
Dec 06 10:28:44 np0005548788.localdomain podman[329130]: 2025-12-06 10:28:44.906055439 +0000 UTC m=+0.038790567 container kill e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:28:45 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:28:45.168 262572 INFO neutron.agent.dhcp.agent [None req-06fe6d8b-02b6-4899-9ef1-87765b20a52c - - - - - -] DHCP configuration for ports {'38c49637-9511-44ef-a494-a5a423a527fd'} is completed
Dec 06 10:28:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:45 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:46 np0005548788.localdomain ceph-mon[293643]: pgmap v699: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 96 KiB/s wr, 5 op/s
Dec 06 10:28:46 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:46Z|00311|ovn_bfd|INFO|Enabled BFD on interface ovn-afa07b-0
Dec 06 10:28:46 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:46Z|00312|ovn_bfd|INFO|Enabled BFD on interface ovn-bd2a75-0
Dec 06 10:28:46 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:46Z|00313|ovn_bfd|INFO|Enabled BFD on interface ovn-ca3c1f-0
Dec 06 10:28:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:46.559 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:46.575 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:46.581 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:46.661 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:46.673 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:46.687 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:46.863 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:47.448 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:28:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:47.449 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:28:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:47.449 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:28:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:47.569 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:47.632 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:47 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:48.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:48.007 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:48.407 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:48 np0005548788.localdomain ceph-mon[293643]: pgmap v700: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 96 KiB/s wr, 5 op/s
Dec 06 10:28:48 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:48 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:49.363 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:49 np0005548788.localdomain sshd[329155]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.600110) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929600168, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 2507, "num_deletes": 254, "total_data_size": 2085141, "memory_usage": 2134936, "flush_reason": "Manual Compaction"}
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929613404, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 2002207, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38779, "largest_seqno": 41285, "table_properties": {"data_size": 1992001, "index_size": 6139, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26470, "raw_average_key_size": 22, "raw_value_size": 1969480, "raw_average_value_size": 1637, "num_data_blocks": 266, "num_entries": 1203, "num_filter_entries": 1203, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016805, "oldest_key_time": 1765016805, "file_creation_time": 1765016929, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 13333 microseconds, and 6016 cpu microseconds.
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:28:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:28:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.613452) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 2002207 bytes OK
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.613477) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.615575) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.615662) EVENT_LOG_v1 {"time_micros": 1765016929615644, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.615703) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 2073734, prev total WAL file size 2073734, number of live WAL files 2.
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.617313) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(1955KB)], [72(19MB)]
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929617366, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 22515000, "oldest_snapshot_seqno": -1}
Dec 06 10:28:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:28:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156745 "" "Go-http-client/1.1"
Dec 06 10:28:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:28:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19259 "" "Go-http-client/1.1"
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 14612 keys, 20699864 bytes, temperature: kUnknown
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929740744, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 20699864, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20614551, "index_size": 47708, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36549, "raw_key_size": 391733, "raw_average_key_size": 26, "raw_value_size": 20364750, "raw_average_value_size": 1393, "num_data_blocks": 1774, "num_entries": 14612, "num_filter_entries": 14612, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016929, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.741142) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 20699864 bytes
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.743072) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.3 rd, 167.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 19.6 +0.0 blob) out(19.7 +0.0 blob), read-write-amplify(21.6) write-amplify(10.3) OK, records in: 15143, records dropped: 531 output_compression: NoCompression
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.743106) EVENT_LOG_v1 {"time_micros": 1765016929743092, "job": 44, "event": "compaction_finished", "compaction_time_micros": 123493, "compaction_time_cpu_micros": 64836, "output_level": 6, "num_output_files": 1, "total_output_size": 20699864, "num_input_records": 15143, "num_output_records": 14612, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929743614, "job": 44, "event": "table_file_deletion", "file_number": 74}
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929746669, "job": 44, "event": "table_file_deletion", "file_number": 72}
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.616705) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.746804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.746814) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.746817) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.746820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:28:49.746824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:50Z|00314|ovn_bfd|INFO|Disabled BFD on interface ovn-afa07b-0
Dec 06 10:28:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:50Z|00315|ovn_bfd|INFO|Disabled BFD on interface ovn-bd2a75-0
Dec 06 10:28:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:50Z|00316|ovn_bfd|INFO|Disabled BFD on interface ovn-ca3c1f-0
Dec 06 10:28:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:50.215 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:50.218 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:50.235 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:50 np0005548788.localdomain dnsmasq[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/addn_hosts - 0 addresses
Dec 06 10:28:50 np0005548788.localdomain dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/host
Dec 06 10:28:50 np0005548788.localdomain dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/dfc9804c-d9cd-4069-8a50-08732c020210/opts
Dec 06 10:28:50 np0005548788.localdomain podman[329173]: 2025-12-06 10:28:50.395308364 +0000 UTC m=+0.070508944 container kill e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:28:50 np0005548788.localdomain ceph-mon[293643]: pgmap v701: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 149 KiB/s wr, 9 op/s
Dec 06 10:28:50 np0005548788.localdomain kernel: device tapb3b2538c-a0 left promiscuous mode
Dec 06 10:28:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:50Z|00317|binding|INFO|Releasing lport b3b2538c-a022-4385-8d06-7ab984abd6e9 from this chassis (sb_readonly=0)
Dec 06 10:28:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:28:50Z|00318|binding|INFO|Setting lport b3b2538c-a022-4385-8d06-7ab984abd6e9 down in Southbound
Dec 06 10:28:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:50.651 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:50.658 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-dfc9804c-d9cd-4069-8a50-08732c020210', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfc9804c-d9cd-4069-8a50-08732c020210', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0130cf6296f34bac9ca26d4953c9e4d0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c7e1586-e9e7-4b08-b4cc-c678a826f070, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=b3b2538c-a022-4385-8d06-7ab984abd6e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:28:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:50.660 159620 INFO neutron.agent.ovn.metadata.agent [-] Port b3b2538c-a022-4385-8d06-7ab984abd6e9 in datapath dfc9804c-d9cd-4069-8a50-08732c020210 unbound from our chassis
Dec 06 10:28:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:50.662 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dfc9804c-d9cd-4069-8a50-08732c020210, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:28:50 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:28:50.664 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[390d2f8d-9646-4ed9-8355-a62e612fa79b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:28:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:50.673 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:50 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 06 10:28:50 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:50 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:51.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:51.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:28:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:51.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:28:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:51.030 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:28:51 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:51 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:51.866 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:52.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:52.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:52.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:28:52 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:28:52 np0005548788.localdomain podman[329206]: 2025-12-06 10:28:52.287597726 +0000 UTC m=+0.102460670 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:28:52 np0005548788.localdomain dnsmasq[329074]: exiting on receipt of SIGTERM
Dec 06 10:28:52 np0005548788.localdomain podman[329218]: 2025-12-06 10:28:52.342073006 +0000 UTC m=+0.122251481 container kill e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:28:52 np0005548788.localdomain systemd[1]: libpod-e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce.scope: Deactivated successfully.
Dec 06 10:28:52 np0005548788.localdomain podman[329206]: 2025-12-06 10:28:52.380347176 +0000 UTC m=+0.195210120 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:28:52 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:28:52 np0005548788.localdomain podman[329251]: 2025-12-06 10:28:52.404176901 +0000 UTC m=+0.040353645 container died e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:28:52 np0005548788.localdomain podman[329251]: 2025-12-06 10:28:52.458244107 +0000 UTC m=+0.094420851 container remove e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dfc9804c-d9cd-4069-8a50-08732c020210, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:28:52 np0005548788.localdomain systemd[1]: libpod-conmon-e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce.scope: Deactivated successfully.
Dec 06 10:28:52 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:28:52.549 262572 INFO neutron.agent.dhcp.agent [None req-9ce819af-c144-42c0-92d0-9c11e88a01c0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:28:52 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:28:52.549 262572 INFO neutron.agent.dhcp.agent [None req-9ce819af-c144-42c0-92d0-9c11e88a01c0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:28:52 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:52 np0005548788.localdomain ceph-mon[293643]: pgmap v702: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 86 KiB/s wr, 5 op/s
Dec 06 10:28:52 np0005548788.localdomain sshd[329155]: Received disconnect from 45.78.194.186 port 33562:11: Bye Bye [preauth]
Dec 06 10:28:52 np0005548788.localdomain sshd[329155]: Disconnected from authenticating user root 45.78.194.186 port 33562 [preauth]
Dec 06 10:28:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:52.790 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:53.040 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:53 np0005548788.localdomain systemd[1]: tmp-crun.ybdJSV.mount: Deactivated successfully.
Dec 06 10:28:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-22c6c00389c058d207c0e4faaf9cac5419b7498d0fa5355fe1d691383a788d68-merged.mount: Deactivated successfully.
Dec 06 10:28:53 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0e47d971c999dca0f2b2ddb4b6a9b77a0ead43dc60f9130dee3c1bc3bd245ce-userdata-shm.mount: Deactivated successfully.
Dec 06 10:28:53 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2ddfc9804c\x2dd9cd\x2d4069\x2d8a50\x2d08732c020210.mount: Deactivated successfully.
Dec 06 10:28:53 np0005548788.localdomain ceph-mon[293643]: pgmap v703: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:28:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:28:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:54.373 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:54 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:54 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:55.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:28:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:28:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:28:55 np0005548788.localdomain podman[329278]: 2025-12-06 10:28:55.278344776 +0000 UTC m=+0.102566344 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:28:55 np0005548788.localdomain systemd[1]: tmp-crun.p05r3Y.mount: Deactivated successfully.
Dec 06 10:28:55 np0005548788.localdomain podman[329279]: 2025-12-06 10:28:55.32550882 +0000 UTC m=+0.146722294 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:28:55 np0005548788.localdomain podman[329280]: 2025-12-06 10:28:55.367304059 +0000 UTC m=+0.180290980 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, maintainer=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Dec 06 10:28:55 np0005548788.localdomain podman[329279]: 2025-12-06 10:28:55.390726911 +0000 UTC m=+0.211940345 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:28:55 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:28:55 np0005548788.localdomain podman[329280]: 2025-12-06 10:28:55.410853761 +0000 UTC m=+0.223840692 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:28:55 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:28:55 np0005548788.localdomain podman[329278]: 2025-12-06 10:28:55.442485637 +0000 UTC m=+0.266707235 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:28:55 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:28:55 np0005548788.localdomain ceph-mon[293643]: pgmap v704: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:28:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1248594145' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:56.001 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:56.867 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4015030383' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: pgmap v705: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:57 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.026 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.028 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.028 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:28:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:28:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1148555436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.472 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.694 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.696 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11370MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.696 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.697 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.782 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.782 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:28:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:58.906 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:28:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1148555436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:28:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3936924066' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.360 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.366 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.376 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.388 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.391 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.391 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.393 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.393 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:28:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:28:59.415 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:29:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3936924066' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:00 np0005548788.localdomain ceph-mon[293643]: pgmap v706: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 128 KiB/s wr, 8 op/s
Dec 06 10:29:01 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:29:01 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:29:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:29:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:01.870 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:02 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:02 np0005548788.localdomain ceph-mon[293643]: pgmap v707: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:29:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:03 np0005548788.localdomain ceph-mon[293643]: pgmap v708: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:04.418 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:29:04 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:29:05 np0005548788.localdomain ceph-mon[293643]: pgmap v709: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 86 KiB/s wr, 5 op/s
Dec 06 10:29:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:06.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:06 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:06.903 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:07 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:29:07 np0005548788.localdomain podman[329382]: 2025-12-06 10:29:07.256932231 +0000 UTC m=+0.083920898 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:29:07 np0005548788.localdomain podman[329382]: 2025-12-06 10:29:07.274804462 +0000 UTC m=+0.101793169 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:29:07 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:29:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:29:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:08 np0005548788.localdomain ceph-mon[293643]: pgmap v710: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 86 KiB/s wr, 5 op/s
Dec 06 10:29:08 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:29:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:29:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:08 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:29:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:29:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:29:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:29:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:29:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:29:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:29:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:29:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:29:09 np0005548788.localdomain podman[329403]: 2025-12-06 10:29:09.256345037 +0000 UTC m=+0.077387347 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:29:09 np0005548788.localdomain podman[329402]: 2025-12-06 10:29:09.305649067 +0000 UTC m=+0.128994538 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:29:09 np0005548788.localdomain podman[329402]: 2025-12-06 10:29:09.315840781 +0000 UTC m=+0.139186252 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:29:09 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:29:09 np0005548788.localdomain podman[329403]: 2025-12-06 10:29:09.340455011 +0000 UTC m=+0.161497301 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:29:09 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:29:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:09.420 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:10 np0005548788.localdomain ceph-mon[293643]: pgmap v711: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 138 KiB/s wr, 9 op/s
Dec 06 10:29:10 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Dec 06 10:29:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 06 10:29:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:29:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:29:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:11 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:29:11 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:11.942 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:29:12 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:29:12 np0005548788.localdomain ceph-mon[293643]: pgmap v712: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 86 KiB/s wr, 5 op/s
Dec 06 10:29:12 np0005548788.localdomain sudo[329444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:29:12 np0005548788.localdomain sudo[329444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:29:12 np0005548788.localdomain sudo[329444]: pam_unix(sudo:session): session closed for user root
Dec 06 10:29:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:12 np0005548788.localdomain sudo[329462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:29:12 np0005548788.localdomain sudo[329462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:29:13 np0005548788.localdomain sudo[329462]: pam_unix(sudo:session): session closed for user root
Dec 06 10:29:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:29:13 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:29:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:13.874 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:13 np0005548788.localdomain sudo[329511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:29:13 np0005548788.localdomain sudo[329511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:29:13 np0005548788.localdomain sudo[329511]: pam_unix(sudo:session): session closed for user root
Dec 06 10:29:13 np0005548788.localdomain ceph-mon[293643]: pgmap v713: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:29:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:29:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:29:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:29:13 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:29:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} v 0)
Dec 06 10:29:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:14.472 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:14 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:29:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:14 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:15 np0005548788.localdomain ceph-mon[293643]: pgmap v714: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:29:16 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:16.941 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:29:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:29:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:29:18 np0005548788.localdomain ceph-mon[293643]: pgmap v715: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:29:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:29:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:29:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:29:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "format": "json"}]: dispatch
Dec 06 10:29:19 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:29:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:19.475 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:29:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:29:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:29:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:29:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:29:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18783 "" "Go-http-client/1.1"
Dec 06 10:29:20 np0005548788.localdomain ceph-mon[293643]: pgmap v716: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 108 KiB/s wr, 7 op/s
Dec 06 10:29:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:21.977 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]} v 0)
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]} : dispatch
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]}]': finished
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: pgmap v717: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 55 KiB/s wr, 3 op/s
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]} : dispatch
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]} : dispatch
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]}]': finished
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:22 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:29:22Z|00319|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Dec 06 10:29:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:23 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:29:23 np0005548788.localdomain podman[329529]: 2025-12-06 10:29:23.261300826 +0000 UTC m=+0.081566785 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:29:23 np0005548788.localdomain podman[329529]: 2025-12-06 10:29:23.335660649 +0000 UTC m=+0.155926618 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:29:23 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:29:23 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "auth_id": "bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:29:24 np0005548788.localdomain ceph-mon[293643]: pgmap v718: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 95 KiB/s wr, 5 op/s
Dec 06 10:29:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:24.515 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:25 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]} v 0)
Dec 06 10:29:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]} : dispatch
Dec 06 10:29:25 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]}]': finished
Dec 06 10:29:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:29:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:29:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:29:26 np0005548788.localdomain podman[329553]: 2025-12-06 10:29:26.266640987 +0000 UTC m=+0.092230915 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:29:26 np0005548788.localdomain systemd[1]: tmp-crun.cJ2gIu.mount: Deactivated successfully.
Dec 06 10:29:26 np0005548788.localdomain podman[329555]: 2025-12-06 10:29:26.324739219 +0000 UTC m=+0.138082449 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Dec 06 10:29:26 np0005548788.localdomain podman[329554]: 2025-12-06 10:29:26.305271948 +0000 UTC m=+0.126711968 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:29:26 np0005548788.localdomain podman[329555]: 2025-12-06 10:29:26.364766252 +0000 UTC m=+0.178109462 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc.)
Dec 06 10:29:26 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:29:26 np0005548788.localdomain podman[329554]: 2025-12-06 10:29:26.390026741 +0000 UTC m=+0.211466801 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:29:26 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:29:26 np0005548788.localdomain podman[329553]: 2025-12-06 10:29:26.406810439 +0000 UTC m=+0.232400357 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:29:26 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:29:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "auth_id": "bob", "format": "json"}]: dispatch
Dec 06 10:29:26 np0005548788.localdomain ceph-mon[293643]: pgmap v719: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 71 KiB/s wr, 4 op/s
Dec 06 10:29:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]} : dispatch
Dec 06 10:29:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]} : dispatch
Dec 06 10:29:26 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]}]': finished
Dec 06 10:29:26 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "auth_id": "bob", "format": "json"}]: dispatch
Dec 06 10:29:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:27.021 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:28 np0005548788.localdomain ceph-mon[293643]: pgmap v720: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 71 KiB/s wr, 4 op/s
Dec 06 10:29:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0)
Dec 06 10:29:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Dec 06 10:29:28 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Dec 06 10:29:29 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "bob", "format": "json"}]: dispatch
Dec 06 10:29:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Dec 06 10:29:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Dec 06 10:29:29 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Dec 06 10:29:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:29.518 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:30 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "bob", "format": "json"}]: dispatch
Dec 06 10:29:30 np0005548788.localdomain ceph-mon[293643]: pgmap v721: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 105 KiB/s wr, 6 op/s
Dec 06 10:29:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:32.024 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:32 np0005548788.localdomain ceph-mon[293643]: pgmap v722: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s wr, 4 op/s
Dec 06 10:29:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "format": "json"}]: dispatch
Dec 06 10:29:33 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "force": true, "format": "json"}]: dispatch
Dec 06 10:29:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:34.522 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:34 np0005548788.localdomain ceph-mon[293643]: pgmap v723: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 119 KiB/s wr, 6 op/s
Dec 06 10:29:36 np0005548788.localdomain ceph-mon[293643]: pgmap v724: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 79 KiB/s wr, 4 op/s
Dec 06 10:29:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6fddad9a-edda-44e9-b738-5688693ea723", "format": "json"}]: dispatch
Dec 06 10:29:36 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "force": true, "format": "json"}]: dispatch
Dec 06 10:29:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:37.026 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:38 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:29:38 np0005548788.localdomain systemd[1]: tmp-crun.Fxkops.mount: Deactivated successfully.
Dec 06 10:29:38 np0005548788.localdomain podman[329615]: 2025-12-06 10:29:38.276475852 +0000 UTC m=+0.095040011 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:29:38 np0005548788.localdomain podman[329615]: 2025-12-06 10:29:38.286451259 +0000 UTC m=+0.105015398 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:29:38 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:29:38 np0005548788.localdomain ceph-mon[293643]: pgmap v725: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 79 KiB/s wr, 4 op/s
Dec 06 10:29:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:29:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:29:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:29:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:29:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:29:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:29:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:29:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3284518139' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:29:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:29:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3284518139' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:29:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:39.524 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3284518139' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:29:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3284518139' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:29:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:29:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:29:40 np0005548788.localdomain podman[329636]: 2025-12-06 10:29:40.251570328 +0000 UTC m=+0.078850482 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:29:40 np0005548788.localdomain podman[329636]: 2025-12-06 10:29:40.26074105 +0000 UTC m=+0.088021234 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:29:40 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:29:40 np0005548788.localdomain podman[329637]: 2025-12-06 10:29:40.318674637 +0000 UTC m=+0.141050060 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec 06 10:29:40 np0005548788.localdomain podman[329637]: 2025-12-06 10:29:40.328617533 +0000 UTC m=+0.150992976 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:29:40 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:29:40 np0005548788.localdomain ceph-mon[293643]: pgmap v726: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 103 KiB/s wr, 5 op/s
Dec 06 10:29:40 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:29:40.692 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:29:40 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:29:40.694 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:29:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:40.695 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:42.065 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:42 np0005548788.localdomain ceph-mon[293643]: pgmap v727: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 69 KiB/s wr, 3 op/s
Dec 06 10:29:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:42.841 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:43 np0005548788.localdomain ceph-mon[293643]: pgmap v728: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 80 KiB/s wr, 4 op/s
Dec 06 10:29:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:44.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:44.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:29:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:44.551 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:44 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1217810012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:45.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:45 np0005548788.localdomain ceph-mon[293643]: pgmap v729: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 35 KiB/s wr, 2 op/s
Dec 06 10:29:45 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3042594742' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:47.073 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:29:47.449 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:29:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:29:47.450 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:29:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:29:47.450 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:47.939553) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016987939611, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1051, "num_deletes": 257, "total_data_size": 769276, "memory_usage": 788440, "flush_reason": "Manual Compaction"}
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016987947827, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 741105, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41286, "largest_seqno": 42336, "table_properties": {"data_size": 736493, "index_size": 2083, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11423, "raw_average_key_size": 19, "raw_value_size": 726591, "raw_average_value_size": 1268, "num_data_blocks": 92, "num_entries": 573, "num_filter_entries": 573, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016930, "oldest_key_time": 1765016930, "file_creation_time": 1765016987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 8314 microseconds, and 3718 cpu microseconds.
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:47.947873) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 741105 bytes OK
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:47.947901) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:47.949887) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:47.949912) EVENT_LOG_v1 {"time_micros": 1765016987949904, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:47.949938) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 764134, prev total WAL file size 764458, number of live WAL files 2.
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:47.950594) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353232' seq:72057594037927935, type:22 .. '6C6F676D0034373735' seq:0, type:0; will stop at (end)
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(723KB)], [75(19MB)]
Dec 06 10:29:47 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016987950653, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 21440969, "oldest_snapshot_seqno": -1}
Dec 06 10:29:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:48.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 14650 keys, 21308411 bytes, temperature: kUnknown
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016988063109, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 21308411, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21221386, "index_size": 49292, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36677, "raw_key_size": 393797, "raw_average_key_size": 26, "raw_value_size": 20969522, "raw_average_value_size": 1431, "num_data_blocks": 1837, "num_entries": 14650, "num_filter_entries": 14650, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765016987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:48.063627) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 21308411 bytes
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:48.065599) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.2 rd, 189.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 19.7 +0.0 blob) out(20.3 +0.0 blob), read-write-amplify(57.7) write-amplify(28.8) OK, records in: 15185, records dropped: 535 output_compression: NoCompression
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:48.065629) EVENT_LOG_v1 {"time_micros": 1765016988065615, "job": 46, "event": "compaction_finished", "compaction_time_micros": 112704, "compaction_time_cpu_micros": 56385, "output_level": 6, "num_output_files": 1, "total_output_size": 21308411, "num_input_records": 15185, "num_output_records": 14650, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016988065902, "job": 46, "event": "table_file_deletion", "file_number": 77}
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016988069273, "job": 46, "event": "table_file_deletion", "file_number": 75}
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:47.950494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:48.069398) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:48.069406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:48.069408) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:48.069412) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:29:48.069414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548788.localdomain ceph-mon[293643]: pgmap v730: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 35 KiB/s wr, 2 op/s
Dec 06 10:29:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:49.002 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:49.555 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:29:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:29:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:29:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:29:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:29:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1"
Dec 06 10:29:49 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:29:49.696 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:29:50 np0005548788.localdomain ceph-mon[293643]: pgmap v731: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 39 KiB/s wr, 2 op/s
Dec 06 10:29:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:51.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:51.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:29:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:51.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:29:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:51.022 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:29:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:52.117 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:52 np0005548788.localdomain ceph-mon[293643]: pgmap v732: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 15 KiB/s wr, 1 op/s
Dec 06 10:29:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:53.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:53.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:53 np0005548788.localdomain ceph-mon[293643]: pgmap v733: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 15 KiB/s wr, 1 op/s
Dec 06 10:29:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:29:54 np0005548788.localdomain podman[329677]: 2025-12-06 10:29:54.261465171 +0000 UTC m=+0.087308703 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:29:54 np0005548788.localdomain podman[329677]: 2025-12-06 10:29:54.342354745 +0000 UTC m=+0.168198237 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec 06 10:29:54 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:29:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:54.557 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:55.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:56 np0005548788.localdomain ceph-mon[293643]: pgmap v734: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s wr, 0 op/s
Dec 06 10:29:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/522862524' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:57.161 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:29:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:29:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:29:57 np0005548788.localdomain systemd[1]: tmp-crun.PTo9Fv.mount: Deactivated successfully.
Dec 06 10:29:57 np0005548788.localdomain podman[329703]: 2025-12-06 10:29:57.248785846 +0000 UTC m=+0.064218641 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:29:57 np0005548788.localdomain podman[329703]: 2025-12-06 10:29:57.260457946 +0000 UTC m=+0.075890741 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:29:57 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:29:57 np0005548788.localdomain systemd[1]: tmp-crun.CCrByt.mount: Deactivated successfully.
Dec 06 10:29:57 np0005548788.localdomain podman[329702]: 2025-12-06 10:29:57.312361146 +0000 UTC m=+0.129246256 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:29:57 np0005548788.localdomain podman[329702]: 2025-12-06 10:29:57.322087816 +0000 UTC m=+0.138972946 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:29:57 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:29:57 np0005548788.localdomain podman[329704]: 2025-12-06 10:29:57.411614286 +0000 UTC m=+0.221823790 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Dec 06 10:29:57 np0005548788.localdomain podman[329704]: 2025-12-06 10:29:57.423576635 +0000 UTC m=+0.233786169 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6)
Dec 06 10:29:57 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:29:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1038709677' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:29:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:29:58 np0005548788.localdomain ceph-mon[293643]: pgmap v735: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s wr, 0 op/s
Dec 06 10:29:58 np0005548788.localdomain ceph-mon[293643]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.033 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.033 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.034 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.034 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.034 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:29:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:29:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4167972808' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.545 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:29:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:29:59 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "format": "json"}]: dispatch
Dec 06 10:29:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4167972808' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.561 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.766 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.768 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11370MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.769 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.769 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.854 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:29:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:29:59.855 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:30:00 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [WRN] : overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.112 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing inventories for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.234 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating ProviderTree inventory for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.235 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Updating inventory in ProviderTree for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.252 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing aggregate associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.280 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Refreshing trait associations for resource provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1, traits: HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE4A,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.296 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:30:00 np0005548788.localdomain ceph-mon[293643]: pgmap v736: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 0 op/s
Dec 06 10:30:00 np0005548788.localdomain ceph-mon[293643]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:30:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:30:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2145475030' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.796 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.802 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.818 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.820 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:30:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:00.821 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:30:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2145475030' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:02.209 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:03 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "snap_name": "651d1145-bde4-4592-9ceb-573b615a4af1", "format": "json"}]: dispatch
Dec 06 10:30:03 np0005548788.localdomain ceph-mon[293643]: pgmap v737: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 8.7 KiB/s wr, 0 op/s
Dec 06 10:30:04 np0005548788.localdomain ceph-mon[293643]: pgmap v738: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:04.596 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "snap_name": "651d1145-bde4-4592-9ceb-573b615a4af1_41e045a4-96eb-4b50-9798-8ab25b9deb95", "force": true, "format": "json"}]: dispatch
Dec 06 10:30:06 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "snap_name": "651d1145-bde4-4592-9ceb-573b615a4af1", "force": true, "format": "json"}]: dispatch
Dec 06 10:30:06 np0005548788.localdomain ceph-mon[293643]: pgmap v739: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:07 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:07.248 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:08 np0005548788.localdomain ceph-mon[293643]: pgmap v740: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:30:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:30:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:30:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:30:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:30:09 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:30:09 np0005548788.localdomain podman[329809]: 2025-12-06 10:30:09.264866615 +0000 UTC m=+0.088989625 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:30:09 np0005548788.localdomain podman[329809]: 2025-12-06 10:30:09.275176132 +0000 UTC m=+0.099299132 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:30:09 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:30:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "format": "json"}]: dispatch
Dec 06 10:30:09 np0005548788.localdomain ceph-mon[293643]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "force": true, "format": "json"}]: dispatch
Dec 06 10:30:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:09.599 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:30:09 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 24K writes, 91K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.01 MB/s
                                                          Cumulative WAL: 24K writes, 8729 syncs, 2.79 writes per sync, written: 0.08 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 42K keys, 11K commit groups, 1.0 writes per commit group, ingest: 41.50 MB, 0.07 MB/s
                                                          Interval WAL: 11K writes, 4451 syncs, 2.48 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:30:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e279 do_prune osdmap full prune enabled
Dec 06 10:30:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e280 e280: 6 total, 6 up, 6 in
Dec 06 10:30:10 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e280: 6 total, 6 up, 6 in
Dec 06 10:30:10 np0005548788.localdomain ceph-mon[293643]: pgmap v741: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 45 KiB/s wr, 2 op/s
Dec 06 10:30:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:30:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:30:11 np0005548788.localdomain podman[329829]: 2025-12-06 10:30:11.276476927 +0000 UTC m=+0.100414337 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:30:11 np0005548788.localdomain podman[329829]: 2025-12-06 10:30:11.286627419 +0000 UTC m=+0.110564829 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 06 10:30:11 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:30:11 np0005548788.localdomain podman[329828]: 2025-12-06 10:30:11.380423162 +0000 UTC m=+0.205636932 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:30:11 np0005548788.localdomain podman[329828]: 2025-12-06 10:30:11.414971146 +0000 UTC m=+0.240184946 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:30:11 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:30:11 np0005548788.localdomain ceph-mon[293643]: osdmap e280: 6 total, 6 up, 6 in
Dec 06 10:30:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:12.251 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:12 np0005548788.localdomain ceph-mon[293643]: pgmap v743: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 44 KiB/s wr, 3 op/s
Dec 06 10:30:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:30:13 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.2 total, 600.0 interval
                                                          Cumulative writes: 22K writes, 86K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 22K writes, 7663 syncs, 2.96 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 12K writes, 47K keys, 12K commit groups, 1.0 writes per commit group, ingest: 38.78 MB, 0.06 MB/s
                                                          Interval WAL: 12K writes, 4934 syncs, 2.55 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:30:13 np0005548788.localdomain ceph-mon[293643]: pgmap v744: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 62 KiB/s wr, 3 op/s
Dec 06 10:30:14 np0005548788.localdomain sudo[329871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:30:14 np0005548788.localdomain sudo[329871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:30:14 np0005548788.localdomain sudo[329871]: pam_unix(sudo:session): session closed for user root
Dec 06 10:30:14 np0005548788.localdomain sudo[329889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:30:14 np0005548788.localdomain sudo[329889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:30:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:14.642 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:14 np0005548788.localdomain sudo[329889]: pam_unix(sudo:session): session closed for user root
Dec 06 10:30:15 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:30:15 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:30:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:30:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:30:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:30:15 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:30:15 np0005548788.localdomain sudo[329939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:30:15 np0005548788.localdomain sudo[329939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:30:15 np0005548788.localdomain sudo[329939]: pam_unix(sudo:session): session closed for user root
Dec 06 10:30:16 np0005548788.localdomain ceph-mon[293643]: pgmap v745: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 62 KiB/s wr, 3 op/s
Dec 06 10:30:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:17.297 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:30:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:30:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e280 do_prune osdmap full prune enabled
Dec 06 10:30:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e281 e281: 6 total, 6 up, 6 in
Dec 06 10:30:17 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e281: 6 total, 6 up, 6 in
Dec 06 10:30:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:30:18 np0005548788.localdomain ceph-mon[293643]: pgmap v746: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 62 KiB/s wr, 3 op/s
Dec 06 10:30:18 np0005548788.localdomain ceph-mon[293643]: osdmap e281: 6 total, 6 up, 6 in
Dec 06 10:30:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:30:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:30:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:30:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:30:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:19.645 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:30:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18786 "" "Go-http-client/1.1"
Dec 06 10:30:20 np0005548788.localdomain ceph-mon[293643]: pgmap v748: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 230 B/s rd, 54 KiB/s wr, 2 op/s
Dec 06 10:30:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:20.769 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:30:20 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:20.771 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:30:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:20.804 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:21 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:21.537 262572 INFO neutron.agent.linux.ip_lib [None req-5740969a-858a-4e5a-a189-3c92b5b53a43 - - - - - -] Device tapc9eb6a02-3c cannot be used as it has no MAC address
Dec 06 10:30:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:21.559 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:21 np0005548788.localdomain kernel: device tapc9eb6a02-3c entered promiscuous mode
Dec 06 10:30:21 np0005548788.localdomain NetworkManager[5968]: <info>  [1765017021.5668] manager: (tapc9eb6a02-3c): new Generic device (/org/freedesktop/NetworkManager/Devices/56)
Dec 06 10:30:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:21.567 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:21 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:21Z|00320|binding|INFO|Claiming lport c9eb6a02-3cb9-410b-b918-58fc1af7c1b4 for this chassis.
Dec 06 10:30:21 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:21Z|00321|binding|INFO|c9eb6a02-3cb9-410b-b918-58fc1af7c1b4: Claiming unknown
Dec 06 10:30:21 np0005548788.localdomain systemd-udevd[329967]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:30:21 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:21.578 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-9529e132-05a0-41f3-8c25-fc78fb407e55', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9529e132-05a0-41f3-8c25-fc78fb407e55', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f629f53128e4fc5819c111669f728b2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b595d0-03c7-4891-b6ea-6128858579c3, chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=c9eb6a02-3cb9-410b-b918-58fc1af7c1b4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:30:21 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:21.581 159620 INFO neutron.agent.ovn.metadata.agent [-] Port c9eb6a02-3cb9-410b-b918-58fc1af7c1b4 in datapath 9529e132-05a0-41f3-8c25-fc78fb407e55 bound to our chassis
Dec 06 10:30:21 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:21.583 159620 DEBUG neutron.agent.ovn.metadata.agent [-] Port 75f3e62e-3a0b-4f44-b823-dff9048f679d IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:30:21 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:21.584 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9529e132-05a0-41f3-8c25-fc78fb407e55, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:30:21 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:21.585 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[c7000d9e-0ef8-41d6-92d0-ba860d8d2033]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:30:21 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc9eb6a02-3c: No such device
Dec 06 10:30:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:21.610 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:21 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:21Z|00322|binding|INFO|Setting lport c9eb6a02-3cb9-410b-b918-58fc1af7c1b4 ovn-installed in OVS
Dec 06 10:30:21 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:21Z|00323|binding|INFO|Setting lport c9eb6a02-3cb9-410b-b918-58fc1af7c1b4 up in Southbound
Dec 06 10:30:21 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc9eb6a02-3c: No such device
Dec 06 10:30:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:21.613 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:21 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc9eb6a02-3c: No such device
Dec 06 10:30:21 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc9eb6a02-3c: No such device
Dec 06 10:30:21 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc9eb6a02-3c: No such device
Dec 06 10:30:21 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc9eb6a02-3c: No such device
Dec 06 10:30:21 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc9eb6a02-3c: No such device
Dec 06 10:30:21 np0005548788.localdomain virtnodedevd[229173]: ethtool ioctl error on tapc9eb6a02-3c: No such device
Dec 06 10:30:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:21.650 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:21 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:21.679 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:22.338 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: pgmap v749: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 48 KiB/s wr, 2 op/s
Dec 06 10:30:22 np0005548788.localdomain podman[330038]: 
Dec 06 10:30:22 np0005548788.localdomain podman[330038]: 2025-12-06 10:30:22.551508431 +0000 UTC m=+0.096974681 container create 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:30:22 np0005548788.localdomain systemd[1]: Started libpod-conmon-6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8.scope.
Dec 06 10:30:22 np0005548788.localdomain podman[330038]: 2025-12-06 10:30:22.505153552 +0000 UTC m=+0.050619852 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:30:22 np0005548788.localdomain systemd[1]: tmp-crun.Gz2GlS.mount: Deactivated successfully.
Dec 06 10:30:22 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:30:22 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f92e463d605b943806f92e9018afbb9b47796c4b9ead5d2a07526d6c0eff82bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:30:22 np0005548788.localdomain podman[330038]: 2025-12-06 10:30:22.630858297 +0000 UTC m=+0.176324577 container init 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:30:22 np0005548788.localdomain podman[330038]: 2025-12-06 10:30:22.641935259 +0000 UTC m=+0.187401519 container start 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:30:22 np0005548788.localdomain dnsmasq[330056]: started, version 2.85 cachesize 150
Dec 06 10:30:22 np0005548788.localdomain dnsmasq[330056]: DNS service limited to local subnets
Dec 06 10:30:22 np0005548788.localdomain dnsmasq[330056]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:30:22 np0005548788.localdomain dnsmasq[330056]: warning: no upstream servers configured
Dec 06 10:30:22 np0005548788.localdomain dnsmasq-dhcp[330056]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:30:22 np0005548788.localdomain dnsmasq[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/addn_hosts - 0 addresses
Dec 06 10:30:22 np0005548788.localdomain dnsmasq-dhcp[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/host
Dec 06 10:30:22 np0005548788.localdomain dnsmasq-dhcp[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/opts
Dec 06 10:30:22 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:22.800 262572 INFO neutron.agent.dhcp.agent [None req-d04d42fa-4260-46fe-819c-c189ecc157eb - - - - - -] DHCP configuration for ports {'175688c8-4bab-4718-a563-f62543aaa610'} is completed
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:22.974579) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017022974655, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 663, "num_deletes": 251, "total_data_size": 454950, "memory_usage": 466528, "flush_reason": "Manual Compaction"}
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017022981514, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 346964, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42337, "largest_seqno": 42999, "table_properties": {"data_size": 344004, "index_size": 879, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8408, "raw_average_key_size": 21, "raw_value_size": 337544, "raw_average_value_size": 843, "num_data_blocks": 39, "num_entries": 400, "num_filter_entries": 400, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016987, "oldest_key_time": 1765016987, "file_creation_time": 1765017022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 7006 microseconds, and 2642 cpu microseconds.
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:22.981589) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 346964 bytes OK
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:22.981614) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:22.983671) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:22.983694) EVENT_LOG_v1 {"time_micros": 1765017022983687, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:22.983716) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 451438, prev total WAL file size 451762, number of live WAL files 2.
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:22.984307) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323535' seq:72057594037927935, type:22 .. '6D6772737461740034353036' seq:0, type:0; will stop at (end)
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(338KB)], [78(20MB)]
Dec 06 10:30:22 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017022984388, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 21655375, "oldest_snapshot_seqno": -1}
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 14539 keys, 19629824 bytes, temperature: kUnknown
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017023077978, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 19629824, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19547948, "index_size": 44463, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36357, "raw_key_size": 391770, "raw_average_key_size": 26, "raw_value_size": 19302331, "raw_average_value_size": 1327, "num_data_blocks": 1635, "num_entries": 14539, "num_filter_entries": 14539, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765017022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:23.078432) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 19629824 bytes
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:23.080341) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 231.1 rd, 209.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 20.3 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(119.0) write-amplify(56.6) OK, records in: 15050, records dropped: 511 output_compression: NoCompression
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:23.080373) EVENT_LOG_v1 {"time_micros": 1765017023080359, "job": 48, "event": "compaction_finished", "compaction_time_micros": 93702, "compaction_time_cpu_micros": 42700, "output_level": 6, "num_output_files": 1, "total_output_size": 19629824, "num_input_records": 15050, "num_output_records": 14539, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017023080609, "job": 48, "event": "table_file_deletion", "file_number": 80}
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017023083948, "job": 48, "event": "table_file_deletion", "file_number": 78}
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:22.984131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:23.084049) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:23.084058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:23.084062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:23.084065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:30:23.084068) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548788.localdomain ceph-mon[293643]: pgmap v750: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:24 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:24.349 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:30:24Z, description=, device_id=574c884b-be04-4d2b-b229-29068bb8b5d2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c673b4c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c673b5e0>], id=95712e0f-bb36-44be-8120-8b2605969aa8, ip_allocation=immediate, mac_address=fa:16:3e:d4:74:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:30:18Z, description=, dns_domain=, id=9529e132-05a0-41f3-8c25-fc78fb407e55, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-401682385-network, port_security_enabled=True, project_id=8f629f53128e4fc5819c111669f728b2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56883, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3942, status=ACTIVE, subnets=['b29a7136-f445-41d8-b19e-2802b3a877ca'], tags=[], tenant_id=8f629f53128e4fc5819c111669f728b2, updated_at=2025-12-06T10:30:19Z, vlan_transparent=None, network_id=9529e132-05a0-41f3-8c25-fc78fb407e55, port_security_enabled=False, project_id=8f629f53128e4fc5819c111669f728b2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3950, status=DOWN, tags=[], tenant_id=8f629f53128e4fc5819c111669f728b2, updated_at=2025-12-06T10:30:24Z on network 9529e132-05a0-41f3-8c25-fc78fb407e55
Dec 06 10:30:24 np0005548788.localdomain dnsmasq[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/addn_hosts - 1 addresses
Dec 06 10:30:24 np0005548788.localdomain dnsmasq-dhcp[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/host
Dec 06 10:30:24 np0005548788.localdomain podman[330074]: 2025-12-06 10:30:24.574009109 +0000 UTC m=+0.062201089 container kill 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:30:24 np0005548788.localdomain dnsmasq-dhcp[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/opts
Dec 06 10:30:24 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:30:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:24.685 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:24 np0005548788.localdomain podman[330087]: 2025-12-06 10:30:24.72842592 +0000 UTC m=+0.121388414 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:30:24 np0005548788.localdomain podman[330087]: 2025-12-06 10:30:24.768085803 +0000 UTC m=+0.161048307 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:30:24 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:30:24 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:24.880 262572 INFO neutron.agent.dhcp.agent [None req-754bf941-01b9-49c3-b91a-b0055d345ec9 - - - - - -] DHCP configuration for ports {'95712e0f-bb36-44be-8120-8b2605969aa8'} is completed
Dec 06 10:30:25 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:25.772 159620 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61ffd9e7-81c6-44c4-94c0-846d9931f97c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:30:25 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:25.959 262572 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:30:24Z, description=, device_id=574c884b-be04-4d2b-b229-29068bb8b5d2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c664f3d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f18c664fd60>], id=95712e0f-bb36-44be-8120-8b2605969aa8, ip_allocation=immediate, mac_address=fa:16:3e:d4:74:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:30:18Z, description=, dns_domain=, id=9529e132-05a0-41f3-8c25-fc78fb407e55, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-401682385-network, port_security_enabled=True, project_id=8f629f53128e4fc5819c111669f728b2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56883, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3942, status=ACTIVE, subnets=['b29a7136-f445-41d8-b19e-2802b3a877ca'], tags=[], tenant_id=8f629f53128e4fc5819c111669f728b2, updated_at=2025-12-06T10:30:19Z, vlan_transparent=None, network_id=9529e132-05a0-41f3-8c25-fc78fb407e55, port_security_enabled=False, project_id=8f629f53128e4fc5819c111669f728b2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3950, status=DOWN, tags=[], tenant_id=8f629f53128e4fc5819c111669f728b2, updated_at=2025-12-06T10:30:24Z on network 9529e132-05a0-41f3-8c25-fc78fb407e55
Dec 06 10:30:26 np0005548788.localdomain ceph-mon[293643]: pgmap v751: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:26 np0005548788.localdomain podman[330134]: 2025-12-06 10:30:26.619721673 +0000 UTC m=+0.043908635 container kill 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:30:26 np0005548788.localdomain dnsmasq[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/addn_hosts - 1 addresses
Dec 06 10:30:26 np0005548788.localdomain dnsmasq-dhcp[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/host
Dec 06 10:30:26 np0005548788.localdomain dnsmasq-dhcp[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/opts
Dec 06 10:30:26 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:26.883 262572 INFO neutron.agent.dhcp.agent [None req-9bbed0ef-b87c-4447-884f-7735a5d039d8 - - - - - -] DHCP configuration for ports {'95712e0f-bb36-44be-8120-8b2605969aa8'} is completed
Dec 06 10:30:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:27.384 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:27Z|00324|ovn_bfd|INFO|Enabled BFD on interface ovn-afa07b-0
Dec 06 10:30:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:27Z|00325|ovn_bfd|INFO|Enabled BFD on interface ovn-bd2a75-0
Dec 06 10:30:27 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:27Z|00326|ovn_bfd|INFO|Enabled BFD on interface ovn-ca3c1f-0
Dec 06 10:30:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:27.941 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:27.957 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:27.963 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:27.967 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:27 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:27.980 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:28.048 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:30:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:30:28 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:30:28 np0005548788.localdomain podman[330158]: 2025-12-06 10:30:28.249340658 +0000 UTC m=+0.079296276 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:30:28 np0005548788.localdomain podman[330157]: 2025-12-06 10:30:28.268294972 +0000 UTC m=+0.098794157 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:30:28 np0005548788.localdomain podman[330157]: 2025-12-06 10:30:28.279699384 +0000 UTC m=+0.110198559 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:30:28 np0005548788.localdomain podman[330159]: 2025-12-06 10:30:28.323072631 +0000 UTC m=+0.145746734 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:30:28 np0005548788.localdomain podman[330158]: 2025-12-06 10:30:28.332974606 +0000 UTC m=+0.162930264 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:30:28 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:30:28 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:30:28 np0005548788.localdomain podman[330159]: 2025-12-06 10:30:28.439232332 +0000 UTC m=+0.261906485 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Dec 06 10:30:28 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:30:28 np0005548788.localdomain ceph-mon[293643]: pgmap v752: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:28 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:28.954 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:29.047 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:29.689 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:29.748 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:30 np0005548788.localdomain ceph-mon[293643]: pgmap v753: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Dec 06 10:30:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:32.422 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:32 np0005548788.localdomain ceph-mon[293643]: pgmap v754: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:32 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:33 np0005548788.localdomain ceph-mon[293643]: pgmap v755: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:34 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:34.720 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:36 np0005548788.localdomain ceph-mon[293643]: pgmap v756: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:37.452 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:37 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:38 np0005548788.localdomain ceph-mon[293643]: pgmap v757: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:30:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:30:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:30:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:30:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:30:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:30:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:30:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:30:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:30:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3426178596' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:30:39 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:30:39 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3426178596' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:30:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3426178596' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:30:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3426178596' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:30:39 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:39.747 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:40 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:30:40 np0005548788.localdomain systemd[1]: tmp-crun.oXbLoY.mount: Deactivated successfully.
Dec 06 10:30:40 np0005548788.localdomain podman[330220]: 2025-12-06 10:30:40.268477429 +0000 UTC m=+0.095697842 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:30:40 np0005548788.localdomain podman[330220]: 2025-12-06 10:30:40.286759282 +0000 UTC m=+0.113979705 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:30:40 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:30:40 np0005548788.localdomain ceph-mon[293643]: pgmap v758: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:30:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:30:42 np0005548788.localdomain systemd[1]: tmp-crun.I7Rota.mount: Deactivated successfully.
Dec 06 10:30:42 np0005548788.localdomain podman[330239]: 2025-12-06 10:30:42.275147858 +0000 UTC m=+0.090484321 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:30:42 np0005548788.localdomain podman[330239]: 2025-12-06 10:30:42.286761426 +0000 UTC m=+0.102097889 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:30:42 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:30:42 np0005548788.localdomain podman[330240]: 2025-12-06 10:30:42.373881302 +0000 UTC m=+0.186333776 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:30:42 np0005548788.localdomain podman[330240]: 2025-12-06 10:30:42.408645344 +0000 UTC m=+0.221097768 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:30:42 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:30:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:42.490 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:42 np0005548788.localdomain ceph-mon[293643]: pgmap v759: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:42 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:44 np0005548788.localdomain ceph-mon[293643]: pgmap v760: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:44 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:44.750 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e281 do_prune osdmap full prune enabled
Dec 06 10:30:45 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e282 e282: 6 total, 6 up, 6 in
Dec 06 10:30:45 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e282: 6 total, 6 up, 6 in
Dec 06 10:30:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:45.822 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:45.823 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:30:46 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:46.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e282 do_prune osdmap full prune enabled
Dec 06 10:30:46 np0005548788.localdomain ceph-mon[293643]: osdmap e282: 6 total, 6 up, 6 in
Dec 06 10:30:46 np0005548788.localdomain ceph-mon[293643]: pgmap v762: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:46 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/510296205' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:46 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e283 e283: 6 total, 6 up, 6 in
Dec 06 10:30:46 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e283: 6 total, 6 up, 6 in
Dec 06 10:30:47 np0005548788.localdomain ceph-mon[293643]: osdmap e283: 6 total, 6 up, 6 in
Dec 06 10:30:47 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/495746363' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:47.450 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:30:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:47.450 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:30:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:47.451 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:30:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:47.492 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:47 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:48 np0005548788.localdomain ceph-mon[293643]: pgmap v764: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:30:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:30:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:30:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156746 "" "Go-http-client/1.1"
Dec 06 10:30:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:30:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19266 "" "Go-http-client/1.1"
Dec 06 10:30:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:49.753 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:50.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:50 np0005548788.localdomain ceph-mon[293643]: pgmap v765: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 55 op/s
Dec 06 10:30:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:50Z|00327|ovn_bfd|INFO|Disabled BFD on interface ovn-afa07b-0
Dec 06 10:30:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:50Z|00328|ovn_bfd|INFO|Disabled BFD on interface ovn-bd2a75-0
Dec 06 10:30:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:50Z|00329|ovn_bfd|INFO|Disabled BFD on interface ovn-ca3c1f-0
Dec 06 10:30:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:50.649 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:50.656 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:50.665 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:50 np0005548788.localdomain dnsmasq[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/addn_hosts - 0 addresses
Dec 06 10:30:50 np0005548788.localdomain dnsmasq-dhcp[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/host
Dec 06 10:30:50 np0005548788.localdomain dnsmasq-dhcp[330056]: read /var/lib/neutron/dhcp/9529e132-05a0-41f3-8c25-fc78fb407e55/opts
Dec 06 10:30:50 np0005548788.localdomain podman[330299]: 2025-12-06 10:30:50.798954713 +0000 UTC m=+0.052617154 container kill 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:30:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:50.996 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:50 np0005548788.localdomain kernel: device tapc9eb6a02-3c left promiscuous mode
Dec 06 10:30:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:50Z|00330|binding|INFO|Releasing lport c9eb6a02-3cb9-410b-b918-58fc1af7c1b4 from this chassis (sb_readonly=0)
Dec 06 10:30:50 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:30:50Z|00331|binding|INFO|Setting lport c9eb6a02-3cb9-410b-b918-58fc1af7c1b4 down in Southbound
Dec 06 10:30:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:51.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:51.020 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:51 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:51.137 159620 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548788.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpbf0256c6-f802-5ff4-ad4a-1f56b41b3447-9529e132-05a0-41f3-8c25-fc78fb407e55', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9529e132-05a0-41f3-8c25-fc78fb407e55', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f629f53128e4fc5819c111669f728b2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548788.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08b595d0-03c7-4891-b6ea-6128858579c3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>], logical_port=c9eb6a02-3cb9-410b-b918-58fc1af7c1b4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3cb0d9b9a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:30:51 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:51.139 159620 INFO neutron.agent.ovn.metadata.agent [-] Port c9eb6a02-3cb9-410b-b918-58fc1af7c1b4 in datapath 9529e132-05a0-41f3-8c25-fc78fb407e55 unbound from our chassis
Dec 06 10:30:51 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:51.142 159620 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9529e132-05a0-41f3-8c25-fc78fb407e55, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:30:51 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:30:51.143 309209 DEBUG oslo.privsep.daemon [-] privsep: reply[c38fb66b-ed3b-4ff7-8529-f61eb20054bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:30:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:52.514 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:52 np0005548788.localdomain ceph-mon[293643]: pgmap v766: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 55 op/s
Dec 06 10:30:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:52 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e283 do_prune osdmap full prune enabled
Dec 06 10:30:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 e284: 6 total, 6 up, 6 in
Dec 06 10:30:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:53.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:53.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:30:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:53.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:30:53 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : osdmap e284: 6 total, 6 up, 6 in
Dec 06 10:30:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:53.032 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:30:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:53.032 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:53.033 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:54 np0005548788.localdomain ceph-mon[293643]: osdmap e284: 6 total, 6 up, 6 in
Dec 06 10:30:54 np0005548788.localdomain ceph-mon[293643]: pgmap v768: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 55 op/s
Dec 06 10:30:54 np0005548788.localdomain dnsmasq[330056]: exiting on receipt of SIGTERM
Dec 06 10:30:54 np0005548788.localdomain podman[330339]: 2025-12-06 10:30:54.435874047 +0000 UTC m=+0.062811087 container kill 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:30:54 np0005548788.localdomain systemd[1]: libpod-6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8.scope: Deactivated successfully.
Dec 06 10:30:54 np0005548788.localdomain podman[330354]: 2025-12-06 10:30:54.515779821 +0000 UTC m=+0.065506961 container died 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:30:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8-userdata-shm.mount: Deactivated successfully.
Dec 06 10:30:54 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-f92e463d605b943806f92e9018afbb9b47796c4b9ead5d2a07526d6c0eff82bf-merged.mount: Deactivated successfully.
Dec 06 10:30:54 np0005548788.localdomain podman[330354]: 2025-12-06 10:30:54.554806965 +0000 UTC m=+0.104534075 container cleanup 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:30:54 np0005548788.localdomain systemd[1]: libpod-conmon-6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8.scope: Deactivated successfully.
Dec 06 10:30:54 np0005548788.localdomain podman[330356]: 2025-12-06 10:30:54.585886423 +0000 UTC m=+0.128415311 container remove 6ee1e32292afaa713b63ba82bc10c3df4b653316e2f92390b606a04328545cc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9529e132-05a0-41f3-8c25-fc78fb407e55, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:30:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:54.757 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:54.796 262572 INFO neutron.agent.dhcp.agent [None req-e46f1308-5baf-4348-bbf9-4426946a576a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:30:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:54.797 262572 INFO neutron.agent.dhcp.agent [None req-e46f1308-5baf-4348-bbf9-4426946a576a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:30:54 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:30:54 np0005548788.localdomain neutron_dhcp_agent[262568]: 2025-12-06 10:30:54.859 262572 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:30:54 np0005548788.localdomain podman[330384]: 2025-12-06 10:30:54.905405624 +0000 UTC m=+0.084425584 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:30:54 np0005548788.localdomain podman[330384]: 2025-12-06 10:30:54.970842881 +0000 UTC m=+0.149862871 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:30:54 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:30:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:55.011 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:55 np0005548788.localdomain systemd[1]: run-netns-qdhcp\x2d9529e132\x2d05a0\x2d41f3\x2d8c25\x2dfc78fb407e55.mount: Deactivated successfully.
Dec 06 10:30:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:56.043 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:56 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:56.067 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:56 np0005548788.localdomain ceph-mon[293643]: pgmap v769: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 2.2 MiB/s wr, 47 op/s
Dec 06 10:30:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/11701804' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4191358629' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:57.558 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:58 np0005548788.localdomain ceph-mon[293643]: pgmap v770: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 2.0 MiB/s wr, 44 op/s
Dec 06 10:30:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:30:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:30:59 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:30:59 np0005548788.localdomain systemd[1]: tmp-crun.mnfR97.mount: Deactivated successfully.
Dec 06 10:30:59 np0005548788.localdomain podman[330410]: 2025-12-06 10:30:59.25981271 +0000 UTC m=+0.082661069 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:30:59 np0005548788.localdomain podman[330409]: 2025-12-06 10:30:59.329031514 +0000 UTC m=+0.154745062 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 06 10:30:59 np0005548788.localdomain podman[330411]: 2025-12-06 10:30:59.290470135 +0000 UTC m=+0.104908325 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:30:59 np0005548788.localdomain podman[330411]: 2025-12-06 10:30:59.373745593 +0000 UTC m=+0.188183773 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, version=9.6)
Dec 06 10:30:59 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:30:59 np0005548788.localdomain podman[330410]: 2025-12-06 10:30:59.394014028 +0000 UTC m=+0.216862397 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:30:59 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:30:59 np0005548788.localdomain podman[330409]: 2025-12-06 10:30:59.444501794 +0000 UTC m=+0.270215352 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 06 10:30:59 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:30:59 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:30:59.761 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.036 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.037 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.037 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.037 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.038 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:31:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:31:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3107954852' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.508 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:31:00 np0005548788.localdomain ceph-mon[293643]: pgmap v771: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3107954852' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.725 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.728 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11360MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.728 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.729 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.797 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.797 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:31:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:00.833 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:31:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:31:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3944337570' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:01.327 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:31:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:01.333 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:31:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:01.356 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:31:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:01.358 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:31:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:01.359 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:31:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3944337570' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:02.561 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:02 np0005548788.localdomain ceph-mon[293643]: pgmap v772: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:04 np0005548788.localdomain ceph-mon[293643]: pgmap v773: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:04 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:04.778 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:06 np0005548788.localdomain ceph-mon[293643]: pgmap v774: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:31:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:07.564 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:08 np0005548788.localdomain ceph-mon[293643]: pgmap v775: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:31:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:31:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:31:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:31:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:31:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:31:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:31:09 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:09.780 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:10 np0005548788.localdomain ceph-mon[293643]: pgmap v776: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:11 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:31:11 np0005548788.localdomain podman[330514]: 2025-12-06 10:31:11.274961392 +0000 UTC m=+0.098518208 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:31:11 np0005548788.localdomain podman[330514]: 2025-12-06 10:31:11.286747315 +0000 UTC m=+0.110304161 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:31:11 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:31:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:12.603 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:12 np0005548788.localdomain ceph-mon[293643]: pgmap v777: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:31:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:31:13 np0005548788.localdomain podman[330532]: 2025-12-06 10:31:13.25918765 +0000 UTC m=+0.086641183 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:31:13 np0005548788.localdomain podman[330532]: 2025-12-06 10:31:13.270797139 +0000 UTC m=+0.098250622 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:31:13 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:31:13 np0005548788.localdomain podman[330533]: 2025-12-06 10:31:13.36069167 +0000 UTC m=+0.186389298 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 06 10:31:13 np0005548788.localdomain podman[330533]: 2025-12-06 10:31:13.370924625 +0000 UTC m=+0.196622343 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:31:13 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:31:14 np0005548788.localdomain ceph-mon[293643]: pgmap v778: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:14 np0005548788.localdomain ceph-mon[293643]: log_channel(cluster) log [DBG] : mgrmap e55: np0005548790.kvkfyr(active, since 19m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:31:14 np0005548788.localdomain sshd[330573]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:14 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:14.818 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:15 np0005548788.localdomain ceph-mon[293643]: mgrmap e55: np0005548790.kvkfyr(active, since 19m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:31:15 np0005548788.localdomain sudo[330575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:31:15 np0005548788.localdomain sudo[330575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:31:15 np0005548788.localdomain sudo[330575]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:15 np0005548788.localdomain sudo[330593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:31:15 np0005548788.localdomain sudo[330593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:31:16 np0005548788.localdomain ceph-mon[293643]: pgmap v779: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:16 np0005548788.localdomain sudo[330593]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:16 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:31:16 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:31:16 np0005548788.localdomain sudo[330642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:31:16 np0005548788.localdomain sudo[330642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:31:16 np0005548788.localdomain sudo[330642]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:31:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:31:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:31:17 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:31:17 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:31:17 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:31:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:17.607 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:31:18 np0005548788.localdomain ceph-mon[293643]: pgmap v780: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:31:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:31:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:31:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:31:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:31:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18772 "" "Go-http-client/1.1"
Dec 06 10:31:19 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:19.822 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:20 np0005548788.localdomain ceph-mon[293643]: pgmap v781: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:22 np0005548788.localdomain ceph-mon[293643]: pgmap v782: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:22.606 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:24 np0005548788.localdomain ceph-mon[293643]: pgmap v783: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:24 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:24.868 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:25 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:31:25 np0005548788.localdomain podman[330660]: 2025-12-06 10:31:25.264437385 +0000 UTC m=+0.088707526 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:31:25 np0005548788.localdomain podman[330660]: 2025-12-06 10:31:25.329656256 +0000 UTC m=+0.153926297 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 06 10:31:25 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:31:26 np0005548788.localdomain ceph-mon[293643]: pgmap v784: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Dec 06 10:31:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:27.609 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:28 np0005548788.localdomain ceph-mon[293643]: pgmap v785: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Dec 06 10:31:29 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:29.871 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:29 np0005548788.localdomain ovn_controller[153970]: 2025-12-06T10:31:29Z|00332|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec 06 10:31:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:31:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:31:30 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:31:30 np0005548788.localdomain podman[330685]: 2025-12-06 10:31:30.260500905 +0000 UTC m=+0.082402661 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:31:30 np0005548788.localdomain podman[330685]: 2025-12-06 10:31:30.274686102 +0000 UTC m=+0.096587918 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:31:30 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:31:30 np0005548788.localdomain podman[330684]: 2025-12-06 10:31:30.36573355 +0000 UTC m=+0.191587969 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:31:30 np0005548788.localdomain podman[330684]: 2025-12-06 10:31:30.375571003 +0000 UTC m=+0.201425432 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Dec 06 10:31:30 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:31:30 np0005548788.localdomain podman[330686]: 2025-12-06 10:31:30.42118887 +0000 UTC m=+0.236131322 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6)
Dec 06 10:31:30 np0005548788.localdomain podman[330686]: 2025-12-06 10:31:30.460834172 +0000 UTC m=+0.275776644 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 10:31:30 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:31:30 np0005548788.localdomain ceph-mon[293643]: pgmap v786: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Dec 06 10:31:32 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:32.627 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:32 np0005548788.localdomain ceph-mon[293643]: pgmap v787: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:33 np0005548788.localdomain sshd[330744]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:33 np0005548788.localdomain sshd[330746]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:33 np0005548788.localdomain sshd[330746]: Accepted publickey for zuul from 38.102.83.114 port 56368 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:33 np0005548788.localdomain systemd-logind[765]: New session 74 of user zuul.
Dec 06 10:31:33 np0005548788.localdomain systemd[1]: Started Session 74 of User zuul.
Dec 06 10:31:33 np0005548788.localdomain sshd[330746]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:33 np0005548788.localdomain sudo[330766]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itodwefcoegqorsgyrrvpvyioqhplxtf ; /usr/bin/python3
Dec 06 10:31:33 np0005548788.localdomain sudo[330766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:33 np0005548788.localdomain python3[330768]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister
                                                           _uses_shell=True zuul_log_id=fa163ef9-e89a-7de2-0762-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 10:31:33 np0005548788.localdomain sshd[330744]: Received disconnect from 193.46.255.244 port 37180:11:  [preauth]
Dec 06 10:31:33 np0005548788.localdomain sshd[330744]: Disconnected from authenticating user root 193.46.255.244 port 37180 [preauth]
Dec 06 10:31:34 np0005548788.localdomain ceph-mon[293643]: pgmap v788: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:34 np0005548788.localdomain sudo[330766]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:35.103 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:36 np0005548788.localdomain ceph-mon[293643]: pgmap v789: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.578540) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097578584, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1034, "num_deletes": 252, "total_data_size": 1247992, "memory_usage": 1271760, "flush_reason": "Manual Compaction"}
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097587676, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 1214849, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43000, "largest_seqno": 44033, "table_properties": {"data_size": 1210299, "index_size": 2149, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10698, "raw_average_key_size": 20, "raw_value_size": 1200818, "raw_average_value_size": 2291, "num_data_blocks": 96, "num_entries": 524, "num_filter_entries": 524, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765017022, "oldest_key_time": 1765017022, "file_creation_time": 1765017097, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 9190 microseconds, and 4225 cpu microseconds.
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.587729) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 1214849 bytes OK
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.587753) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.589497) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.589516) EVENT_LOG_v1 {"time_micros": 1765017097589510, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.589537) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1243142, prev total WAL file size 1243142, number of live WAL files 2.
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.590139) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133333033' seq:72057594037927935, type:22 .. '7061786F73003133353535' seq:0, type:0; will stop at (end)
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(1186KB)], [81(18MB)]
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097590198, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 20844673, "oldest_snapshot_seqno": -1}
Dec 06 10:31:37 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:37.676 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 14535 keys, 19226893 bytes, temperature: kUnknown
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097710657, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 19226893, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19145746, "index_size": 43734, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36357, "raw_key_size": 392178, "raw_average_key_size": 26, "raw_value_size": 18900878, "raw_average_value_size": 1300, "num_data_blocks": 1601, "num_entries": 14535, "num_filter_entries": 14535, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015536, "oldest_key_time": 0, "file_creation_time": 1765017097, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "e6488537-0b5c-4b08-a28c-5ae83bf6aba9", "db_session_id": "IM8HDYJUB6SO5D2H95ML", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.711123) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 19226893 bytes
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.713329) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.9 rd, 159.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 18.7 +0.0 blob) out(18.3 +0.0 blob), read-write-amplify(33.0) write-amplify(15.8) OK, records in: 15063, records dropped: 528 output_compression: NoCompression
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.713360) EVENT_LOG_v1 {"time_micros": 1765017097713346, "job": 50, "event": "compaction_finished", "compaction_time_micros": 120591, "compaction_time_cpu_micros": 54424, "output_level": 6, "num_output_files": 1, "total_output_size": 19226893, "num_input_records": 15063, "num_output_records": 14535, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097713712, "job": 50, "event": "table_file_deletion", "file_number": 83}
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548788/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097717389, "job": 50, "event": "table_file_deletion", "file_number": 81}
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.590001) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.717525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.717535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.717538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.717541) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548788.localdomain ceph-mon[293643]: rocksdb: (Original Log Time 2025/12/06-10:31:37.717544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:38 np0005548788.localdomain ceph-mon[293643]: pgmap v790: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:31:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:31:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:31:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:31:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:31:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:31:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:31:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:31:39 np0005548788.localdomain sshd[330746]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:39 np0005548788.localdomain systemd[1]: session-74.scope: Deactivated successfully.
Dec 06 10:31:39 np0005548788.localdomain systemd-logind[765]: Session 74 logged out. Waiting for processes to exit.
Dec 06 10:31:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1455173782' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:31:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/1455173782' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:31:39 np0005548788.localdomain systemd-logind[765]: Removed session 74.
Dec 06 10:31:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:40.144 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:40 np0005548788.localdomain ceph-mon[293643]: pgmap v791: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:42 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:31:42 np0005548788.localdomain systemd[1]: tmp-crun.0OyO85.mount: Deactivated successfully.
Dec 06 10:31:42 np0005548788.localdomain podman[330771]: 2025-12-06 10:31:42.272550519 +0000 UTC m=+0.098251530 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:31:42 np0005548788.localdomain podman[330771]: 2025-12-06 10:31:42.289670377 +0000 UTC m=+0.115371378 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:31:42 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:31:42 np0005548788.localdomain ceph-mon[293643]: pgmap v792: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:42 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:42.686 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:31:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:31:43 np0005548788.localdomain podman[330790]: 2025-12-06 10:31:43.684558464 +0000 UTC m=+0.077267292 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:31:43 np0005548788.localdomain podman[330790]: 2025-12-06 10:31:43.689651972 +0000 UTC m=+0.082360880 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 06 10:31:43 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:31:43 np0005548788.localdomain podman[330789]: 2025-12-06 10:31:43.741867122 +0000 UTC m=+0.136017135 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:31:43 np0005548788.localdomain podman[330789]: 2025-12-06 10:31:43.773969001 +0000 UTC m=+0.168119044 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:31:43 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:31:44 np0005548788.localdomain ceph-mon[293643]: pgmap v793: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:45.148 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:45.360 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:45.361 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:31:46 np0005548788.localdomain ceph-mon[293643]: pgmap v794: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:46 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1627710630' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:46 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3484471563' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:31:47.452 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:31:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:31:47.453 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:31:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:31:47.453 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:31:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:47.725 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:48.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:48 np0005548788.localdomain ceph-mon[293643]: pgmap v795: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:31:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:31:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:31:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:31:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:31:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1"
Dec 06 10:31:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:50.152 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:50 np0005548788.localdomain ceph-mon[293643]: pgmap v796: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:51.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:52.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:52 np0005548788.localdomain sshd[330829]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:52 np0005548788.localdomain sshd[330829]: Accepted publickey for zuul from 38.102.83.114 port 39966 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:52 np0005548788.localdomain systemd-logind[765]: New session 75 of user zuul.
Dec 06 10:31:52 np0005548788.localdomain systemd[1]: Started Session 75 of User zuul.
Dec 06 10:31:52 np0005548788.localdomain sshd[330829]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:52 np0005548788.localdomain ceph-mon[293643]: pgmap v797: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:52 np0005548788.localdomain sudo[330833]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Dec 06 10:31:52 np0005548788.localdomain sudo[330833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:52 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:52.758 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:53.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:53 np0005548788.localdomain sudo[330833]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:53 np0005548788.localdomain sshd[330832]: Received disconnect from 38.102.83.114 port 39966:11: disconnected by user
Dec 06 10:31:53 np0005548788.localdomain sshd[330832]: Disconnected from user zuul 38.102.83.114 port 39966
Dec 06 10:31:53 np0005548788.localdomain sshd[330829]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:53 np0005548788.localdomain systemd[1]: session-75.scope: Deactivated successfully.
Dec 06 10:31:53 np0005548788.localdomain systemd-logind[765]: Session 75 logged out. Waiting for processes to exit.
Dec 06 10:31:53 np0005548788.localdomain systemd-logind[765]: Removed session 75.
Dec 06 10:31:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:54.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:54.006 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:31:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:54.007 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:31:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:54.022 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:31:54 np0005548788.localdomain ceph-mon[293643]: pgmap v798: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:54 np0005548788.localdomain sshd[330851]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:54 np0005548788.localdomain sshd[330851]: Accepted publickey for zuul from 38.102.83.114 port 39972 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:54 np0005548788.localdomain systemd-logind[765]: New session 76 of user zuul.
Dec 06 10:31:54 np0005548788.localdomain systemd[1]: Started Session 76 of User zuul.
Dec 06 10:31:54 np0005548788.localdomain sshd[330851]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:54 np0005548788.localdomain sudo[330855]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Dec 06 10:31:54 np0005548788.localdomain sudo[330855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:54 np0005548788.localdomain sudo[330855]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:54 np0005548788.localdomain sshd[330854]: Received disconnect from 38.102.83.114 port 39972:11: disconnected by user
Dec 06 10:31:54 np0005548788.localdomain sshd[330854]: Disconnected from user zuul 38.102.83.114 port 39972
Dec 06 10:31:54 np0005548788.localdomain sshd[330851]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:54 np0005548788.localdomain systemd[1]: session-76.scope: Deactivated successfully.
Dec 06 10:31:54 np0005548788.localdomain systemd-logind[765]: Session 76 logged out. Waiting for processes to exit.
Dec 06 10:31:54 np0005548788.localdomain systemd-logind[765]: Removed session 76.
Dec 06 10:31:54 np0005548788.localdomain sshd[330873]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:55.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:55 np0005548788.localdomain sshd[330873]: Accepted publickey for zuul from 38.102.83.114 port 39974 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:55 np0005548788.localdomain systemd-logind[765]: New session 77 of user zuul.
Dec 06 10:31:55 np0005548788.localdomain systemd[1]: Started Session 77 of User zuul.
Dec 06 10:31:55 np0005548788.localdomain sshd[330873]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:55.155 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:55 np0005548788.localdomain sudo[330877]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Dec 06 10:31:55 np0005548788.localdomain sudo[330877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:55 np0005548788.localdomain sudo[330877]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:55 np0005548788.localdomain sshd[330876]: Received disconnect from 38.102.83.114 port 39974:11: disconnected by user
Dec 06 10:31:55 np0005548788.localdomain sshd[330876]: Disconnected from user zuul 38.102.83.114 port 39974
Dec 06 10:31:55 np0005548788.localdomain sshd[330873]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:55 np0005548788.localdomain systemd[1]: session-77.scope: Deactivated successfully.
Dec 06 10:31:55 np0005548788.localdomain systemd-logind[765]: Session 77 logged out. Waiting for processes to exit.
Dec 06 10:31:55 np0005548788.localdomain systemd-logind[765]: Removed session 77.
Dec 06 10:31:55 np0005548788.localdomain sshd[330895]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:55 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:31:55 np0005548788.localdomain sshd[330895]: Accepted publickey for zuul from 38.102.83.114 port 53884 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:55 np0005548788.localdomain systemd-logind[765]: New session 78 of user zuul.
Dec 06 10:31:55 np0005548788.localdomain systemd[1]: Started Session 78 of User zuul.
Dec 06 10:31:55 np0005548788.localdomain sshd[330895]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:55 np0005548788.localdomain podman[330897]: 2025-12-06 10:31:55.8188382 +0000 UTC m=+0.097858578 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:31:55 np0005548788.localdomain podman[330897]: 2025-12-06 10:31:55.866404657 +0000 UTC m=+0.145425035 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:31:55 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:31:55 np0005548788.localdomain sudo[330916]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Dec 06 10:31:55 np0005548788.localdomain sudo[330916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:55 np0005548788.localdomain sudo[330916]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:55 np0005548788.localdomain sshd[330909]: Received disconnect from 38.102.83.114 port 53884:11: disconnected by user
Dec 06 10:31:55 np0005548788.localdomain sshd[330909]: Disconnected from user zuul 38.102.83.114 port 53884
Dec 06 10:31:55 np0005548788.localdomain sshd[330895]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:55 np0005548788.localdomain systemd[1]: session-78.scope: Deactivated successfully.
Dec 06 10:31:55 np0005548788.localdomain systemd-logind[765]: Session 78 logged out. Waiting for processes to exit.
Dec 06 10:31:55 np0005548788.localdomain systemd-logind[765]: Removed session 78.
Dec 06 10:31:56 np0005548788.localdomain sshd[330940]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:56 np0005548788.localdomain sshd[330940]: Accepted publickey for zuul from 38.102.83.114 port 53900 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:56 np0005548788.localdomain systemd-logind[765]: New session 79 of user zuul.
Dec 06 10:31:56 np0005548788.localdomain ceph-mon[293643]: pgmap v799: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:56 np0005548788.localdomain systemd[1]: Started Session 79 of User zuul.
Dec 06 10:31:56 np0005548788.localdomain sshd[330940]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:56 np0005548788.localdomain sudo[330944]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Dec 06 10:31:56 np0005548788.localdomain sudo[330944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:56 np0005548788.localdomain sudo[330944]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:56 np0005548788.localdomain sshd[330943]: Received disconnect from 38.102.83.114 port 53900:11: disconnected by user
Dec 06 10:31:56 np0005548788.localdomain sshd[330943]: Disconnected from user zuul 38.102.83.114 port 53900
Dec 06 10:31:56 np0005548788.localdomain sshd[330940]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:56 np0005548788.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Dec 06 10:31:56 np0005548788.localdomain systemd-logind[765]: Session 79 logged out. Waiting for processes to exit.
Dec 06 10:31:56 np0005548788.localdomain systemd-logind[765]: Removed session 79.
Dec 06 10:31:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:57.006 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:57 np0005548788.localdomain sshd[330962]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:57 np0005548788.localdomain sshd[330962]: Accepted publickey for zuul from 38.102.83.114 port 53906 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:57 np0005548788.localdomain systemd-logind[765]: New session 80 of user zuul.
Dec 06 10:31:57 np0005548788.localdomain systemd[1]: Started Session 80 of User zuul.
Dec 06 10:31:57 np0005548788.localdomain sshd[330962]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:57 np0005548788.localdomain sudo[330966]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Dec 06 10:31:57 np0005548788.localdomain sudo[330966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:57 np0005548788.localdomain sudo[330966]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:57 np0005548788.localdomain sshd[330965]: Received disconnect from 38.102.83.114 port 53906:11: disconnected by user
Dec 06 10:31:57 np0005548788.localdomain sshd[330965]: Disconnected from user zuul 38.102.83.114 port 53906
Dec 06 10:31:57 np0005548788.localdomain sshd[330962]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:57 np0005548788.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Dec 06 10:31:57 np0005548788.localdomain systemd-logind[765]: Session 80 logged out. Waiting for processes to exit.
Dec 06 10:31:57 np0005548788.localdomain systemd-logind[765]: Removed session 80.
Dec 06 10:31:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3668081092' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1158284102' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:31:57.795 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:57 np0005548788.localdomain sshd[330984]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:57 np0005548788.localdomain sshd[330984]: Accepted publickey for zuul from 38.102.83.114 port 53920 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:57 np0005548788.localdomain systemd-logind[765]: New session 81 of user zuul.
Dec 06 10:31:57 np0005548788.localdomain systemd[1]: Started Session 81 of User zuul.
Dec 06 10:31:57 np0005548788.localdomain sshd[330984]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:58 np0005548788.localdomain sudo[330988]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Dec 06 10:31:58 np0005548788.localdomain sudo[330988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:58 np0005548788.localdomain sudo[330988]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:58 np0005548788.localdomain sshd[330987]: Received disconnect from 38.102.83.114 port 53920:11: disconnected by user
Dec 06 10:31:58 np0005548788.localdomain sshd[330987]: Disconnected from user zuul 38.102.83.114 port 53920
Dec 06 10:31:58 np0005548788.localdomain sshd[330984]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:58 np0005548788.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Dec 06 10:31:58 np0005548788.localdomain systemd-logind[765]: Session 81 logged out. Waiting for processes to exit.
Dec 06 10:31:58 np0005548788.localdomain systemd-logind[765]: Removed session 81.
Dec 06 10:31:58 np0005548788.localdomain sshd[331006]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:58 np0005548788.localdomain sshd[331006]: Accepted publickey for zuul from 38.102.83.114 port 53936 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:58 np0005548788.localdomain systemd-logind[765]: New session 82 of user zuul.
Dec 06 10:31:58 np0005548788.localdomain ceph-mon[293643]: pgmap v800: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:58 np0005548788.localdomain systemd[1]: Started Session 82 of User zuul.
Dec 06 10:31:58 np0005548788.localdomain sshd[331006]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:58 np0005548788.localdomain sudo[331010]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Dec 06 10:31:58 np0005548788.localdomain sudo[331010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:58 np0005548788.localdomain sudo[331010]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:58 np0005548788.localdomain sshd[331009]: Received disconnect from 38.102.83.114 port 53936:11: disconnected by user
Dec 06 10:31:58 np0005548788.localdomain sshd[331009]: Disconnected from user zuul 38.102.83.114 port 53936
Dec 06 10:31:58 np0005548788.localdomain sshd[331006]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:58 np0005548788.localdomain systemd[1]: session-82.scope: Deactivated successfully.
Dec 06 10:31:58 np0005548788.localdomain systemd-logind[765]: Session 82 logged out. Waiting for processes to exit.
Dec 06 10:31:58 np0005548788.localdomain systemd-logind[765]: Removed session 82.
Dec 06 10:31:59 np0005548788.localdomain sshd[331028]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:59 np0005548788.localdomain sshd[331028]: Accepted publickey for zuul from 38.102.83.114 port 53948 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:59 np0005548788.localdomain systemd-logind[765]: New session 83 of user zuul.
Dec 06 10:31:59 np0005548788.localdomain systemd[1]: Started Session 83 of User zuul.
Dec 06 10:31:59 np0005548788.localdomain sshd[331028]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:59 np0005548788.localdomain sudo[331032]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Dec 06 10:31:59 np0005548788.localdomain sudo[331032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:59 np0005548788.localdomain sudo[331032]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:59 np0005548788.localdomain sshd[331031]: Received disconnect from 38.102.83.114 port 53948:11: disconnected by user
Dec 06 10:31:59 np0005548788.localdomain sshd[331031]: Disconnected from user zuul 38.102.83.114 port 53948
Dec 06 10:31:59 np0005548788.localdomain sshd[331028]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:59 np0005548788.localdomain systemd[1]: session-83.scope: Deactivated successfully.
Dec 06 10:31:59 np0005548788.localdomain systemd-logind[765]: Session 83 logged out. Waiting for processes to exit.
Dec 06 10:31:59 np0005548788.localdomain systemd-logind[765]: Removed session 83.
Dec 06 10:32:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:00.159 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:00 np0005548788.localdomain ceph-mon[293643]: pgmap v801: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.027 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.028 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.028 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.029 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.029 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:32:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:32:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:32:01 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:32:01 np0005548788.localdomain podman[331055]: 2025-12-06 10:32:01.27353875 +0000 UTC m=+0.103545374 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:32:01 np0005548788.localdomain podman[331055]: 2025-12-06 10:32:01.326630817 +0000 UTC m=+0.156637441 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:32:01 np0005548788.localdomain podman[331062]: 2025-12-06 10:32:01.337220623 +0000 UTC m=+0.157918840 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:32:01 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:32:01 np0005548788.localdomain podman[331062]: 2025-12-06 10:32:01.349542343 +0000 UTC m=+0.170240510 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:32:01 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:32:01 np0005548788.localdomain systemd[1]: tmp-crun.EV1lhA.mount: Deactivated successfully.
Dec 06 10:32:01 np0005548788.localdomain podman[331068]: 2025-12-06 10:32:01.473147204 +0000 UTC m=+0.292125978 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:32:01 np0005548788.localdomain podman[331068]: 2025-12-06 10:32:01.49249111 +0000 UTC m=+0.311469914 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Dec 06 10:32:01 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:32:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:32:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/70586077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.531 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.713 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.715 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11368MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.715 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.716 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.772 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.773 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:32:01 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:01.787 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:32:01 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/70586077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:32:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1583459833' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:02.246 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:32:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:02.253 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:32:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:02.276 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:32:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:02.279 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:32:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:02.279 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:32:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:02.844 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:02 np0005548788.localdomain ceph-mon[293643]: pgmap v802: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:02 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1583459833' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:04 np0005548788.localdomain ceph-mon[293643]: pgmap v803: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:05.162 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:06 np0005548788.localdomain ceph-mon[293643]: pgmap v804: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:07 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:07.883 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:08 np0005548788.localdomain ceph-mon[293643]: pgmap v805: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:32:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:32:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:32:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:32:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:32:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:32:08 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:32:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:10.165 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:10 np0005548788.localdomain ceph-mon[293643]: pgmap v806: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:12 np0005548788.localdomain ceph-mon[293643]: pgmap v807: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:12 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:12.913 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:13 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:32:13 np0005548788.localdomain podman[331151]: 2025-12-06 10:32:13.382900915 +0000 UTC m=+0.193032192 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:32:13 np0005548788.localdomain podman[331151]: 2025-12-06 10:32:13.444602187 +0000 UTC m=+0.254733434 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:32:13 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:32:14 np0005548788.localdomain ceph-mon[293643]: pgmap v808: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:32:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:32:14 np0005548788.localdomain podman[331170]: 2025-12-06 10:32:14.250108473 +0000 UTC m=+0.079967117 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:32:14 np0005548788.localdomain podman[331170]: 2025-12-06 10:32:14.258695338 +0000 UTC m=+0.088553982 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:32:14 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:32:14 np0005548788.localdomain systemd[1]: tmp-crun.X15t9M.mount: Deactivated successfully.
Dec 06 10:32:14 np0005548788.localdomain podman[331171]: 2025-12-06 10:32:14.31939659 +0000 UTC m=+0.146441577 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:32:14 np0005548788.localdomain podman[331171]: 2025-12-06 10:32:14.325993773 +0000 UTC m=+0.153038750 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 06 10:32:14 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:32:15 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:15.169 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:16 np0005548788.localdomain ceph-mon[293643]: pgmap v809: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:16 np0005548788.localdomain sudo[331211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:32:16 np0005548788.localdomain sudo[331211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:16 np0005548788.localdomain sudo[331211]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:16 np0005548788.localdomain sudo[331229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:32:16 np0005548788.localdomain sudo[331229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:17 np0005548788.localdomain systemd[1]: tmp-crun.Imf9lW.mount: Deactivated successfully.
Dec 06 10:32:17 np0005548788.localdomain podman[331319]: 2025-12-06 10:32:17.560391696 +0000 UTC m=+0.105019489 container exec 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.openshift.expose-services=, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, distribution-scope=public)
Dec 06 10:32:17 np0005548788.localdomain podman[331319]: 2025-12-06 10:32:17.700717702 +0000 UTC m=+0.245345485 container exec_died 15c2297c9785dae945027602d2b6435ecdbf67321f504642f01a824a3c4b0d09 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548788, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.expose-services=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, version=7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7)
Dec 06 10:32:17 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:17.948 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain sudo[331229]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain sudo[331439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:32:18 np0005548788.localdomain sudo[331439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:18 np0005548788.localdomain sudo[331439]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: pgmap v810: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548788.localdomain sudo[331457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:32:18 np0005548788.localdomain sudo[331457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:19 np0005548788.localdomain sudo[331457]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:19 np0005548788.localdomain sudo[331507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:32:19 np0005548788.localdomain sudo[331507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:19 np0005548788.localdomain sudo[331507]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:19 np0005548788.localdomain sudo[331525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 10:32:19 np0005548788.localdomain sudo[331525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:19 np0005548788.localdomain podman[240078]: time="2025-12-06T10:32:19Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:32:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:32:19 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:32:19 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:32:19 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18781 "" "Go-http-client/1.1"
Dec 06 10:32:20 np0005548788.localdomain podman[331584]: 
Dec 06 10:32:20 np0005548788.localdomain podman[331584]: 2025-12-06 10:32:20.129110965 +0000 UTC m=+0.079990418 container create af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_ride, release=1763362218, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, GIT_BRANCH=main)
Dec 06 10:32:20 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:20.173 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:20 np0005548788.localdomain systemd[1]: Started libpod-conmon-af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425.scope.
Dec 06 10:32:20 np0005548788.localdomain podman[331584]: 2025-12-06 10:32:20.09651819 +0000 UTC m=+0.047397653 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:32:20 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:32:20 np0005548788.localdomain podman[331584]: 2025-12-06 10:32:20.21392008 +0000 UTC m=+0.164799543 container init af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_ride, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-11-26T19:44:28Z, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, version=7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:32:20 np0005548788.localdomain podman[331584]: 2025-12-06 10:32:20.22530214 +0000 UTC m=+0.176181593 container start af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_ride, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:32:20 np0005548788.localdomain podman[331584]: 2025-12-06 10:32:20.225573029 +0000 UTC m=+0.176452482 container attach af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_ride, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:32:20 np0005548788.localdomain dreamy_ride[331598]: 167 167
Dec 06 10:32:20 np0005548788.localdomain systemd[1]: libpod-af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425.scope: Deactivated successfully.
Dec 06 10:32:20 np0005548788.localdomain podman[331584]: 2025-12-06 10:32:20.231349027 +0000 UTC m=+0.182228530 container died af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_ride, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-type=git, ceph=True, io.openshift.expose-services=, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:32:20 np0005548788.localdomain podman[331603]: 2025-12-06 10:32:20.339620675 +0000 UTC m=+0.092927936 container remove af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_ride, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True)
Dec 06 10:32:20 np0005548788.localdomain systemd[1]: libpod-conmon-af476cafe7ba87ace0547bb92567afeed0869ae63f3c0f3a0ea737513b23f425.scope: Deactivated successfully.
Dec 06 10:32:20 np0005548788.localdomain podman[331623]: 
Dec 06 10:32:20 np0005548788.localdomain podman[331623]: 2025-12-06 10:32:20.562665792 +0000 UTC m=+0.082150414 container create e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_wright, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:32:20 np0005548788.localdomain systemd[1]: Started libpod-conmon-e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7.scope.
Dec 06 10:32:20 np0005548788.localdomain ceph-mon[293643]: pgmap v811: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:20 np0005548788.localdomain systemd[1]: Started libcrun container.
Dec 06 10:32:20 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b81c56de46167c3175e2c1c9928a47a874270dddba6d9f40cd3a3c6b6804e638/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 10:32:20 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b81c56de46167c3175e2c1c9928a47a874270dddba6d9f40cd3a3c6b6804e638/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:32:20 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b81c56de46167c3175e2c1c9928a47a874270dddba6d9f40cd3a3c6b6804e638/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:32:20 np0005548788.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b81c56de46167c3175e2c1c9928a47a874270dddba6d9f40cd3a3c6b6804e638/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:32:20 np0005548788.localdomain podman[331623]: 2025-12-06 10:32:20.530922683 +0000 UTC m=+0.050407345 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:32:20 np0005548788.localdomain podman[331623]: 2025-12-06 10:32:20.633834947 +0000 UTC m=+0.153319569 container init e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_wright, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7)
Dec 06 10:32:20 np0005548788.localdomain podman[331623]: 2025-12-06 10:32:20.644798745 +0000 UTC m=+0.164283367 container start e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_wright, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph)
Dec 06 10:32:20 np0005548788.localdomain podman[331623]: 2025-12-06 10:32:20.645029042 +0000 UTC m=+0.164513694 container attach e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_wright, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=)
Dec 06 10:32:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-9197312e189c1028aa58aaaeacc33dba9e2db2217eccbf10dda4b329260d4e0b-merged.mount: Deactivated successfully.
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]: [
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:     {
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         "available": false,
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         "ceph_device": false,
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         "lsm_data": {},
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         "lvs": [],
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         "path": "/dev/sr0",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         "rejected_reasons": [
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "Has a FileSystem",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "Insufficient space (<5GB)"
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         ],
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         "sys_api": {
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "actuators": null,
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "device_nodes": "sr0",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "human_readable_size": "482.00 KB",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "id_bus": "ata",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "model": "QEMU DVD-ROM",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "nr_requests": "2",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "partitions": {},
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "path": "/dev/sr0",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "removable": "1",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "rev": "2.5+",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "ro": "0",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "rotational": "1",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "sas_address": "",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "sas_device_handle": "",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "scheduler_mode": "mq-deadline",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "sectors": 0,
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "sectorsize": "2048",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "size": 493568.0,
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "support_discard": "0",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "type": "disk",
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:             "vendor": "QEMU"
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:         }
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]:     }
Dec 06 10:32:21 np0005548788.localdomain charming_wright[331638]: ]
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:32:21 np0005548788.localdomain systemd[1]: libpod-e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7.scope: Deactivated successfully.
Dec 06 10:32:21 np0005548788.localdomain systemd[1]: libpod-e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7.scope: Consumed 1.029s CPU time.
Dec 06 10:32:21 np0005548788.localdomain podman[331623]: 2025-12-06 10:32:21.653236907 +0000 UTC m=+1.172721539 container died e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_wright, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, release=1763362218, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Dec 06 10:32:21 np0005548788.localdomain systemd[1]: var-lib-containers-storage-overlay-b81c56de46167c3175e2c1c9928a47a874270dddba6d9f40cd3a3c6b6804e638-merged.mount: Deactivated successfully.
Dec 06 10:32:21 np0005548788.localdomain podman[333391]: 2025-12-06 10:32:21.732386137 +0000 UTC m=+0.071722683 container remove e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_wright, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:32:21 np0005548788.localdomain systemd[1]: libpod-conmon-e4fac13b159e8381f163599aff9dc93c533d75856ea5a1fc3ebf3f863022a2f7.scope: Deactivated successfully.
Dec 06 10:32:21 np0005548788.localdomain sudo[331525]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:32:21 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:21 np0005548788.localdomain sudo[333406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:32:21 np0005548788.localdomain sudo[333406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:21 np0005548788.localdomain sudo[333406]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [INF] : from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: pgmap v812: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:32:22 np0005548788.localdomain ceph-mon[293643]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:22.954 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:23 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:24 np0005548788.localdomain ceph-mon[293643]: pgmap v813: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:25 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:25.176 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:26 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:32:26 np0005548788.localdomain podman[333424]: 2025-12-06 10:32:26.254727309 +0000 UTC m=+0.078076988 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:32:26 np0005548788.localdomain podman[333424]: 2025-12-06 10:32:26.330858726 +0000 UTC m=+0.154208415 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 10:32:26 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:32:26 np0005548788.localdomain ceph-mon[293643]: pgmap v814: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:27 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:27.977 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:28 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:28 np0005548788.localdomain ceph-mon[293643]: pgmap v815: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:30 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:30.180 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:30 np0005548788.localdomain ceph-mon[293643]: pgmap v816: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:32:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:32:32 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:32:32 np0005548788.localdomain podman[333449]: 2025-12-06 10:32:32.272985485 +0000 UTC m=+0.094863896 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 06 10:32:32 np0005548788.localdomain podman[333450]: 2025-12-06 10:32:32.319805108 +0000 UTC m=+0.137960344 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:32:32 np0005548788.localdomain podman[333449]: 2025-12-06 10:32:32.341013712 +0000 UTC m=+0.162892103 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 06 10:32:32 np0005548788.localdomain podman[333450]: 2025-12-06 10:32:32.353439165 +0000 UTC m=+0.171594411 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:32:32 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:32:32 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:32:32 np0005548788.localdomain podman[333451]: 2025-12-06 10:32:32.427152698 +0000 UTC m=+0.243378394 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Dec 06 10:32:32 np0005548788.localdomain podman[333451]: 2025-12-06 10:32:32.445585666 +0000 UTC m=+0.261811362 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 06 10:32:32 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:32:32 np0005548788.localdomain ceph-mon[293643]: pgmap v817: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:33 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:33.025 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:33 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:34 np0005548788.localdomain ceph-mon[293643]: pgmap v818: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:35 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:35.184 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:36 np0005548788.localdomain ceph-mon[293643]: pgmap v819: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:38 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:38.030 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:38 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:38 np0005548788.localdomain ceph-mon[293643]: pgmap v820: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:32:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:32:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:32:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:32:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:32:38 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:32:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:32:38 np0005548788.localdomain openstack_network_exporter[242070]: 
Dec 06 10:32:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3405830284' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:32:39 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.32:0/3405830284' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:32:40 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:40.188 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:40 np0005548788.localdomain ceph-mon[293643]: pgmap v821: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:42 np0005548788.localdomain ceph-mon[293643]: pgmap v822: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:43 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:43.061 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:43 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:43 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:32:43 np0005548788.localdomain podman[333507]: 2025-12-06 10:32:43.707559278 +0000 UTC m=+0.088109747 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:32:43 np0005548788.localdomain podman[333507]: 2025-12-06 10:32:43.723594192 +0000 UTC m=+0.104144471 container exec_died 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:32:43 np0005548788.localdomain systemd[1]: 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.service: Deactivated successfully.
Dec 06 10:32:44 np0005548788.localdomain ceph-mon[293643]: pgmap v823: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:44 np0005548788.localdomain sshd[333526]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:32:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.
Dec 06 10:32:45 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.
Dec 06 10:32:45 np0005548788.localdomain sshd[333526]: Accepted publickey for zuul from 192.168.122.10 port 38680 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:32:45 np0005548788.localdomain systemd-logind[765]: New session 84 of user zuul.
Dec 06 10:32:45 np0005548788.localdomain systemd[1]: Started Session 84 of User zuul.
Dec 06 10:32:45 np0005548788.localdomain sshd[333526]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:32:45 np0005548788.localdomain podman[333529]: 2025-12-06 10:32:45.146248436 +0000 UTC m=+0.092428251 container health_status 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:32:45 np0005548788.localdomain podman[333529]: 2025-12-06 10:32:45.187529869 +0000 UTC m=+0.133709674 container exec_died 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:32:45 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:45.191 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:45 np0005548788.localdomain systemd[1]: tmp-crun.BXsn5V.mount: Deactivated successfully.
Dec 06 10:32:45 np0005548788.localdomain podman[333530]: 2025-12-06 10:32:45.206305807 +0000 UTC m=+0.149357756 container health_status 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 06 10:32:45 np0005548788.localdomain systemd[1]: 315b3d7b89e87c75895fa5970504a4c9edeaa6236b5913d6ce13dc706a5741c7.service: Deactivated successfully.
Dec 06 10:32:45 np0005548788.localdomain sudo[333559]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Dec 06 10:32:45 np0005548788.localdomain sudo[333559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:32:45 np0005548788.localdomain podman[333530]: 2025-12-06 10:32:45.243596867 +0000 UTC m=+0.186648806 container exec_died 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:32:45 np0005548788.localdomain systemd[1]: 66dc07677e77a32c0a3298f16297588406f393027c74a18f6570173b633dd476.service: Deactivated successfully.
Dec 06 10:32:46 np0005548788.localdomain ceph-mon[293643]: pgmap v824: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:46 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1680123915' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:46 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2760621750' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:47.281 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:47 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:47.281 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:32:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:32:47.453 159620 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:32:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:32:47.453 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:32:47 np0005548788.localdomain ovn_metadata_agent[159615]: 2025-12-06 10:32:47.453 159620 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:32:48 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:48.063 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:48 np0005548788.localdomain ceph-mon[293643]: pgmap v825: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:48 np0005548788.localdomain ceph-mon[293643]: from='client.49656 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:48 np0005548788.localdomain ceph-mon[293643]: from='client.59071 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:48 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "status"} v 0)
Dec 06 10:32:48 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1062379559' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 06 10:32:49 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:49.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:49 np0005548788.localdomain podman[240078]: time="2025-12-06T10:32:49Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:32:49 np0005548788.localdomain ceph-mon[293643]: from='client.69257 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:49 np0005548788.localdomain ceph-mon[293643]: from='client.49665 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:49 np0005548788.localdomain ceph-mon[293643]: from='client.59077 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:49 np0005548788.localdomain ceph-mon[293643]: from='client.69266 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1062379559' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 06 10:32:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/4047748327' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 06 10:32:49 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3542862083' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 06 10:32:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:32:49 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154922 "" "Go-http-client/1.1"
Dec 06 10:32:49 np0005548788.localdomain podman[240078]: @ - - [06/Dec/2025:10:32:49 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18785 "" "Go-http-client/1.1"
Dec 06 10:32:50 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:50.193 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:50 np0005548788.localdomain ceph-mon[293643]: pgmap v826: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:51 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:51.000 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:51 np0005548788.localdomain ovs-vsctl[333817]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 06 10:32:52 np0005548788.localdomain virtqemud[229107]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 06 10:32:52 np0005548788.localdomain virtqemud[229107]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 06 10:32:52 np0005548788.localdomain virtqemud[229107]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 06 10:32:52 np0005548788.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 333969 (lsinitrd)
Dec 06 10:32:52 np0005548788.localdomain systemd[1]: Mounting EFI System Partition Automount...
Dec 06 10:32:52 np0005548788.localdomain systemd[1]: Mounted EFI System Partition Automount.
Dec 06 10:32:52 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: cache status {prefix=cache status} (starting...)
Dec 06 10:32:52 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:52 np0005548788.localdomain ceph-mon[293643]: pgmap v827: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:52 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: client ls {prefix=client ls} (starting...)
Dec 06 10:32:52 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:52 np0005548788.localdomain lvm[334062]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 10:32:52 np0005548788.localdomain lvm[334062]: VG ceph_vg0 finished
Dec 06 10:32:52 np0005548788.localdomain lvm[334066]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 10:32:52 np0005548788.localdomain lvm[334066]: VG ceph_vg1 finished
Dec 06 10:32:53 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:53.067 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: damage ls {prefix=damage ls} (starting...)
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: dump loads {prefix=dump loads} (starting...)
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548788.localdomain ceph-mon[293643]: from='client.49686 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:53 np0005548788.localdomain ceph-mon[293643]: from='client.59095 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Dec 06 10:32:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2201466626' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Dec 06 10:32:53 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 06 10:32:53 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:54.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:54 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:54.005 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4106160759' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 06 10:32:54 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:54 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 06 10:32:54 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config log"} v 0)
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3723108171' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3383880708' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: ops {prefix=ops} (starting...)
Dec 06 10:32:54 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.49692 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.69284 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.69287 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: pgmap v828: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.69299 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2201466626' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/595936019' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4106160759' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2581391178' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2116789574' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3723108171' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1477964806' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3383880708' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1949286175' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3501209297' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 06 10:32:54 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4083339819' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:55.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:55.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:32:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:55.005 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:32:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:55.019 281009 DEBUG nova.compute.manager [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:32:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:55.019 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1960715726' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:55.215 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:55 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: session ls {prefix=session ls} (starting...)
Dec 06 10:32:55 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf Can't run that command on an inactive MDS!
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/439920032' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mds[285743]: mds.mds.np0005548788.erzujf asok_command: status {prefix=status} (starting...)
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2396717601' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.49716 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.59125 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.69323 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1390228284' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4083339819' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2031437587' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3256193873' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/26526715' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1960715726' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3484329055' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/439920032' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3151208344' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/512599547' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:55 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2354599686' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/923202201' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3829669704' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3306622931' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3917801459' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1497560563' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2669572884' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.59170 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.49767 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: pgmap v829: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.59182 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.69380 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2396717601' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.49785 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1988228710' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/923202201' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3829669704' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3306622931' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3917801459' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3394250906' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1911255415' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/608376543' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2522593498' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1497560563' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2669572884' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 06 10:32:56 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/558313208' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:57.015 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1118821392' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.
Dec 06 10:32:57 np0005548788.localdomain podman[334628]: 2025-12-06 10:32:57.258519082 +0000 UTC m=+0.077724657 container health_status 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:32:57 np0005548788.localdomain podman[334628]: 2025-12-06 10:32:57.309826914 +0000 UTC m=+0.129032539 container exec_died 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:32:57 np0005548788.localdomain systemd[1]: 948061bf9fa5951b8ed608050cc1d059e1f3cfe7caf10cce983e53be093e05b7.service: Deactivated successfully.
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1880444688' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1069869138' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.69395 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/4026296822' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/192676122' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3282262568' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1118821392' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2399987487' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2365653258' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1461476266' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1880444688' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 06 10:32:57 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1069869138' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:58.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2756974343' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 06 10:32:58 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:32:58.107 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2621128474' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.59227 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.49836 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.69452 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.69446 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: pgmap v830: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.59254 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.69464 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.49863 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2083356224' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1654648070' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2756974343' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/997443539' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2621128474' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:58 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/880484158' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4179676265' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:11.867900+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:12.868050+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:13.868232+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 674798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:14.868406+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 92 heartbeat osd_stat(store_statfs(0x1bbee9000/0x0/0x1bfc00000, data 0xcc42d0/0xd44000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:15.868620+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:16.868835+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:17.869081+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:18.869308+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76824576 unmapped: 2482176 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 674798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 88.718849182s of 88.731254578s, submitted: 3
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 33
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now 
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/2974129872
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect No active mgr available yet
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_reset con 0x55c53afa7800 session 0x55c53b4cfc20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:19.869497+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76840960 unmapped: 2465792 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 34
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.103:6800/180363885,v1:172.18.0.103:6801/180363885]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect Starting new session with [v2:172.18.0.103:6800/180363885,v1:172.18.0.103:6801/180363885]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:20.869738+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76840960 unmapped: 2465792 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:21.869925+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76840960 unmapped: 2465792 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 35
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.103:6800/180363885,v1:172.18.0.103:6801/180363885]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:22.870519+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77029376 unmapped: 2277376 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:23.870845+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77029376 unmapped: 2277376 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:24.871012+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 36
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.103:6800/180363885,v1:172.18.0.103:6801/180363885]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:25.871415+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:26.871588+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:27.871750+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:28.871883+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:29.872055+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:30.872279+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:31.872464+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:32.872668+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:33.872832+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:34.873016+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:35.873233+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77045760 unmapped: 2260992 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 37
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.103:6800/180363885,v1:172.18.0.103:6801/180363885]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:36.873442+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:37.873663+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:38.873941+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:39.874110+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:40.874332+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:41.874495+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:42.874650+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:43.874903+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.2 total, 600.0 interval
                                                          Cumulative writes: 5387 writes, 23K keys, 5387 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5387 writes, 687 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 43 writes, 135 keys, 43 commit groups, 1.0 writes per commit group, ingest: 0.21 MB, 0.00 MB/s
                                                          Interval WAL: 43 writes, 21 syncs, 2.05 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:44.875176+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:45.875583+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:46.875776+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:47.875999+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:48.876259+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:49.876439+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:50.876748+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:51.876989+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:52.877267+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:53.877447+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:54.877619+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:55.877793+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:56.877986+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:57.878270+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:58.878469+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:59.878636+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_refused con 0x55c538909800 session 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:00.878994+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:01.879164+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:02.879335+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:03.879541+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 677798 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:04.879763+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:05.879974+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:06.880180+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77053952 unmapped: 2252800 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:07.880415+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 38
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now 
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect Terminating session with v2:172.18.0.103:6800/180363885
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect No active mgr available yet
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 ms_handle_reset con 0x55c537d43400 session 0x55c53b4cef00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 heartbeat osd_stat(store_statfs(0x1bbee5000/0x0/0x1bfc00000, data 0xcc6559/0xd48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 48.623813629s of 48.634334564s, submitted: 2
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a7000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76201984 unmapped: 3104768 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:08.880708+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 39
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2148019987,v1:172.18.0.106:6811/2148019987]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/2148019987,v1:172.18.0.106:6811/2148019987]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: get_auth_request con 0x55c53b532800 auth_method 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_configure stats_period=5
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76201984 unmapped: 3104768 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:09.880913+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76201984 unmapped: 3104768 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:10.881130+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76201984 unmapped: 3104768 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2148019987,v1:172.18.0.106:6811/2148019987]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:11.881342+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76201984 unmapped: 3104768 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:12.881518+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76201984 unmapped: 3104768 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:13.881691+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 41
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2148019987,v1:172.18.0.106:6811/2148019987]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:14.881868+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:15.882047+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:16.882276+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:17.882504+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:18.882720+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:19.882987+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:20.883286+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:21.883462+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:22.883634+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:23.883799+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:24.883997+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:25.884291+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:26.884526+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:27.884725+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:28.884932+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:29.885312+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:30.885600+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:31.885817+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:32.886008+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:33.886282+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:34.886536+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:35.886757+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:36.886964+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:37.887179+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:38.887396+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:39.887571+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:40.887796+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:41.887950+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:42.888103+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:43.888306+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:44.888503+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:45.888698+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:46.888894+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:47.889089+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:48.889250+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:49.889423+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:50.889660+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:51.889845+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:52.890004+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:53.890176+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:54.890456+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:55.890632+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:56.890821+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:57.891035+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:58.891257+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:59.891457+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:00.891653+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:01.891850+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:02.892040+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:03.892250+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76398592 unmapped: 2908160 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:04.892417+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76406784 unmapped: 2899968 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:05.892720+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76406784 unmapped: 2899968 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:06.892968+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76406784 unmapped: 2899968 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:07.893355+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76406784 unmapped: 2899968 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:08.893515+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76406784 unmapped: 2899968 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 681470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:09.893688+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76406784 unmapped: 2899968 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:10.893965+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76406784 unmapped: 2899968 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 42
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now 
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/2148019987
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect No active mgr available yet
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 ms_handle_reset con 0x55c53b0a7000 session 0x55c53b4ced20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537f3e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:11.894125+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 63.843162537s of 63.848369598s, submitted: 1
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbee0000/0x0/0x1bfc00000, data 0xcc893a/0xd4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76693504 unmapped: 2613248 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 43
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: get_auth_request con 0x55c53847dc00 auth_method 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_configure stats_period=5
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:12.894270+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76693504 unmapped: 2613248 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:13.894440+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76693504 unmapped: 2613248 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:14.894558+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 44
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76701696 unmapped: 2605056 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:15.894707+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76701696 unmapped: 2605056 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 45
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:16.894860+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:17.895048+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:18.895294+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:19.895473+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:20.895709+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:21.895937+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:22.896114+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:23.896299+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:24.896510+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:25.896677+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:26.896869+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:27.897081+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:28.897282+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:29.897476+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:30.897835+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:31.898078+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:32.898277+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:33.898531+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:34.898754+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:35.899052+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:36.899381+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:37.899592+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:38.899793+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:39.899982+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:40.900227+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:41.900405+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:42.900596+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:43.900785+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:44.900945+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:45.901116+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:46.901262+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:47.901514+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:48.901711+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:49.901859+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:50.902029+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:51.902245+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:52.902398+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:53.902570+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:54.902750+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:55.903709+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:56.903943+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:57.904124+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:58.904275+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:59.904485+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:00.904705+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:01.904880+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:02.905186+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:03.905585+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:04.905780+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:05.906032+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:06.906268+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:07.906434+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:08.906606+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:09.906764+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:10.906961+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:11.907168+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:12.907354+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:13.907548+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:14.907735+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:15.907900+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:16.908104+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:17.908322+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:18.908509+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:19.908704+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:20.908923+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:21.909125+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:22.909582+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:23.910183+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:24.910776+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:25.911335+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:26.911778+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:27.912127+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:28.912376+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:29.912532+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:30.913015+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:31.913396+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:32.913784+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:33.913990+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:34.914386+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:35.914693+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:36.914944+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:37.915106+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:38.915399+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:39.915795+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 684470 data_alloc: 285212672 data_used: 1511424
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 heartbeat osd_stat(store_statfs(0x1bbedc000/0x0/0x1bfc00000, data 0xccae73/0xd50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:40.916176+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:41.916362+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76464128 unmapped: 2842624 heap: 79306752 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 90.531120300s of 90.535911560s, submitted: 1
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:42.916708+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76488704 unmapped: 9576448 heap: 86065152 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 46
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:43.916891+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 96 heartbeat osd_stat(store_statfs(0x1bba69000/0x0/0x1bfc00000, data 0x113d3a6/0x11c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76619776 unmapped: 17842176 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:44.917071+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76709888 unmapped: 17752064 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 831325 data_alloc: 285212672 data_used: 1523712
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 ms_handle_reset con 0x55c538909c00 session 0x55c53a6543c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:45.917257+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:46.917401+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:47.917572+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:48.917914+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 heartbeat osd_stat(store_statfs(0x1baa63000/0x0/0x1bfc00000, data 0x213f8e9/0x21c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:49.918101+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 834773 data_alloc: 285212672 data_used: 1523712
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:50.918537+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 heartbeat osd_stat(store_statfs(0x1baa63000/0x0/0x1bfc00000, data 0x213f8e9/0x21c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:51.918854+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:52.919080+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 heartbeat osd_stat(store_statfs(0x1baa63000/0x0/0x1bfc00000, data 0x213f8e9/0x21c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:53.919243+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:54.919430+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 834773 data_alloc: 285212672 data_used: 1523712
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 heartbeat osd_stat(store_statfs(0x1baa63000/0x0/0x1bfc00000, data 0x213f8e9/0x21c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:55.919641+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:56.919912+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:57.920098+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:58.920398+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:59.920667+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 834773 data_alloc: 285212672 data_used: 1523712
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 heartbeat osd_stat(store_statfs(0x1baa63000/0x0/0x1bfc00000, data 0x213f8e9/0x21c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:00.920861+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:01.921034+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:02.921244+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:03.921433+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:04.921699+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 834773 data_alloc: 285212672 data_used: 1523712
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:05.921918+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 heartbeat osd_stat(store_statfs(0x1baa63000/0x0/0x1bfc00000, data 0x213f8e9/0x21c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:06.922189+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:07.922554+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:08.922783+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76718080 unmapped: 17743872 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 26.771194458s of 26.907588959s, submitted: 18
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 heartbeat osd_stat(store_statfs(0x1baa65000/0x0/0x1bfc00000, data 0x213f8e9/0x21c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:09.922988+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 98 ms_handle_reset con 0x55c53890c400 session 0x55c53b749680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76726272 unmapped: 17735680 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 840558 data_alloc: 285212672 data_used: 1536000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:10.923245+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76726272 unmapped: 17735680 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:11.923425+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 76726272 unmapped: 17735680 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:12.923606+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77783040 unmapped: 16678912 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 99 ms_handle_reset con 0x55c537d43400 session 0x55c53b544960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:13.923834+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77832192 unmapped: 16629760 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 99 heartbeat osd_stat(store_statfs(0x1baa5d000/0x0/0x1bfc00000, data 0x21443a3/0x21d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:14.924007+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77832192 unmapped: 16629760 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 841516 data_alloc: 285212672 data_used: 1548288
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:15.924163+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77832192 unmapped: 16629760 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 99 heartbeat osd_stat(store_statfs(0x1baa5d000/0x0/0x1bfc00000, data 0x21443a3/0x21d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:16.924314+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77832192 unmapped: 16629760 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:17.924482+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77832192 unmapped: 16629760 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:18.924706+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77832192 unmapped: 16629760 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:19.924996+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77832192 unmapped: 16629760 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 841516 data_alloc: 285212672 data_used: 1548288
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.993021965s of 11.191704750s, submitted: 52
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 99 heartbeat osd_stat(store_statfs(0x1baa5d000/0x0/0x1bfc00000, data 0x21443a3/0x21d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:20.925220+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77946880 unmapped: 16515072 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 100 ms_handle_reset con 0x55c538909c00 session 0x55c53b0f30e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:21.925370+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77971456 unmapped: 16490496 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:22.925682+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78012416 unmapped: 16449536 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:23.925878+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78012416 unmapped: 16449536 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 101 heartbeat osd_stat(store_statfs(0x1baa50000/0x0/0x1bfc00000, data 0x2149161/0x21dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:24.926125+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78012416 unmapped: 16449536 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 856744 data_alloc: 285212672 data_used: 1560576
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:25.926332+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78012416 unmapped: 16449536 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:26.926487+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77791232 unmapped: 16670720 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:27.926636+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77807616 unmapped: 16654336 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 102 ms_handle_reset con 0x55c53890c400 session 0x55c53b0f32c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:28.926776+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77324288 unmapped: 17137664 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:29.927054+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77324288 unmapped: 17137664 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 855923 data_alloc: 285212672 data_used: 1572864
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 102 heartbeat osd_stat(store_statfs(0x1baa50000/0x0/0x1bfc00000, data 0x214b292/0x21dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:30.927349+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77324288 unmapped: 17137664 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:31.927476+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77324288 unmapped: 17137664 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:32.927685+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77324288 unmapped: 17137664 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:33.927852+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77324288 unmapped: 17137664 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 102 heartbeat osd_stat(store_statfs(0x1baa50000/0x0/0x1bfc00000, data 0x214b292/0x21dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:34.928054+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77324288 unmapped: 17137664 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 855923 data_alloc: 285212672 data_used: 1572864
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:35.928254+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77324288 unmapped: 17137664 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 102 heartbeat osd_stat(store_statfs(0x1baa50000/0x0/0x1bfc00000, data 0x214b292/0x21dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 102 handle_osd_map epochs [102,103], i have 102, src has [1,103]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 16.082559586s of 16.388864517s, submitted: 94
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:36.928423+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1baa4c000/0x0/0x1bfc00000, data 0x214d6ab/0x21e1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:37.928585+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:38.928812+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:39.929024+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 859069 data_alloc: 285212672 data_used: 1585152
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:40.929396+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1baa4d000/0x0/0x1bfc00000, data 0x214d6ab/0x21e1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:41.929579+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:42.929864+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:43.930102+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:44.930334+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 859069 data_alloc: 285212672 data_used: 1585152
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:45.930558+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77340672 unmapped: 17121280 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1baa4d000/0x0/0x1bfc00000, data 0x214d6ab/0x21e1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a7000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.855633736s of 10.001469612s, submitted: 35
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:46.930720+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c53b0a7000 session 0x55c53b0f34a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 77979648 unmapped: 16482304 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b532800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c53b532800 session 0x55c53b0f3c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:47.930861+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78061568 unmapped: 16400384 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:48.931038+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78061568 unmapped: 16400384 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c537d43400 session 0x55c53b5443c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:49.931230+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78143488 unmapped: 16318464 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 907158 data_alloc: 285212672 data_used: 1585152
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1ba55f000/0x0/0x1bfc00000, data 0x2639730/0x26cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:50.931487+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:51.931662+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:52.931839+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:53.932007+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1ba55f000/0x0/0x1bfc00000, data 0x2639730/0x26cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:54.932218+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 910998 data_alloc: 285212672 data_used: 2109440
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:55.932426+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:56.932621+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:57.932825+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:58.933030+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:59.933825+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1ba55f000/0x0/0x1bfc00000, data 0x2639730/0x26cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 78512128 unmapped: 15949824 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 910998 data_alloc: 285212672 data_used: 2109440
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:00.934092+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1ba55f000/0x0/0x1bfc00000, data 0x2639730/0x26cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 14.065759659s of 14.183358192s, submitted: 21
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 79847424 unmapped: 14614528 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1ba55f000/0x0/0x1bfc00000, data 0x2639730/0x26cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:01.934260+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 82763776 unmapped: 11698176 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:02.934432+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 82591744 unmapped: 11870208 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:03.934613+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 85114880 unmapped: 9347072 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:04.934797+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 85458944 unmapped: 9003008 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1017244 data_alloc: 285212672 data_used: 2113536
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:05.935032+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 85458944 unmapped: 9003008 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c538909c00 session 0x55c53a1843c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:06.935284+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 85475328 unmapped: 8986624 heap: 94461952 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 heartbeat osd_stat(store_statfs(0x1b8710000/0x0/0x1bfc00000, data 0x32e2730/0x3378000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c53890c400 session 0x55c53a7a8000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:07.935429+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a7000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c53b0a7000 session 0x55c53a6543c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c53afa7800 session 0x55c53b4ced20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c537d43400 session 0x55c538526960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 ms_handle_reset con 0x55c538909c00 session 0x55c53aab0d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 87801856 unmapped: 18210816 heap: 106012672 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:08.935625+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 104 ms_handle_reset con 0x55c53afa6c00 session 0x55c53a655a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 104 heartbeat osd_stat(store_statfs(0x1b7995000/0x0/0x1bfc00000, data 0x4063730/0x40f9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 87588864 unmapped: 18423808 heap: 106012672 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:09.935776+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 104 ms_handle_reset con 0x55c53890c400 session 0x55c53a60f4a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a7000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 95076352 unmapped: 18489344 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1282160 data_alloc: 301989888 data_used: 6873088
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:10.935996+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.955270767s of 10.002805710s, submitted: 240
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 104 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 105 ms_handle_reset con 0x55c53b0a7000 session 0x55c53a654000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 95125504 unmapped: 18440192 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:11.936253+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 105 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 95174656 unmapped: 18391040 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:12.936389+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 95174656 unmapped: 18391040 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:13.936558+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 106 ms_handle_reset con 0x55c538909c00 session 0x55c53b4cfc20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 88924160 unmapped: 24641536 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:14.936744+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 106 heartbeat osd_stat(store_statfs(0x1b7a67000/0x0/0x1bfc00000, data 0x3f8779b/0x4023000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 88924160 unmapped: 24641536 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1134157 data_alloc: 285212672 data_used: 1613824
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:15.936941+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 88932352 unmapped: 24633344 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:16.937114+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 90087424 unmapped: 23478272 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:17.937249+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 94076928 unmapped: 19488768 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:18.937406+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 107 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 95846400 unmapped: 17719296 heap: 113565696 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b7a66000/0x0/0x1bfc00000, data 0x3f89bb4/0x4027000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:19.937549+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 107 ms_handle_reset con 0x55c53890c400 session 0x55c53a67a3c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b7a66000/0x0/0x1bfc00000, data 0x3f89bb4/0x4027000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113500160 unmapped: 3497984 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1307427 data_alloc: 301989888 data_used: 21409792
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:20.937745+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b7336000/0x0/0x1bfc00000, data 0x46babb4/0x4758000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113500160 unmapped: 3497984 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:21.937914+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.581859589s of 10.898261070s, submitted: 89
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113672192 unmapped: 3325952 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:22.938032+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106758144 unmapped: 10240000 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384ca000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 108 ms_handle_reset con 0x55c5384ca000 session 0x55c53a5fc5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:23.938324+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98934784 unmapped: 18063360 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:24.938483+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100466688 unmapped: 16531456 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 108 heartbeat osd_stat(store_statfs(0x1b83be000/0x0/0x1bfc00000, data 0x362f077/0x36cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1163865 data_alloc: 301989888 data_used: 13066240
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:25.938676+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100499456 unmapped: 16498688 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:26.938888+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100499456 unmapped: 16498688 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:27.938974+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104357888 unmapped: 12640256 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:28.939139+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 108 ms_handle_reset con 0x55c537d43400 session 0x55c53a6b32c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102940672 unmapped: 14057472 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:29.939277+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 108 heartbeat osd_stat(store_statfs(0x1b761a000/0x0/0x1bfc00000, data 0x43d7077/0x4474000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102703104 unmapped: 14295040 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1277197 data_alloc: 301989888 data_used: 13058048
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:30.939496+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102703104 unmapped: 14295040 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:31.939651+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.451931000s of 10.022341728s, submitted: 157
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103743488 unmapped: 13254656 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:32.939759+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103743488 unmapped: 13254656 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:33.939880+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104038400 unmapped: 12959744 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:34.940010+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104120320 unmapped: 12877824 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1288023 data_alloc: 301989888 data_used: 13897728
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:35.940139+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 109 heartbeat osd_stat(store_statfs(0x1b75e3000/0x0/0x1bfc00000, data 0x440b490/0x44aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103940096 unmapped: 13058048 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:36.940325+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 109 heartbeat osd_stat(store_statfs(0x1b75da000/0x0/0x1bfc00000, data 0x4412490/0x44b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103956480 unmapped: 13041664 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 109 heartbeat osd_stat(store_statfs(0x1b75da000/0x0/0x1bfc00000, data 0x4412490/0x44b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 109 ms_handle_reset con 0x55c53afa6c00 session 0x55c5396434a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:37.940455+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104038400 unmapped: 12959744 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:38.940579+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 110 ms_handle_reset con 0x55c537d43400 session 0x55c53ae061e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384ca000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107937792 unmapped: 9060352 heap: 116998144 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:39.940715+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 110 ms_handle_reset con 0x55c5384ca000 session 0x55c53b190b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113491968 unmapped: 27140096 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1501724 data_alloc: 301989888 data_used: 17387520
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:40.940892+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113516544 unmapped: 27115520 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:41.941042+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.724843025s of 10.069683075s, submitted: 78
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c538909c00 session 0x55c53a711a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 heartbeat osd_stat(store_statfs(0x1b5ea2000/0x0/0x1bfc00000, data 0x5b46499/0x5bea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113573888 unmapped: 27058176 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c53890c400 session 0x55c53b4cfc20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4cd800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c53b4cd800 session 0x55c5393545a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c537d43400 session 0x55c53aab12c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:42.941160+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113598464 unmapped: 27033600 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384ca000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c538909c00 session 0x55c53a67af00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:43.941288+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c53890c400 session 0x55c5389150e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b19a800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c53b19a800 session 0x55c5389154a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c5384ca000 session 0x55c53b749a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c537d43400 session 0x55c53b3c3680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106414080 unmapped: 34217984 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c538909c00 session 0x55c53b3c2b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c53890c400 session 0x55c53b3c25a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b19a800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c53b19a800 session 0x55c53b3c32c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b19ac00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c53b19ac00 session 0x55c53b3c3c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c537d43400 session 0x55c53b3c3e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:44.941432+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104267776 unmapped: 36364288 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1421648 data_alloc: 301989888 data_used: 9867264
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets getting new tickets!
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:45.941674+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _finish_auth 0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:45.942613+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104267776 unmapped: 36364288 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:46.941821+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 ms_handle_reset con 0x55c538909c00 session 0x55c53a6541e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 ms_handle_reset con 0x55c53890c400 session 0x55c53b4cf680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b19a800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 ms_handle_reset con 0x55c53b19a800 session 0x55c53ad26780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b533800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 ms_handle_reset con 0x55c53b533800 session 0x55c53b7494a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 ms_handle_reset con 0x55c537d43400 session 0x55c53a662b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104316928 unmapped: 36315136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 ms_handle_reset con 0x55c538909c00 session 0x55c53b3c3680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:47.941974+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 heartbeat osd_stat(store_statfs(0x1b6359000/0x0/0x1bfc00000, data 0x568d8c2/0x5734000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 122290176 unmapped: 18341888 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 ms_handle_reset con 0x55c53890c400 session 0x55c53b749a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 heartbeat osd_stat(store_statfs(0x1b534e000/0x0/0x1bfc00000, data 0x66998c2/0x6740000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:48.942160+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b19a800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 heartbeat osd_stat(store_statfs(0x1b534e000/0x0/0x1bfc00000, data 0x66998c2/0x6740000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111181824 unmapped: 29450240 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:49.942275+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b533800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 ms_handle_reset con 0x55c53b533800 session 0x55c53b7485a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b027000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111558656 unmapped: 29073408 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1620990 data_alloc: 301989888 data_used: 17235968
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a6400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:50.942451+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a9a1c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105938944 unmapped: 34693120 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c53b0a6400 session 0x55c53a7a8000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c537d43400 session 0x55c538527c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c538909c00 session 0x55c53b4ce1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:51.942606+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c53890c400 session 0x55c53b4cfc20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b533800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c53b533800 session 0x55c53aab0d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.617158890s of 10.149737358s, submitted: 117
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106512384 unmapped: 34119680 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c53aed7800 session 0x55c53b191680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c53a9a1c00 session 0x55c538a75c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c53b027000 session 0x55c53b191c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:52.942746+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c5384f7400 session 0x55c53ae06b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 99303424 unmapped: 41328640 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c538909c00 session 0x55c53b3c2960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c537d43400 session 0x55c53b3c3e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:53.942912+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 41279488 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:54.943122+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537d43400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c537d43400 session 0x55c53b190b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 heartbeat osd_stat(store_statfs(0x1b8acf000/0x0/0x1bfc00000, data 0x258eebb/0x2639000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 41279488 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1021417 data_alloc: 285212672 data_used: 1703936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 heartbeat osd_stat(store_statfs(0x1b8acf000/0x0/0x1bfc00000, data 0x258eebb/0x2639000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:55.943283+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 41279488 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 ms_handle_reset con 0x55c5384f7400 session 0x55c53b190960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 heartbeat osd_stat(store_statfs(0x1b8acf000/0x0/0x1bfc00000, data 0x258eebb/0x2639000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:56.943432+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 ms_handle_reset con 0x55c538909c00 session 0x55c53b7492c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a9a1c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 ms_handle_reset con 0x55c53a9a1c00 session 0x55c53b749860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98893824 unmapped: 41738240 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:57.943597+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98893824 unmapped: 41738240 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:58.943764+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98893824 unmapped: 41738240 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:59.943938+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1029819 data_alloc: 285212672 data_used: 1716224
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98893824 unmapped: 41738240 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:00.944138+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b027000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b9425000/0x0/0x1bfc00000, data 0x25bb2e4/0x2668000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98902016 unmapped: 41730048 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:01.944307+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98902016 unmapped: 41730048 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:02.944448+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b9425000/0x0/0x1bfc00000, data 0x25bb2e4/0x2668000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98902016 unmapped: 41730048 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:03.944609+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98902016 unmapped: 41730048 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:04.944768+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1041179 data_alloc: 285212672 data_used: 3330048
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98910208 unmapped: 41721856 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:05.944928+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 ms_handle_reset con 0x55c53b19a800 session 0x55c53b3c30e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98910208 unmapped: 41721856 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b9425000/0x0/0x1bfc00000, data 0x25bb2e4/0x2668000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:06.945087+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98910208 unmapped: 41721856 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:07.945281+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 98910208 unmapped: 41721856 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:08.945425+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 16.376327515s of 16.738838196s, submitted: 102
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101564416 unmapped: 39067648 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:09.945555+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b9271000/0x0/0x1bfc00000, data 0x27702e4/0x281d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1097327 data_alloc: 285212672 data_used: 3518464
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102432768 unmapped: 38199296 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:10.945735+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103653376 unmapped: 36978688 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b8cfc000/0x0/0x1bfc00000, data 0x2ce42e4/0x2d91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:11.945886+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102580224 unmapped: 38051840 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:12.946051+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102596608 unmapped: 38035456 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:13.946249+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b8cf8000/0x0/0x1bfc00000, data 0x2ce82e4/0x2d95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102596608 unmapped: 38035456 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:14.946509+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1116973 data_alloc: 285212672 data_used: 4050944
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102596608 unmapped: 38035456 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:15.946713+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102596608 unmapped: 38035456 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:16.946906+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 ms_handle_reset con 0x55c53b027000 session 0x55c5393732c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102596608 unmapped: 38035456 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:17.947030+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 ms_handle_reset con 0x55c53890c400 session 0x55c539643a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 ms_handle_reset con 0x55c53aed6c00 session 0x55c53ad270e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100179968 unmapped: 40452096 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b93e7000/0x0/0x1bfc00000, data 0x25fb272/0x26a6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:18.947311+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed8c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.679263115s of 10.029356956s, submitted: 81
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 ms_handle_reset con 0x55c53aed8c00 session 0x55c538527860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:19.947474+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:20.947669+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:21.947890+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:22.948059+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:23.948182+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:24.948341+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:25.948559+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:26.948726+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:27.990552+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:28.990817+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:29.991058+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:30.991315+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:31.991515+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:32.991710+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:33.991869+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:34.992042+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:35.992419+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:36.992544+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:37.992729+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:38.992897+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:39.993120+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:40.993350+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:41.993495+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:42.993678+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:43.993843+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:44.993962+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:45.994120+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:46.994284+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:47.994423+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:48.994574+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:49.994719+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:50.994926+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:51.995107+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:52.995267+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:53.995438+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:54.995623+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:55.995756+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:56.995920+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:57.996089+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:58.996264+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:59.996382+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:00.996587+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:01.996759+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:02.996946+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:03.997298+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:04.997503+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:05.997678+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:06.997825+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:07.997950+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:08.998160+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:09.998283+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:10.998465+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:11.998611+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:12.998740+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:13.998919+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:14.999056+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100220928 unmapped: 40411136 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:15.999234+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100237312 unmapped: 40394752 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:16.999373+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100237312 unmapped: 40394752 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:17.999560+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100237312 unmapped: 40394752 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:18.999699+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100237312 unmapped: 40394752 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:19.999826+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b987c000/0x0/0x1bfc00000, data 0x216922f/0x2211000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 999277 data_alloc: 285212672 data_used: 1708032
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100237312 unmapped: 40394752 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:21.000023+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 62.637428284s of 62.701454163s, submitted: 17
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100245504 unmapped: 40386560 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:22.000155+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b026800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 116 ms_handle_reset con 0x55c53b026800 session 0x55c538526f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 100278272 unmapped: 40353792 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:23.000311+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 117 ms_handle_reset con 0x55c53aed6c00 session 0x55c53afd1e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 117 ms_handle_reset con 0x55c53890c400 session 0x55c53b7483c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed8c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101351424 unmapped: 39280640 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:24.000474+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b027000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 118 ms_handle_reset con 0x55c53aed8c00 session 0x55c53a174000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101400576 unmapped: 39231488 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:25.000625+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 118 handle_osd_map epochs [118,119], i have 118, src has [1,119]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 119 heartbeat osd_stat(store_statfs(0x1b9866000/0x0/0x1bfc00000, data 0x2172153/0x2225000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 119 ms_handle_reset con 0x55c53b027000 session 0x55c53b544f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1029679 data_alloc: 285212672 data_used: 1720320
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101548032 unmapped: 39084032 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:26.000765+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101588992 unmapped: 39043072 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:27.000922+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101638144 unmapped: 38993920 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:28.001095+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed9c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101670912 unmapped: 38961152 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:29.001264+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 121 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 122 ms_handle_reset con 0x55c53aed9c00 session 0x55c53bb5be00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101703680 unmapped: 38928384 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:30.001430+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1049782 data_alloc: 285212672 data_used: 1724416
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101703680 unmapped: 38928384 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:31.001588+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 122 heartbeat osd_stat(store_statfs(0x1b9853000/0x0/0x1bfc00000, data 0x217af63/0x2239000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101711872 unmapped: 38920192 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:32.001805+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 122 handle_osd_map epochs [122,123], i have 122, src has [1,123]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.863801003s of 10.345592499s, submitted: 130
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 123 ms_handle_reset con 0x55c53890c400 session 0x55c53bb5b860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b9856000/0x0/0x1bfc00000, data 0x217af40/0x2238000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101752832 unmapped: 38879232 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:33.001955+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101818368 unmapped: 38813696 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:34.002110+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 124 ms_handle_reset con 0x55c53aed6c00 session 0x55c53bb5af00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed8c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 124 ms_handle_reset con 0x55c53aed8c00 session 0x55c53a1843c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b027000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 124 ms_handle_reset con 0x55c53b027000 session 0x55c53b2ea3c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101851136 unmapped: 38780928 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:35.002268+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 124 ms_handle_reset con 0x55c53aed6400 session 0x55c5389a7860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 124 heartbeat osd_stat(store_statfs(0x1b944e000/0x0/0x1bfc00000, data 0x217fe18/0x223e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1053794 data_alloc: 285212672 data_used: 1736704
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 101892096 unmapped: 38739968 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:36.002382+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 125 ms_handle_reset con 0x55c53890c400 session 0x55c5389174a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 125 ms_handle_reset con 0x55c53aed6c00 session 0x55c53a5fcb40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed8c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 125 heartbeat osd_stat(store_statfs(0x1b944f000/0x0/0x1bfc00000, data 0x217fdf5/0x223d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 125 ms_handle_reset con 0x55c53aed8c00 session 0x55c538914780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b027000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102023168 unmapped: 38608896 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:37.002695+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b944a000/0x0/0x1bfc00000, data 0x218239f/0x2241000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:38.002862+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102154240 unmapped: 38477824 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b944a000/0x0/0x1bfc00000, data 0x2183e8e/0x2243000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 127 ms_handle_reset con 0x55c53b027000 session 0x55c5389a7e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 127 ms_handle_reset con 0x55c5384f7800 session 0x55c53b545c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b944a000/0x0/0x1bfc00000, data 0x2183e8e/0x2243000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:39.002992+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102211584 unmapped: 38420480 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 128 ms_handle_reset con 0x55c5384f7800 session 0x55c53b544960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 128 ms_handle_reset con 0x55c53aed6000 session 0x55c5393550e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:40.003110+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102285312 unmapped: 38346752 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 129 ms_handle_reset con 0x55c53890c400 session 0x55c53aab10e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1062407 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:41.003298+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102375424 unmapped: 38256640 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:42.003481+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102375424 unmapped: 38256640 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.769134521s of 10.704959869s, submitted: 308
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:43.003639+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102416384 unmapped: 38215680 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:44.003858+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 130 heartbeat osd_stat(store_statfs(0x1b943f000/0x0/0x1bfc00000, data 0x218c3ba/0x224d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102457344 unmapped: 38174720 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:45.003966+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102457344 unmapped: 38174720 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:46.004131+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1066081 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102457344 unmapped: 38174720 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 130 heartbeat osd_stat(store_statfs(0x1b943f000/0x0/0x1bfc00000, data 0x218c3ba/0x224d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:47.004288+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102457344 unmapped: 38174720 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:48.004445+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:49.004601+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:50.004717+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:51.004913+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1068411 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:52.005074+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 131 heartbeat osd_stat(store_statfs(0x1b943c000/0x0/0x1bfc00000, data 0x218e813/0x2251000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:53.005285+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:54.005456+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:55.005679+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:56.005915+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1068411 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 131 heartbeat osd_stat(store_statfs(0x1b943c000/0x0/0x1bfc00000, data 0x218e813/0x2251000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:57.006175+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 131 heartbeat osd_stat(store_statfs(0x1b943c000/0x0/0x1bfc00000, data 0x218e813/0x2251000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:58.006468+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:59.006659+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:00.006816+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:01.007005+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1068411 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:02.007152+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:03.007389+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 131 heartbeat osd_stat(store_statfs(0x1b943c000/0x0/0x1bfc00000, data 0x218e813/0x2251000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:04.007582+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102498304 unmapped: 38133760 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:05.007708+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102506496 unmapped: 38125568 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:06.007856+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1068411 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102506496 unmapped: 38125568 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 131 heartbeat osd_stat(store_statfs(0x1b943c000/0x0/0x1bfc00000, data 0x218e813/0x2251000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 23.459136963s of 23.491872787s, submitted: 35
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:07.008045+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102506496 unmapped: 38125568 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 133 handle_osd_map epochs [132,133], i have 133, src has [1,133]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:08.008268+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102531072 unmapped: 38100992 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:09.008844+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102531072 unmapped: 38100992 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:10.009072+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102539264 unmapped: 38092800 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:11.009309+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1081287 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102539264 unmapped: 38092800 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:12.010094+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 135 heartbeat osd_stat(store_statfs(0x1b942b000/0x0/0x1bfc00000, data 0x2197dc6/0x2262000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 135 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 102637568 unmapped: 37994496 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 136 ms_handle_reset con 0x55c53aed6c00 session 0x55c53a7110e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:13.010342+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103686144 unmapped: 36945920 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 136 heartbeat osd_stat(store_statfs(0x1b9426000/0x0/0x1bfc00000, data 0x219a1fb/0x2266000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:14.010725+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103686144 unmapped: 36945920 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:15.011063+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103686144 unmapped: 36945920 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 136 heartbeat osd_stat(store_statfs(0x1b9426000/0x0/0x1bfc00000, data 0x219a1fb/0x2266000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:16.011261+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1087115 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103686144 unmapped: 36945920 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:17.011837+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.693208694s of 10.844986916s, submitted: 55
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103727104 unmapped: 36904960 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 137 heartbeat osd_stat(store_statfs(0x1b9423000/0x0/0x1bfc00000, data 0x219c614/0x226a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:18.012044+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103727104 unmapped: 36904960 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 137 heartbeat osd_stat(store_statfs(0x1b9423000/0x0/0x1bfc00000, data 0x219c614/0x226a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:19.012459+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103727104 unmapped: 36904960 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:20.012828+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103727104 unmapped: 36904960 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:21.013164+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1089445 data_alloc: 285212672 data_used: 1748992
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103727104 unmapped: 36904960 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:22.013556+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103727104 unmapped: 36904960 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:23.013756+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed8c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103759872 unmapped: 36872192 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 138 ms_handle_reset con 0x55c53aed8c00 session 0x55c539707860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:24.014067+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 138 heartbeat osd_stat(store_statfs(0x1b941e000/0x0/0x1bfc00000, data 0x219eba9/0x226f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103759872 unmapped: 36872192 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 138 ms_handle_reset con 0x55c53890c400 session 0x55c53afd03c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 138 ms_handle_reset con 0x55c5384f7800 session 0x55c53a6854a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:25.014305+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103768064 unmapped: 36864000 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:26.014503+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1099360 data_alloc: 285212672 data_used: 1761280
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 138 ms_handle_reset con 0x55c53aed6000 session 0x55c53aab03c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103768064 unmapped: 36864000 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 138 ms_handle_reset con 0x55c53aed6c00 session 0x55c53b191c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:27.014669+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103768064 unmapped: 36864000 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.115886688s of 10.226550102s, submitted: 33
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b027000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:28.014874+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103768064 unmapped: 36864000 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 138 heartbeat osd_stat(store_statfs(0x1b9420000/0x0/0x1bfc00000, data 0x219eb47/0x226e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 ms_handle_reset con 0x55c53b027000 session 0x55c53afd12c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 ms_handle_reset con 0x55c5384f7800 session 0x55c53b0f2d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:29.015134+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103809024 unmapped: 36823040 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:30.015373+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103833600 unmapped: 36798464 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 ms_handle_reset con 0x55c53890c400 session 0x55c53b1910e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:31.015670+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1102674 data_alloc: 285212672 data_used: 1773568
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103792640 unmapped: 36839424 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 ms_handle_reset con 0x55c53aed6000 session 0x55c53b748000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 ms_handle_reset con 0x55c53aed6c00 session 0x55c53b3c2d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:32.016167+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103849984 unmapped: 36782080 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 heartbeat osd_stat(store_statfs(0x1b941c000/0x0/0x1bfc00000, data 0x21a10ce/0x2272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6f800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 ms_handle_reset con 0x55c53ab6f800 session 0x55c5389a61e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:33.016423+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103866368 unmapped: 36765696 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:34.016728+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103866368 unmapped: 36765696 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 ms_handle_reset con 0x55c5384f7800 session 0x55c53a5fd860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:35.017004+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103858176 unmapped: 36773888 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:36.017230+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1110153 data_alloc: 285212672 data_used: 1773568
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103858176 unmapped: 36773888 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 heartbeat osd_stat(store_statfs(0x1b9418000/0x0/0x1bfc00000, data 0x21a11b4/0x2276000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:37.017434+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103866368 unmapped: 36765696 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.802186966s of 10.080909729s, submitted: 75
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:38.017600+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 140 heartbeat osd_stat(store_statfs(0x1b9413000/0x0/0x1bfc00000, data 0x21a35cd/0x227a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103866368 unmapped: 36765696 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:39.017764+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103866368 unmapped: 36765696 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:40.017981+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103849984 unmapped: 36782080 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:41.019022+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1114496 data_alloc: 285212672 data_used: 1789952
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 141 ms_handle_reset con 0x55c53890c400 session 0x55c53ad261e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103915520 unmapped: 36716544 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:42.019251+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 141 ms_handle_reset con 0x55c53aed6c00 session 0x55c53aab1a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103923712 unmapped: 36708352 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 142 ms_handle_reset con 0x55c53aed6000 session 0x55c53b4ce5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:43.019465+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 142 ms_handle_reset con 0x55c53ab6e000 session 0x55c53a67bc20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103923712 unmapped: 36708352 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:44.019740+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b940a000/0x0/0x1bfc00000, data 0x21a8033/0x2282000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 142 ms_handle_reset con 0x55c53890c400 session 0x55c53b190960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 103981056 unmapped: 36651008 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 143 ms_handle_reset con 0x55c5384f7800 session 0x55c53ae07680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 143 heartbeat osd_stat(store_statfs(0x1b940c000/0x0/0x1bfc00000, data 0x21a8033/0x2282000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 143 ms_handle_reset con 0x55c53ab6e000 session 0x55c539707860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:45.020031+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104005632 unmapped: 36626432 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 144 ms_handle_reset con 0x55c53aed6000 session 0x55c53a182f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 144 ms_handle_reset con 0x55c53aed6c00 session 0x55c53b191e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:46.020299+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1135762 data_alloc: 285212672 data_used: 1806336
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104087552 unmapped: 36544512 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c53890c400 session 0x55c53a5fcb40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c5384f7800 session 0x55c538a745a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 heartbeat osd_stat(store_statfs(0x1b93ff000/0x0/0x1bfc00000, data 0x21acf60/0x228d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c53ab6e000 session 0x55c539370b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b027c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:47.020475+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c53aed6000 session 0x55c5389174a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c53b027c00 session 0x55c53afd14a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104185856 unmapped: 36446208 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.869979858s of 10.507114410s, submitted: 189
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:48.020780+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c5384f7800 session 0x55c53bb5a000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104185856 unmapped: 36446208 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:49.020950+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 heartbeat osd_stat(store_statfs(0x1b93fe000/0x0/0x1bfc00000, data 0x21af44f/0x228f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104194048 unmapped: 36438016 heap: 140632064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:50.021116+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c53a99fc00 session 0x55c53a7103c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c53890c400 session 0x55c5393732c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104251392 unmapped: 53166080 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 ms_handle_reset con 0x55c53aed6000 session 0x55c53bb5ba40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:51.021385+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1244249 data_alloc: 285212672 data_used: 1798144
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104251392 unmapped: 53166080 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:52.021545+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104210432 unmapped: 53207040 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6f000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:53.021747+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 147 ms_handle_reset con 0x55c53ab6f000 session 0x55c53ad26b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104243200 unmapped: 53174272 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 147 ms_handle_reset con 0x55c53ab6e000 session 0x55c53bb5be00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:54.021883+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104251392 unmapped: 53166080 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 148 ms_handle_reset con 0x55c5384f7800 session 0x55c53bb5bc20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 149 handle_osd_map epochs [148,149], i have 149, src has [1,149]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 149 ms_handle_reset con 0x55c53890c400 session 0x55c53b4cef00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:55.022169+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 149 heartbeat osd_stat(store_statfs(0x1b6bf8000/0x0/0x1bfc00000, data 0x49b3d44/0x4a95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104275968 unmapped: 53141504 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 150 ms_handle_reset con 0x55c53a99fc00 session 0x55c53b4cf860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:56.022407+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1156051 data_alloc: 285212672 data_used: 1810432
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 104341504 unmapped: 53075968 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:57.022584+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105398272 unmapped: 52019200 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:58.022722+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 152 handle_osd_map epochs [151,152], i have 152, src has [1,152]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.860943794s of 10.461911201s, submitted: 152
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105431040 unmapped: 51986432 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 152 ms_handle_reset con 0x55c53aed6000 session 0x55c53a7a9a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:59.022969+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 152 heartbeat osd_stat(store_statfs(0x1b93e5000/0x0/0x1bfc00000, data 0x21bf64f/0x22a6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105480192 unmapped: 51937280 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:00.023172+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105480192 unmapped: 51937280 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:01.023428+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1162679 data_alloc: 285212672 data_used: 1810432
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105480192 unmapped: 51937280 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:02.023615+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105553920 unmapped: 51863552 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:03.023813+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105553920 unmapped: 51863552 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:04.024080+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105553920 unmapped: 51863552 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 153 heartbeat osd_stat(store_statfs(0x1b93e3000/0x0/0x1bfc00000, data 0x21c1ac0/0x22aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:05.024246+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105553920 unmapped: 51863552 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:06.024410+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1164337 data_alloc: 285212672 data_used: 1810432
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105553920 unmapped: 51863552 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:07.024619+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105553920 unmapped: 51863552 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:08.024808+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105578496 unmapped: 51838976 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.432126045s of 10.487936974s, submitted: 37
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 154 ms_handle_reset con 0x55c5384f7800 session 0x55c53a1741e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:09.024959+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105578496 unmapped: 51838976 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:10.025131+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 154 heartbeat osd_stat(store_statfs(0x1b93dd000/0x0/0x1bfc00000, data 0x21c4065/0x22b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105594880 unmapped: 51822592 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 154 ms_handle_reset con 0x55c53890c400 session 0x55c53a6630e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:11.025329+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1173895 data_alloc: 285212672 data_used: 1810432
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 155 ms_handle_reset con 0x55c53a99fc00 session 0x55c53b545860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105594880 unmapped: 51822592 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 155 ms_handle_reset con 0x55c53ab6e000 session 0x55c53aab12c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:12.025461+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 156 ms_handle_reset con 0x55c53aed6000 session 0x55c53a5fd680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105611264 unmapped: 51806208 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 156 ms_handle_reset con 0x55c5384f7800 session 0x55c53b4cfa40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:13.025633+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 156 ms_handle_reset con 0x55c53890c400 session 0x55c53ae07860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105611264 unmapped: 51806208 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:14.026007+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105611264 unmapped: 51806208 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:15.026576+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105611264 unmapped: 51806208 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:16.027090+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 156 heartbeat osd_stat(store_statfs(0x1b93d6000/0x0/0x1bfc00000, data 0x21c8b11/0x22b6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1174947 data_alloc: 285212672 data_used: 1810432
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105611264 unmapped: 51806208 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:17.027555+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d6000/0x0/0x1bfc00000, data 0x21c8b11/0x22b6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105611264 unmapped: 51806208 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:18.027810+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105619456 unmapped: 51798016 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:19.027989+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d3000/0x0/0x1bfc00000, data 0x21caf4a/0x22ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 47
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.332359314s of 10.456606865s, submitted: 67
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105922560 unmapped: 51494912 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c53a99fc00 session 0x55c539371860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d3000/0x0/0x1bfc00000, data 0x21caf4a/0x22ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:20.028364+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105922560 unmapped: 51494912 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:21.028587+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1177277 data_alloc: 285212672 data_used: 1810432
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105873408 unmapped: 51544064 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:22.028729+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105873408 unmapped: 51544064 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d3000/0x0/0x1bfc00000, data 0x21caf4a/0x22ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:23.029018+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105873408 unmapped: 51544064 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:24.029273+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d3000/0x0/0x1bfc00000, data 0x21caf4a/0x22ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105873408 unmapped: 51544064 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:25.029509+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 48
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106127360 unmapped: 51290112 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d4000/0x0/0x1bfc00000, data 0x21caf4a/0x22ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:26.029726+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1176397 data_alloc: 285212672 data_used: 1810432
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106127360 unmapped: 51290112 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:27.029975+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106143744 unmapped: 51273728 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:28.030140+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106143744 unmapped: 51273728 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d3000/0x0/0x1bfc00000, data 0x21cb014/0x22bb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:29.030305+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106143744 unmapped: 51273728 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d3000/0x0/0x1bfc00000, data 0x21cb014/0x22bb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:30.030567+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.068720818s of 11.111158371s, submitted: 12
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c53ab6e000 session 0x55c53afd1860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106160128 unmapped: 51257344 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d1000/0x0/0x1bfc00000, data 0x21cb086/0x22bd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:31.030752+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1184535 data_alloc: 285212672 data_used: 1810432
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106160128 unmapped: 51257344 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:32.030928+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106160128 unmapped: 51257344 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b93d1000/0x0/0x1bfc00000, data 0x21cb086/0x22bd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a6800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:33.031227+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c53b0a6800 session 0x55c53a67ad20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c5384f7c00 session 0x55c53afd1e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c5384f7800 session 0x55c5393545a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113246208 unmapped: 44171264 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c53890c400 session 0x55c53a6b3c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:34.031402+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c53a99fc00 session 0x55c538914f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c53ab6e000 session 0x55c53b1910e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106414080 unmapped: 51003392 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 ms_handle_reset con 0x55c53ab6e000 session 0x55c53b748d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:35.031609+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105562112 unmapped: 51855360 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 157 handle_osd_map epochs [157,158], i have 157, src has [1,158]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:36.031814+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 ms_handle_reset con 0x55c5384f7800 session 0x55c53a60e3c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1288553 data_alloc: 285212672 data_used: 1822720
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105594880 unmapped: 51822592 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:37.032012+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 ms_handle_reset con 0x55c5384f7c00 session 0x55c53880c960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105594880 unmapped: 51822592 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:38.032316+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105611264 unmapped: 51806208 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 heartbeat osd_stat(store_statfs(0x1b81d7000/0x0/0x1bfc00000, data 0x2cf567d/0x2deb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:39.032540+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 heartbeat osd_stat(store_statfs(0x1b81d7000/0x0/0x1bfc00000, data 0x2cf567d/0x2deb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890c400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 ms_handle_reset con 0x55c53a99fc00 session 0x55c53b190960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105635840 unmapped: 51781632 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 159 ms_handle_reset con 0x55c53890c400 session 0x55c538a74000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:40.032782+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.246817589s of 10.016303062s, submitted: 180
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 159 ms_handle_reset con 0x55c5384f7800 session 0x55c53bb5a1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105652224 unmapped: 51765248 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:41.033009+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1290229 data_alloc: 285212672 data_used: 1835008
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105668608 unmapped: 51748864 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 159 ms_handle_reset con 0x55c5384f7c00 session 0x55c53b0f23c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:42.033256+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105668608 unmapped: 51748864 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 160 ms_handle_reset con 0x55c53a99fc00 session 0x55c53d89e780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:43.033405+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105693184 unmapped: 51724288 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 handle_osd_map epochs [160,161], i have 161, src has [1,161]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 handle_osd_map epochs [160,161], i have 161, src has [1,161]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:44.033530+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4f3000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c53b4f3000 session 0x55c53a182f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c53ab6e000 session 0x55c53b2eaf00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 heartbeat osd_stat(store_statfs(0x1b889b000/0x0/0x1bfc00000, data 0x2cfa167/0x2df2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c5384f7800 session 0x55c53b3c2000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105717760 unmapped: 51699712 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:45.033673+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c5384f7c00 session 0x55c53880d2c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c53a99fc00 session 0x55c53b3c3a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105750528 unmapped: 51666944 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:46.033872+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c53ab6e000 session 0x55c53a5fc5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1301257 data_alloc: 285212672 data_used: 1851392
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:47.034039+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4f3000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:48.034178+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c53b4f3000 session 0x55c538916f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:49.034414+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c5384f7800 session 0x55c53aab05a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:50.034568+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 heartbeat osd_stat(store_statfs(0x1b8898000/0x0/0x1bfc00000, data 0x2cfc624/0x2df6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c5384f7c00 session 0x55c5389a7e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.004096985s of 10.307150841s, submitted: 78
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c53a99fc00 session 0x55c53bb5b680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:51.034772+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c53ab6e000 session 0x55c53880de00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1302445 data_alloc: 285212672 data_used: 1851392
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c53a99e000 session 0x55c53aab1c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:52.034908+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 ms_handle_reset con 0x55c5384f7800 session 0x55c53afd0000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105799680 unmapped: 51617792 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 162 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a6b32c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:53.035062+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 162 ms_handle_reset con 0x55c53a99e000 session 0x55c53bb5a5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105799680 unmapped: 51617792 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 162 ms_handle_reset con 0x55c53a99fc00 session 0x55c538526f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:54.035274+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105799680 unmapped: 51617792 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:55.035421+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c53ab6e000 session 0x55c53b5450e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888e000/0x0/0x1bfc00000, data 0x2d00fd2/0x2dff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:56.035585+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1311820 data_alloc: 285212672 data_used: 1875968
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:57.035748+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c5384f7800 session 0x55c53bb5b0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:58.035925+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888f000/0x0/0x1bfc00000, data 0x2d00fd2/0x2dff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c5384f7c00 session 0x55c538a752c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c53a99e000 session 0x55c53bb5ad20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:59.036036+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888f000/0x0/0x1bfc00000, data 0x2d00fd2/0x2dff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c53a99fc00 session 0x55c53b0f34a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a715000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c53a715000 session 0x55c53b190f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:00.036309+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105865216 unmapped: 51552256 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:01.036525+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888e000/0x0/0x1bfc00000, data 0x2d00fe3/0x2e00000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1312768 data_alloc: 285212672 data_used: 1880064
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105865216 unmapped: 51552256 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:02.036700+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105865216 unmapped: 51552256 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:03.036845+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.568959236s of 12.825785637s, submitted: 68
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c5384f7800 session 0x55c53a685860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105889792 unmapped: 51527680 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:04.037019+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105889792 unmapped: 51527680 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:05.037164+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105889792 unmapped: 51527680 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c5384f7c00 session 0x55c53aab14a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888d000/0x0/0x1bfc00000, data 0x2d00ff3/0x2e01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [1,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:06.037277+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c53a99e000 session 0x55c53a7a9860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1330203 data_alloc: 285212672 data_used: 1880064
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115056640 unmapped: 42360832 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c53a99fc00 session 0x55c5389143c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f6000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:07.037394+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c5384f6000 session 0x55c539355e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:08.037543+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b7e14000/0x0/0x1bfc00000, data 0x3779159/0x387a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:09.037649+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b7e14000/0x0/0x1bfc00000, data 0x3779159/0x387a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:10.037779+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c5384f7800 session 0x55c53a664780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105783296 unmapped: 51634176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:11.037974+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a6b2000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1402729 data_alloc: 285212672 data_used: 1884160
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105816064 unmapped: 51601408 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:12.038179+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 ms_handle_reset con 0x55c53a99e000 session 0x55c538916000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:13.038335+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.996505737s of 10.309669495s, submitted: 63
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:14.038491+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888a000/0x0/0x1bfc00000, data 0x2d01213/0x2e03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105832448 unmapped: 51585024 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:15.038643+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105832448 unmapped: 51585024 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:16.038828+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888d000/0x0/0x1bfc00000, data 0x2d01277/0x2e01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1324992 data_alloc: 285212672 data_used: 1880064
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105832448 unmapped: 51585024 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:17.038966+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105832448 unmapped: 51585024 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:18.039110+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105799680 unmapped: 51617792 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:19.039281+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105799680 unmapped: 51617792 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:20.039452+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105799680 unmapped: 51617792 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:21.039636+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888e000/0x0/0x1bfc00000, data 0x2d012a6/0x2e00000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1324302 data_alloc: 285212672 data_used: 1880064
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105807872 unmapped: 51609600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:22.039829+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105807872 unmapped: 51609600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:23.040001+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105807872 unmapped: 51609600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:24.040172+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105807872 unmapped: 51609600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:25.040339+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105807872 unmapped: 51609600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:26.040516+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1324302 data_alloc: 285212672 data_used: 1880064
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105807872 unmapped: 51609600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888e000/0x0/0x1bfc00000, data 0x2d012a6/0x2e00000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:27.040620+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105807872 unmapped: 51609600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:28.040774+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105807872 unmapped: 51609600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 15.074279785s of 15.129702568s, submitted: 12
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:29.040914+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:30.041075+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:31.041247+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1325702 data_alloc: 285212672 data_used: 1880064
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:32.041945+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888d000/0x0/0x1bfc00000, data 0x2d01341/0x2e01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:33.042101+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:34.042288+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:35.042458+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:36.042614+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1327294 data_alloc: 285212672 data_used: 1880064
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:37.042779+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105824256 unmapped: 51593216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:38.042955+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 heartbeat osd_stat(store_statfs(0x1b888c000/0x0/0x1bfc00000, data 0x2d013dc/0x2e02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105832448 unmapped: 51585024 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:39.043116+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105840640 unmapped: 51576832 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:40.043261+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.403656006s of 11.445689201s, submitted: 9
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105848832 unmapped: 51568640 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:41.043466+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4ccc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 164 ms_handle_reset con 0x55c53b4ccc00 session 0x55c538526000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332814 data_alloc: 285212672 data_used: 1892352
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105857024 unmapped: 51560448 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:42.043597+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 105865216 unmapped: 51552256 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:43.043731+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 165 ms_handle_reset con 0x55c53890d800 session 0x55c538915a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.2 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 38K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2729 syncs, 3.70 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4714 writes, 14K keys, 4714 commit groups, 1.0 writes per commit group, ingest: 11.91 MB, 0.02 MB/s
                                                          Interval WAL: 4714 writes, 2042 syncs, 2.31 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106930176 unmapped: 50487296 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:44.043870+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 165 heartbeat osd_stat(store_statfs(0x1b8882000/0x0/0x1bfc00000, data 0x2d060f6/0x2e0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106930176 unmapped: 50487296 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:45.044009+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 166 ms_handle_reset con 0x55c5384f7800 session 0x55c53a7a8f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106930176 unmapped: 50487296 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:46.044172+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 166 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a67a1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1343254 data_alloc: 285212672 data_used: 1916928
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106946560 unmapped: 50470912 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:47.044374+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 167 ms_handle_reset con 0x55c53890d800 session 0x55c53880c5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106987520 unmapped: 50429952 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:48.044552+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106995712 unmapped: 50421760 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:49.044933+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 168 ms_handle_reset con 0x55c53a99e000 session 0x55c53a67b4a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4ccc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107003904 unmapped: 50413568 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 168 ms_handle_reset con 0x55c53b4ccc00 session 0x55c53880cf00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:50.045129+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 169 ms_handle_reset con 0x55c5384f7800 session 0x55c53b3c21e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.540680885s of 10.003373146s, submitted: 154
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 169 heartbeat osd_stat(store_statfs(0x1b8473000/0x0/0x1bfc00000, data 0x2d0f719/0x2e1a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107020288 unmapped: 50397184 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:51.045371+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 169 heartbeat osd_stat(store_statfs(0x1b8473000/0x0/0x1bfc00000, data 0x2d0f719/0x2e1a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 170 ms_handle_reset con 0x55c5384f7c00 session 0x55c53d89f860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 170 heartbeat osd_stat(store_statfs(0x1b846e000/0x0/0x1bfc00000, data 0x2d11c78/0x2e1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1355165 data_alloc: 285212672 data_used: 1929216
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107044864 unmapped: 50372608 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:52.046016+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 171 ms_handle_reset con 0x55c53890d800 session 0x55c53d89e1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 171 heartbeat osd_stat(store_statfs(0x1b846a000/0x0/0x1bfc00000, data 0x2d140dc/0x2e21000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106708992 unmapped: 50708480 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:53.050667+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 172 ms_handle_reset con 0x55c53a99e000 session 0x55c53a1843c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106717184 unmapped: 50700288 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:54.050846+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106717184 unmapped: 50700288 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:55.051235+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b56f800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 173 ms_handle_reset con 0x55c53b56f800 session 0x55c53a67a000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 106741760 unmapped: 50675712 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:56.051455+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 173 ms_handle_reset con 0x55c5384f7800 session 0x55c53880d860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1363744 data_alloc: 285212672 data_used: 1929216
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107806720 unmapped: 49610752 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:57.051643+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 174 ms_handle_reset con 0x55c5384f7c00 session 0x55c538526d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107831296 unmapped: 49586176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:58.051817+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 174 heartbeat osd_stat(store_statfs(0x1b845f000/0x0/0x1bfc00000, data 0x2d1b23c/0x2e2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107831296 unmapped: 49586176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 174 ms_handle_reset con 0x55c53890d800 session 0x55c53b190960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:59.051931+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 175 ms_handle_reset con 0x55c53a99e000 session 0x55c53b1905a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107839488 unmapped: 49577984 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:00.052151+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 175 ms_handle_reset con 0x55c53aed7c00 session 0x55c5389a6f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.530163765s of 10.002629280s, submitted: 148
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 175 ms_handle_reset con 0x55c5384f7800 session 0x55c53b2ea000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107872256 unmapped: 49545216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:01.052417+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 175 heartbeat osd_stat(store_statfs(0x1b845a000/0x0/0x1bfc00000, data 0x2d1d948/0x2e33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1372222 data_alloc: 285212672 data_used: 1953792
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107872256 unmapped: 49545216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 175 heartbeat osd_stat(store_statfs(0x1b845a000/0x0/0x1bfc00000, data 0x2d1d948/0x2e33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:02.052632+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 175 handle_osd_map epochs [175,176], i have 175, src has [1,176]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107872256 unmapped: 49545216 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:03.052875+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107905024 unmapped: 49512448 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:04.053125+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 176 heartbeat osd_stat(store_statfs(0x1b8453000/0x0/0x1bfc00000, data 0x2d200b9/0x2e3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 107905024 unmapped: 49512448 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:05.053287+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 49
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108093440 unmapped: 49324032 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:06.053447+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1381468 data_alloc: 285212672 data_used: 1953792
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 176 ms_handle_reset con 0x55c53890d800 session 0x55c53b544960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108093440 unmapped: 49324032 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:07.053636+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108093440 unmapped: 49324032 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:08.053859+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 177 heartbeat osd_stat(store_statfs(0x1b8451000/0x0/0x1bfc00000, data 0x2d2257d/0x2e3c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108126208 unmapped: 49291264 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 177 ms_handle_reset con 0x55c53a99e000 session 0x55c5389163c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:09.054030+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108126208 unmapped: 49291264 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:10.054216+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.721705437s of 10.000925064s, submitted: 82
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108142592 unmapped: 49274880 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:11.054492+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1387299 data_alloc: 285212672 data_used: 1978368
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:12.054696+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108150784 unmapped: 49266688 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:13.054895+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108150784 unmapped: 49266688 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:14.055125+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108150784 unmapped: 49266688 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 178 heartbeat osd_stat(store_statfs(0x1b844e000/0x0/0x1bfc00000, data 0x2d24d0c/0x2e3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:15.055329+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108150784 unmapped: 49266688 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 178 heartbeat osd_stat(store_statfs(0x1b844e000/0x0/0x1bfc00000, data 0x2d24da7/0x2e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:16.055487+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108150784 unmapped: 49266688 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 178 heartbeat osd_stat(store_statfs(0x1b844e000/0x0/0x1bfc00000, data 0x2d24da7/0x2e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1389501 data_alloc: 285212672 data_used: 1978368
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:17.055650+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108183552 unmapped: 49233920 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b026c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:18.055795+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108183552 unmapped: 49233920 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53b026c00 session 0x55c53b4cfc20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b844a000/0x0/0x1bfc00000, data 0x2d2720f/0x2e43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:19.055989+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108191744 unmapped: 49225728 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:20.056140+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108191744 unmapped: 49225728 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b844a000/0x0/0x1bfc00000, data 0x2d2720f/0x2e43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.268141747s of 10.467493057s, submitted: 81
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:21.056347+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108216320 unmapped: 49201152 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1391061 data_alloc: 285212672 data_used: 1990656
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:22.056504+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108216320 unmapped: 49201152 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:23.056738+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108216320 unmapped: 49201152 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:24.056888+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108216320 unmapped: 49201152 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:25.057049+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108216320 unmapped: 49201152 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:26.057246+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108232704 unmapped: 49184768 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b844c000/0x0/0x1bfc00000, data 0x2d2723e/0x2e42000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538b96800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c538b96800 session 0x55c539354f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1390371 data_alloc: 285212672 data_used: 1990656
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:27.057437+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108232704 unmapped: 49184768 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:28.057641+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b844d000/0x0/0x1bfc00000, data 0x2d2726d/0x2e41000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108232704 unmapped: 49184768 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c5384f7800 session 0x55c53aab03c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:29.057799+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108240896 unmapped: 49176576 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:30.057959+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108240896 unmapped: 49176576 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:31.058294+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108240896 unmapped: 49176576 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.446413994s of 10.506946564s, submitted: 14
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53890d800 session 0x55c53a7a9c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:32.058474+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1391831 data_alloc: 285212672 data_used: 1990656
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108240896 unmapped: 49176576 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53a99e000 session 0x55c53bb5b680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b026c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53b026c00 session 0x55c53afd0f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:33.058659+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108249088 unmapped: 49168384 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b844b000/0x0/0x1bfc00000, data 0x2d2728d/0x2e43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b8449000/0x0/0x1bfc00000, data 0x2d272ff/0x2e45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b533400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:34.058791+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53b533400 session 0x55c53bb5b4a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108257280 unmapped: 49160192 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:35.059034+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108257280 unmapped: 49160192 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c5384f7800 session 0x55c53bb5a780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:36.059224+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108257280 unmapped: 49160192 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53a99e000 session 0x55c53a655680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b026c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:37.059367+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1399759 data_alloc: 285212672 data_used: 1990656
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53890d800 session 0x55c53bb5a5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108281856 unmapped: 49135616 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53b026c00 session 0x55c53b3c34a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4f0000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4f0800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53b4f0000 session 0x55c53bb5a1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:38.059513+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53b4f0800 session 0x55c53b3c2000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c5384f7800 session 0x55c53bb5ba40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108306432 unmapped: 49111040 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53a99e000 session 0x55c538526b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53890d800 session 0x55c53bb5be00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b844b000/0x0/0x1bfc00000, data 0x2d272df/0x2e43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:39.059700+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108380160 unmapped: 49037312 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b026c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53b026c00 session 0x55c53a185c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:40.059865+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108388352 unmapped: 49029120 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:41.060079+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108388352 unmapped: 49029120 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:42.060275+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1395518 data_alloc: 285212672 data_used: 1990656
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108388352 unmapped: 49029120 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:43.060475+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108388352 unmapped: 49029120 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:44.060668+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108388352 unmapped: 49029120 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b844d000/0x0/0x1bfc00000, data 0x2d2726d/0x2e41000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:45.060870+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108388352 unmapped: 49029120 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 13.728454590s of 14.020287514s, submitted: 67
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:46.061024+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108388352 unmapped: 49029120 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:47.061161+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1396992 data_alloc: 285212672 data_used: 1990656
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108388352 unmapped: 49029120 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:48.061294+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108404736 unmapped: 49012736 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c5384f7800 session 0x55c5389a6f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:49.061456+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108404736 unmapped: 49012736 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b844c000/0x0/0x1bfc00000, data 0x2d27337/0x2e42000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:50.061644+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108404736 unmapped: 49012736 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 ms_handle_reset con 0x55c53890d800 session 0x55c53b190960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:51.061868+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108412928 unmapped: 49004544 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:52.062042+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1402519 data_alloc: 285212672 data_used: 2002944
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108412928 unmapped: 49004544 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 ms_handle_reset con 0x55c53a99e000 session 0x55c53b1905a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4f0800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 ms_handle_reset con 0x55c53b4f0800 session 0x55c53880cf00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:53.062249+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108453888 unmapped: 48963584 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 heartbeat osd_stat(store_statfs(0x1b8447000/0x0/0x1bfc00000, data 0x2d2999d/0x2e46000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:54.062430+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108453888 unmapped: 48963584 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:55.062613+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108453888 unmapped: 48963584 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b56ec00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.668917656s of 10.006016731s, submitted: 99
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 heartbeat osd_stat(store_statfs(0x1b8448000/0x0/0x1bfc00000, data 0x2d2999d/0x2e46000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 ms_handle_reset con 0x55c53b56ec00 session 0x55c53d89e1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:56.062770+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108494848 unmapped: 48922624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 heartbeat osd_stat(store_statfs(0x1b8448000/0x0/0x1bfc00000, data 0x2d2999d/0x2e46000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:57.062937+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1401608 data_alloc: 285212672 data_used: 2002944
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108494848 unmapped: 48922624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 181 heartbeat osd_stat(store_statfs(0x1b8443000/0x0/0x1bfc00000, data 0x2d2bdb6/0x2e4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:58.063136+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108494848 unmapped: 48922624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:59.063325+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108494848 unmapped: 48922624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:00.063534+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108494848 unmapped: 48922624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 181 ms_handle_reset con 0x55c5384f7800 session 0x55c53b3c2780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:01.063765+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108494848 unmapped: 48922624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 181 heartbeat osd_stat(store_statfs(0x1b8443000/0x0/0x1bfc00000, data 0x2d2be51/0x2e4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:02.063945+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 181 ms_handle_reset con 0x55c53890d800 session 0x55c53b3c21e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1410248 data_alloc: 285212672 data_used: 2019328
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108503040 unmapped: 48914432 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:03.064111+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108519424 unmapped: 48898048 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 181 ms_handle_reset con 0x55c53a99e000 session 0x55c53a654d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:04.064301+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108519424 unmapped: 48898048 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:05.064489+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108519424 unmapped: 48898048 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 181 heartbeat osd_stat(store_statfs(0x1b8442000/0x0/0x1bfc00000, data 0x2d2beb3/0x2e4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 181 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.835681915s of 10.002489090s, submitted: 54
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:06.064661+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108511232 unmapped: 48906240 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 183 handle_osd_map epochs [182,183], i have 183, src has [1,183]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4f0800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:07.064806+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1420173 data_alloc: 285212672 data_used: 2035712
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108511232 unmapped: 48906240 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 184 ms_handle_reset con 0x55c53ab6e400 session 0x55c53a6654a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:08.064959+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108519424 unmapped: 48898048 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 184 handle_osd_map epochs [183,185], i have 184, src has [1,185]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 184 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 185 ms_handle_reset con 0x55c53b4f0800 session 0x55c53aab03c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 185 heartbeat osd_stat(store_statfs(0x1b842e000/0x0/0x1bfc00000, data 0x2d35573/0x2e5f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:09.065084+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108568576 unmapped: 48848896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 185 heartbeat osd_stat(store_statfs(0x1b842d000/0x0/0x1bfc00000, data 0x2d35a78/0x2e60000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:10.065244+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108568576 unmapped: 48848896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:11.065395+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 108593152 unmapped: 48824320 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:12.065731+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1427166 data_alloc: 285212672 data_used: 2035712
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 186 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a654000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109379584 unmapped: 48037888 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 186 heartbeat osd_stat(store_statfs(0x1b8429000/0x0/0x1bfc00000, data 0x2d37a46/0x2e61000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 187 ms_handle_reset con 0x55c5384f7800 session 0x55c53a665e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 187 ms_handle_reset con 0x55c53890d800 session 0x55c53aab0000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:13.065920+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109387776 unmapped: 48029696 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 50
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:14.066080+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109592576 unmapped: 47824896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:15.066268+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109592576 unmapped: 47824896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:16.066409+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109592576 unmapped: 47824896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 187 heartbeat osd_stat(store_statfs(0x1b842c000/0x0/0x1bfc00000, data 0x2d3a0bf/0x2e62000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:17.066569+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1430446 data_alloc: 285212672 data_used: 2043904
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109592576 unmapped: 47824896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 187 handle_osd_map epochs [187,188], i have 187, src has [1,188]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.480596542s of 11.886313438s, submitted: 425
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:18.066723+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109600768 unmapped: 47816704 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:19.066858+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109633536 unmapped: 47783936 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 189 heartbeat osd_stat(store_statfs(0x1b8421000/0x0/0x1bfc00000, data 0x2d3eb77/0x2e6b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:20.067053+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109633536 unmapped: 47783936 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 189 ms_handle_reset con 0x55c53a99e000 session 0x55c53a6843c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6e400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 189 ms_handle_reset con 0x55c53ab6e400 session 0x55c53a7a9e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:21.067255+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109666304 unmapped: 47751168 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:22.067404+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1446813 data_alloc: 285212672 data_used: 2060288
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109666304 unmapped: 47751168 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 ms_handle_reset con 0x55c5384f7800 session 0x55c53b191e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:23.067566+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109666304 unmapped: 47751168 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 heartbeat osd_stat(store_statfs(0x1b841a000/0x0/0x1bfc00000, data 0x2d4130f/0x2e73000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 ms_handle_reset con 0x55c5384f7c00 session 0x55c53b748000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:24.067759+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 ms_handle_reset con 0x55c53890d800 session 0x55c53a5fd860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 109707264 unmapped: 47710208 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:25.067892+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 110755840 unmapped: 46661632 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 ms_handle_reset con 0x55c53b0a6c00 session 0x55c538917860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:26.068067+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 110764032 unmapped: 46653440 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 heartbeat osd_stat(store_statfs(0x1b6c19000/0x0/0x1bfc00000, data 0x4541346/0x4675000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:27.068264+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1675023 data_alloc: 285212672 data_used: 2076672
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 110845952 unmapped: 46571520 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.630031586s of 10.174038887s, submitted: 152
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:28.068451+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 191 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 119250944 unmapped: 38166528 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:29.068626+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 110927872 unmapped: 46489600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:30.068794+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111034368 unmapped: 46383104 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 192 handle_osd_map epochs [192,192], i have 192, src has [1,192]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:31.068975+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111034368 unmapped: 46383104 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:32.069134+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 192 heartbeat osd_stat(store_statfs(0x1b4410000/0x0/0x1bfc00000, data 0x6d45eb3/0x6e7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2034443 data_alloc: 285212672 data_used: 2101248
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 119332864 unmapped: 38084608 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:33.069284+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111001600 unmapped: 46415872 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 192 heartbeat osd_stat(store_statfs(0x1b2c11000/0x0/0x1bfc00000, data 0x8545eb3/0x867d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:34.069419+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 124215296 unmapped: 33202176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:35.069584+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111206400 unmapped: 46211072 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:36.069718+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 119685120 unmapped: 37732352 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:37.069865+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2343325 data_alloc: 285212672 data_used: 2101248
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 192 handle_osd_map epochs [192,193], i have 192, src has [1,193]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111321088 unmapped: 46096384 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x1b0411000/0x0/0x1bfc00000, data 0xad4604f/0xae7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.254240036s of 10.049238205s, submitted: 143
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:38.070022+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111435776 unmapped: 45981696 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:39.070174+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x1af40c000/0x0/0x1bfc00000, data 0xbd48532/0xbe81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 119898112 unmapped: 37519360 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:40.070354+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 111648768 unmapped: 45768704 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:41.070554+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 112844800 unmapped: 44572672 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:42.070709+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2895777 data_alloc: 285212672 data_used: 2113536
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113082368 unmapped: 44335104 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x1ab40d000/0x0/0x1bfc00000, data 0xfd485fc/0xfe81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:43.070887+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113213440 unmapped: 44204032 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:44.071033+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x1aa40d000/0x0/0x1bfc00000, data 0x10d485fc/0x10e81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113213440 unmapped: 44204032 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:45.071191+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 121626624 unmapped: 35790848 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:46.071412+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 121790464 unmapped: 35627008 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:47.071590+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x1a940e000/0x0/0x1bfc00000, data 0x11d4863b/0x11e80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3279181 data_alloc: 285212672 data_used: 2113536
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113582080 unmapped: 43835392 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.365412712s of 10.136847496s, submitted: 63
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:48.071760+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113647616 unmapped: 43769856 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:49.071959+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed9400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 122167296 unmapped: 35250176 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:50.072146+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x1a640e000/0x0/0x1bfc00000, data 0x14d4863b/0x14e80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113934336 unmapped: 43483136 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:51.072374+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 113999872 unmapped: 43417600 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 51
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:52.072568+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3745341 data_alloc: 285212672 data_used: 2113536
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x1a4c0b000/0x0/0x1bfc00000, data 0x165489b9/0x16683000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 122404864 unmapped: 35012608 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:53.072745+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 114221056 unmapped: 43196416 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:54.072931+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 114278400 unmapped: 43139072 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:55.073243+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 114409472 unmapped: 43008000 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:56.073457+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 114638848 unmapped: 42778624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:57.073669+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4216925 data_alloc: 285212672 data_used: 2113536
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 114712576 unmapped: 42704896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.033995628s of 10.114027977s, submitted: 67
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 ms_handle_reset con 0x55c53ab6fc00 session 0x55c53a665680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:58.073860+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x19ec0b000/0x0/0x1bfc00000, data 0x1c548b38/0x1c683000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115884032 unmapped: 41533440 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:59.074049+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115957760 unmapped: 41459712 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:00.074289+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115990528 unmapped: 41426944 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 heartbeat osd_stat(store_statfs(0x19d408000/0x0/0x1bfc00000, data 0x1dd48d87/0x1de86000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 194 ms_handle_reset con 0x55c5384f7c00 session 0x55c53bb5b0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:01.074555+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 194 ms_handle_reset con 0x55c53890d800 session 0x55c53b4cf0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115425280 unmapped: 41992192 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 194 handle_osd_map epochs [194,195], i have 194, src has [1,195]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:02.074736+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 195 ms_handle_reset con 0x55c53ab6fc00 session 0x55c53b4ce960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 195 ms_handle_reset con 0x55c5384f7800 session 0x55c53b4ce5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4641680 data_alloc: 285212672 data_used: 2125824
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 123961344 unmapped: 33456128 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:03.074886+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115630080 unmapped: 41787392 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a6c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:04.075087+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 195 ms_handle_reset con 0x55c53b0a6c00 session 0x55c538a74000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 124116992 unmapped: 33300480 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:05.075265+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 124952576 unmapped: 32464896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 195 ms_handle_reset con 0x55c5384f7800 session 0x55c53b2ea5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:06.075493+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 195 heartbeat osd_stat(store_statfs(0x199674000/0x0/0x1bfc00000, data 0x216d902f/0x2181a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4d6f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116629504 unmapped: 40787968 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:07.075638+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4799592 data_alloc: 285212672 data_used: 2129920
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 195 ms_handle_reset con 0x55c53ab6fc00 session 0x55c53a684780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116686848 unmapped: 40730624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 196 heartbeat osd_stat(store_statfs(0x199f77000/0x0/0x1bfc00000, data 0x20d4e02f/0x20e8f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4d6f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 196 ms_handle_reset con 0x55c5384f7c00 session 0x55c53b2eaf00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.263569832s of 10.145400047s, submitted: 207
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:08.075781+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 196 ms_handle_reset con 0x55c53a714000 session 0x55c53ae061e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 197 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116727808 unmapped: 40689664 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 197 ms_handle_reset con 0x55c53a99e000 session 0x55c53b545860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 197 ms_handle_reset con 0x55c5384f7800 session 0x55c53a1741e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 197 ms_handle_reset con 0x55c53890d800 session 0x55c538a752c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:09.075917+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116744192 unmapped: 40673280 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:10.076050+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116744192 unmapped: 40673280 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:11.076275+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 198 ms_handle_reset con 0x55c53a714000 session 0x55c5389a6d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 198 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a175860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53ab6fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115351552 unmapped: 42065920 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 198 ms_handle_reset con 0x55c53ab6fc00 session 0x55c53a6843c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:12.076439+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 198 ms_handle_reset con 0x55c5384f7800 session 0x55c53b3c2000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1584238 data_alloc: 285212672 data_used: 2162688
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115359744 unmapped: 42057728 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 199 heartbeat osd_stat(store_statfs(0x1b7ff7000/0x0/0x1bfc00000, data 0x2d55223/0x2e97000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4d6f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:13.076606+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 199 ms_handle_reset con 0x55c5384f7c00 session 0x55c53b3c21e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115359744 unmapped: 42057728 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:14.076751+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115367936 unmapped: 42049536 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 200 ms_handle_reset con 0x55c53890d800 session 0x55c53b3c3a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:15.076901+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 200 ms_handle_reset con 0x55c53a714000 session 0x55c53bb5ba40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115048448 unmapped: 42369024 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:16.077076+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b532c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 201 ms_handle_reset con 0x55c53b532c00 session 0x55c53b544780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115089408 unmapped: 42328064 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:17.077270+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1598884 data_alloc: 285212672 data_used: 2166784
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 202 heartbeat osd_stat(store_statfs(0x1b7feb000/0x0/0x1bfc00000, data 0x2d5ca9a/0x2ea1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4d6f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115113984 unmapped: 42303488 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.096128464s of 10.042016983s, submitted: 355
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:18.077447+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 203 ms_handle_reset con 0x55c5384f7800 session 0x55c53b544f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115367936 unmapped: 42049536 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 203 ms_handle_reset con 0x55c53890d800 session 0x55c53ae07680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:19.077667+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 115466240 unmapped: 41951232 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 204 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a185860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 204 ms_handle_reset con 0x55c53a714000 session 0x55c53bb5be00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:20.077867+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890cc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b0a7400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 204 ms_handle_reset con 0x55c53b0a7400 session 0x55c53ae06d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116637696 unmapped: 40779776 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 205 ms_handle_reset con 0x55c53890cc00 session 0x55c53a60f4a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:21.078087+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 205 handle_osd_map epochs [205,206], i have 205, src has [1,206]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116703232 unmapped: 40714240 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:22.078254+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1626669 data_alloc: 285212672 data_used: 2191360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116752384 unmapped: 40665088 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:23.078417+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b8f90000/0x0/0x1bfc00000, data 0x2d8c88c/0x2edc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d8f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116768768 unmapped: 40648704 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 207 ms_handle_reset con 0x55c5384f7800 session 0x55c53d89f4a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:24.078600+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 207 handle_osd_map epochs [207,208], i have 207, src has [1,208]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 116801536 unmapped: 40615936 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 208 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a6854a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:25.078752+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 208 ms_handle_reset con 0x55c53a714000 session 0x55c53880c1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 208 ms_handle_reset con 0x55c53890d800 session 0x55c53b4cf2c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117112832 unmapped: 40304640 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:26.078875+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 208 ms_handle_reset con 0x55c5384f7800 session 0x55c538914f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117121024 unmapped: 40296448 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:27.079044+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 208 ms_handle_reset con 0x55c5384f7c00 session 0x55c5389141e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1627541 data_alloc: 285212672 data_used: 2203648
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117235712 unmapped: 40181760 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.040109634s of 10.028843880s, submitted: 317
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:28.079171+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117243904 unmapped: 40173568 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 209 heartbeat osd_stat(store_statfs(0x1b8f6e000/0x0/0x1bfc00000, data 0x2db065e/0x2eff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d8f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:29.079352+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117243904 unmapped: 40173568 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:30.079514+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117374976 unmapped: 40042496 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:31.079752+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890cc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 209 ms_handle_reset con 0x55c53890cc00 session 0x55c53a185860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117374976 unmapped: 40042496 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 209 heartbeat osd_stat(store_statfs(0x1b8f69000/0x0/0x1bfc00000, data 0x2db6017/0x2f05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d8f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:32.079886+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1634829 data_alloc: 285212672 data_used: 2215936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117440512 unmapped: 39976960 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:33.080123+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 210 ms_handle_reset con 0x55c53a714000 session 0x55c5389a6b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117522432 unmapped: 39895040 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:34.080341+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117530624 unmapped: 39886848 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c537f3e400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:35.080526+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 211 ms_handle_reset con 0x55c537f3e400 session 0x55c53ae07680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117620736 unmapped: 39796736 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 211 heartbeat osd_stat(store_statfs(0x1b8f4b000/0x0/0x1bfc00000, data 0x2dce81c/0x2f22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d8f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 211 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 211 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 211 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 211 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:36.080680+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 212 ms_handle_reset con 0x55c5384f7800 session 0x55c53b545860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 212 ms_handle_reset con 0x55c5384f7c00 session 0x55c53b544780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117637120 unmapped: 39780352 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890cc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 213 ms_handle_reset con 0x55c53890cc00 session 0x55c53bb5b0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:37.080815+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1652359 data_alloc: 285212672 data_used: 2244608
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 213 handle_osd_map epochs [213,214], i have 213, src has [1,214]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117891072 unmapped: 39526400 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 214 ms_handle_reset con 0x55c53a714000 session 0x55c53bb5ba40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b533c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 214 ms_handle_reset con 0x55c53b533c00 session 0x55c53b3c2000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:38.080970+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.891691208s of 10.125980377s, submitted: 73
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 214 ms_handle_reset con 0x55c5384f7800 session 0x55c53b3c21e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117874688 unmapped: 39542784 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:39.081124+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117940224 unmapped: 39477248 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 214 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a684780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:40.081290+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117948416 unmapped: 39469056 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890cc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:41.081499+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117710848 unmapped: 39706624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 215 heartbeat osd_stat(store_statfs(0x1b8f16000/0x0/0x1bfc00000, data 0x2dfe895/0x2f58000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d8f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 215 ms_handle_reset con 0x55c53890cc00 session 0x55c53b2eb4a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:42.081659+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 215 handle_osd_map epochs [215,216], i have 215, src has [1,216]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1665078 data_alloc: 285212672 data_used: 2252800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 117710848 unmapped: 39706624 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:43.081820+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b533000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 ms_handle_reset con 0x55c53b533000 session 0x55c53b191c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 ms_handle_reset con 0x55c53a714000 session 0x55c53a175860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 118841344 unmapped: 38576128 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:44.082015+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 ms_handle_reset con 0x55c5384f7800 session 0x55c539371e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 ms_handle_reset con 0x55c5384f7c00 session 0x55c539371860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890cc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 ms_handle_reset con 0x55c53890cc00 session 0x55c53aab03c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 118865920 unmapped: 38551552 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:45.082163+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 118980608 unmapped: 38436864 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:46.082347+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 119078912 unmapped: 38338560 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:47.082510+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1673951 data_alloc: 285212672 data_used: 2256896
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 heartbeat osd_stat(store_statfs(0x1b7d35000/0x0/0x1bfc00000, data 0x2e387bf/0x2f99000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 119078912 unmapped: 38338560 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:48.082691+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.024193764s of 10.264261246s, submitted: 64
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 118685696 unmapped: 38731776 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:49.082868+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 118808576 unmapped: 38608896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 ms_handle_reset con 0x55c53a714000 session 0x55c539370960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b533000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:50.083018+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 118808576 unmapped: 38608896 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 218 ms_handle_reset con 0x55c53b533000 session 0x55c53d89e5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:51.083191+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 218 ms_handle_reset con 0x55c5384f7800 session 0x55c53d89ed20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 218 ms_handle_reset con 0x55c5384f7c00 session 0x55c538a75860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890cc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 218 ms_handle_reset con 0x55c53890cc00 session 0x55c53a664780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 119898112 unmapped: 37519360 heap: 157417472 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a714000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:52.083349+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 218 heartbeat osd_stat(store_statfs(0x1b7d10000/0x0/0x1bfc00000, data 0x2e5ac45/0x2fbd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1847751 data_alloc: 285212672 data_used: 2265088
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128548864 unmapped: 41476096 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:53.083495+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 132972544 unmapped: 37052416 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:54.083631+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 218 ms_handle_reset con 0x55c53afa6800 session 0x55c5389a7e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 127918080 unmapped: 42106880 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:55.083804+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 133144576 unmapped: 36880384 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:56.083948+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 124780544 unmapped: 45244416 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:57.084140+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3494007 data_alloc: 285212672 data_used: 2277376
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 129089536 unmapped: 40935424 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:58.084339+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 heartbeat osd_stat(store_statfs(0x1a80f5000/0x0/0x1bfc00000, data 0x12a771a5/0x12bd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 7.361514568s of 10.010675430s, submitted: 396
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 125190144 unmapped: 44834816 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:59.084521+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 133627904 unmapped: 36397056 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 heartbeat osd_stat(store_statfs(0x1a38dc000/0x0/0x1bfc00000, data 0x1728cf5f/0x173f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:00.084703+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 134807552 unmapped: 35217408 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890d000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:01.084974+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c53890d000 session 0x55c53ad26b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 139100160 unmapped: 30924800 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:02.085178+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5126147 data_alloc: 285212672 data_used: 2277376
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 130834432 unmapped: 39190528 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c5384f7800 session 0x55c539371860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:03.085407+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 131178496 unmapped: 38846464 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:04.085572+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c5384f7c00 session 0x55c53aab03c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135577600 unmapped: 34447360 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:05.085710+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890cc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c53890cc00 session 0x55c53ae06d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 heartbeat osd_stat(store_statfs(0x193cac000/0x0/0x1bfc00000, data 0x26abbf7e/0x26c22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,1,1,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 141336576 unmapped: 28688384 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:06.085866+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c538909400 session 0x55c53a60f0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c53a714000 session 0x55c53a1825a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c53afa6800 session 0x55c53d89e000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a7103c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128221184 unmapped: 41803776 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c5384f7800 session 0x55c53b3c2f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:07.086031+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c538909400 session 0x55c53bb5a1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53890cc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5997523 data_alloc: 285212672 data_used: 2281472
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128335872 unmapped: 41689088 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:08.086264+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 ms_handle_reset con 0x55c53890cc00 session 0x55c538915860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 6.219436646s of 10.141233444s, submitted: 513
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 127819776 unmapped: 42205184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:09.086454+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 127590400 unmapped: 42434560 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 220 heartbeat osd_stat(store_statfs(0x1b607e000/0x0/0x1bfc00000, data 0x2eecfb2/0x304f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:10.086624+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 220 handle_osd_map epochs [220,221], i have 220, src has [1,221]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 127770624 unmapped: 42254336 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:11.086842+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 127770624 unmapped: 42254336 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:12.087009+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 222 handle_osd_map epochs [221,222], i have 222, src has [1,222]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 222 ms_handle_reset con 0x55c53aed9400 session 0x55c53a67a5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1893635 data_alloc: 285212672 data_used: 2301952
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128614400 unmapped: 41410560 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:13.087227+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 52
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128729088 unmapped: 41295872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:14.087398+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 223 ms_handle_reset con 0x55c5384f7800 session 0x55c538527860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 223 heartbeat osd_stat(store_statfs(0x1b783c000/0x0/0x1bfc00000, data 0x2f264fb/0x3091000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128729088 unmapped: 41295872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:15.087533+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128729088 unmapped: 41295872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:16.087671+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128729088 unmapped: 41295872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 223 handle_osd_map epochs [223,224], i have 223, src has [1,224]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:17.087862+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 224 ms_handle_reset con 0x55c5384f7c00 session 0x55c539707860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1901428 data_alloc: 285212672 data_used: 2326528
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128729088 unmapped: 41295872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538909400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:18.088019+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 ms_handle_reset con 0x55c53afa6800 session 0x55c53a6654a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 ms_handle_reset con 0x55c538909400 session 0x55c53ad274a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 ms_handle_reset con 0x55c5384f7800 session 0x55c5389a7c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 128761856 unmapped: 41263104 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.621878624s of 10.432227135s, submitted: 567
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 ms_handle_reset con 0x55c5384f7c00 session 0x55c53afd1a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed9400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:19.088224+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 heartbeat osd_stat(store_statfs(0x1b7818000/0x0/0x1bfc00000, data 0x2f45d5d/0x30b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 ms_handle_reset con 0x55c53aed9400 session 0x55c53afd1860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 129982464 unmapped: 40042496 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:20.088388+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 ms_handle_reset con 0x55c53afa6800 session 0x55c53b5443c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c813c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 131080192 unmapped: 38944768 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 heartbeat osd_stat(store_statfs(0x1b77ef000/0x0/0x1bfc00000, data 0x2f7206d/0x30df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:21.088606+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 225 handle_osd_map epochs [225,226], i have 225, src has [1,226]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 226 ms_handle_reset con 0x55c53c813c00 session 0x55c538915a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 226 ms_handle_reset con 0x55c5384f7800 session 0x55c53a185c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 131162112 unmapped: 38862848 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:22.088774+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 226 handle_osd_map epochs [226,227], i have 226, src has [1,227]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 227 ms_handle_reset con 0x55c5384f7c00 session 0x55c53afd1c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed9400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 227 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 228 handle_osd_map epochs [227,228], i have 228, src has [1,228]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1918658 data_alloc: 285212672 data_used: 2342912
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 228 ms_handle_reset con 0x55c53aed9400 session 0x55c53880c780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 131440640 unmapped: 38584320 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:23.088936+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 228 heartbeat osd_stat(store_statfs(0x1b77da000/0x0/0x1bfc00000, data 0x2f85120/0x30f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 228 ms_handle_reset con 0x55c53afa6800 session 0x55c53afd1680
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 131440640 unmapped: 38584320 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:24.089067+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b592c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 228 ms_handle_reset con 0x55c53b592c00 session 0x55c53d89e1e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 142311424 unmapped: 27713536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:25.089299+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 228 ms_handle_reset con 0x55c5384f7800 session 0x55c53ae06b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 229 ms_handle_reset con 0x55c5384f7c00 session 0x55c53b4cfa40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 131768320 unmapped: 38256640 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:26.089445+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed9400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 229 ms_handle_reset con 0x55c53afa6800 session 0x55c53a7a90e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 229 ms_handle_reset con 0x55c53aed9400 session 0x55c53ae074a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 131899392 unmapped: 38125568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:27.089617+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538b96400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 229 ms_handle_reset con 0x55c538b96400 session 0x55c53d89fe00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 229 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2023689 data_alloc: 285212672 data_used: 2359296
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 230 ms_handle_reset con 0x55c5384f7800 session 0x55c53aab0d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 132153344 unmapped: 37871616 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538b96400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:28.089760+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 230 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a7a9e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed9400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 230 ms_handle_reset con 0x55c53aed9400 session 0x55c53d89f0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53afa6800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 231 ms_handle_reset con 0x55c538b96400 session 0x55c53b3c3e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 133603328 unmapped: 36421632 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:29.089910+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 231 heartbeat osd_stat(store_statfs(0x1b61df000/0x0/0x1bfc00000, data 0x4579760/0x46ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 231 handle_osd_map epochs [231,232], i have 231, src has [1,232]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.364305496s of 10.601664543s, submitted: 291
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 232 ms_handle_reset con 0x55c53afa6800 session 0x55c53aab0b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 232 ms_handle_reset con 0x55c5384f7800 session 0x55c53a6630e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 134651904 unmapped: 35373056 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:30.090117+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 232 ms_handle_reset con 0x55c5384f7c00 session 0x55c53b3c21e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538b96400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 134742016 unmapped: 35282944 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:31.090344+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 232 ms_handle_reset con 0x55c538b96400 session 0x55c53b4ce5a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed9400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4adc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f6400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 134283264 unmapped: 35741696 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:32.090563+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b532000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 232 ms_handle_reset con 0x55c53b4adc00 session 0x55c538a743c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 232 ms_handle_reset con 0x55c53b532000 session 0x55c53a5fd0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2172323 data_alloc: 285212672 data_used: 2383872
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 233 ms_handle_reset con 0x55c5384f6400 session 0x55c53a711a40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135028736 unmapped: 34996224 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:33.090772+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 233 ms_handle_reset con 0x55c53aed9400 session 0x55c53afd05a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 233 handle_osd_map epochs [233,234], i have 233, src has [1,234]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 134995968 unmapped: 35028992 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:34.090987+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 234 heartbeat osd_stat(store_statfs(0x1b4965000/0x0/0x1bfc00000, data 0x4c4cea8/0x4dc9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 235 ms_handle_reset con 0x55c5384f7800 session 0x55c5393701e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135086080 unmapped: 34938880 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:35.091169+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 235 ms_handle_reset con 0x55c5384f7c00 session 0x55c53880d0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135249920 unmapped: 34775040 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:36.091340+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 235 heartbeat osd_stat(store_statfs(0x1b5011000/0x0/0x1bfc00000, data 0x459e45b/0x471c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538b96400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 235 heartbeat osd_stat(store_statfs(0x1b4fed000/0x0/0x1bfc00000, data 0x45c18f9/0x4740000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135258112 unmapped: 34766848 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:37.091521+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 236 handle_osd_map epochs [236,237], i have 236, src has [1,237]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 236 handle_osd_map epochs [236,237], i have 237, src has [1,237]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 237 handle_osd_map epochs [236,237], i have 237, src has [1,237]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 237 ms_handle_reset con 0x55c538b96400 session 0x55c539370b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2057216 data_alloc: 285212672 data_used: 2404352
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135364608 unmapped: 34660352 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:38.091650+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135364608 unmapped: 34660352 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 237 heartbeat osd_stat(store_statfs(0x1b5a10000/0x0/0x1bfc00000, data 0x3b9c2c9/0x3d1c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:39.091790+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 237 handle_osd_map epochs [237,238], i have 237, src has [1,238]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f6400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.157032013s of 10.002032280s, submitted: 247
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 238 ms_handle_reset con 0x55c5384f6400 session 0x55c5393732c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135380992 unmapped: 34643968 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:40.091967+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 238 ms_handle_reset con 0x55c5384f7800 session 0x55c53a662b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135659520 unmapped: 34365440 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:41.092281+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135667712 unmapped: 34357248 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:42.092446+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1998291 data_alloc: 285212672 data_used: 2416640
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135667712 unmapped: 34357248 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:43.092574+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135618560 unmapped: 34406400 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:44.092691+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b6574000/0x0/0x1bfc00000, data 0x3031e59/0x31b8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 135675904 unmapped: 34349056 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:45.092853+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 136790016 unmapped: 33234944 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:46.092985+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 136380416 unmapped: 33644544 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:47.093115+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 241 handle_osd_map epochs [241,242], i have 241, src has [1,242]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2003969 data_alloc: 285212672 data_used: 2428928
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 136388608 unmapped: 33636352 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:48.093264+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 242 heartbeat osd_stat(store_statfs(0x1b6520000/0x0/0x1bfc00000, data 0x3086162/0x320d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 242 ms_handle_reset con 0x55c5384f7c00 session 0x55c53a1741e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53aed9400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 242 ms_handle_reset con 0x55c53aed9400 session 0x55c53a183860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b4adc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 242 ms_handle_reset con 0x55c53b4adc00 session 0x55c5389174a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f6400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 136962048 unmapped: 33062912 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 242 ms_handle_reset con 0x55c5384f6400 session 0x55c53b2eab40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:49.093410+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 242 ms_handle_reset con 0x55c5384f7800 session 0x55c53aab0000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 137052160 unmapped: 32972800 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:50.093539+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.881058693s of 10.671497345s, submitted: 273
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 136945664 unmapped: 33079296 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:51.093695+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 136945664 unmapped: 33079296 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:52.093835+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b56f400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2104191 data_alloc: 285212672 data_used: 2445312
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 136945664 unmapped: 33079296 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:53.093983+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 137117696 unmapped: 32907264 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:54.094168+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b5588000/0x0/0x1bfc00000, data 0x3c18423/0x3da5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 137306112 unmapped: 32718848 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:55.094375+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 142303232 unmapped: 27721728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:56.094500+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b5555000/0x0/0x1bfc00000, data 0x3c4d05e/0x3dd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 142491648 unmapped: 27533312 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:57.094654+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2165823 data_alloc: 301989888 data_used: 10350592
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 142491648 unmapped: 27533312 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:58.094798+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 143622144 unmapped: 26402816 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:59.094930+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 143622144 unmapped: 26402816 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:00.095106+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b5536000/0x0/0x1bfc00000, data 0x3c6aedb/0x3df8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.895036697s of 10.094698906s, submitted: 57
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 143998976 unmapped: 26025984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:01.095301+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b54e4000/0x0/0x1bfc00000, data 0x3cb9384/0x3e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 143679488 unmapped: 26345472 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:02.095443+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2176959 data_alloc: 301989888 data_used: 10350592
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b54e4000/0x0/0x1bfc00000, data 0x3cb944e/0x3e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 143761408 unmapped: 26263552 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:03.095625+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b54e4000/0x0/0x1bfc00000, data 0x3cb944e/0x3e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 143949824 unmapped: 26075136 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:04.095767+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 144146432 unmapped: 25878528 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:05.095923+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 146423808 unmapped: 23601152 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:06.096062+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b5499000/0x0/0x1bfc00000, data 0x3d04ca7/0x3e95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 148127744 unmapped: 21897216 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:07.096239+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2269495 data_alloc: 301989888 data_used: 11493376
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:08.096432+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150142976 unmapped: 19881984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:09.096597+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150233088 unmapped: 19791872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:10.096806+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149659648 unmapped: 20365312 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.336468697s of 10.075609207s, submitted: 173
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:11.096990+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149741568 unmapped: 20283392 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:12.097141+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149741568 unmapped: 20283392 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4a8d000/0x0/0x1bfc00000, data 0x471155f/0x48a1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2269183 data_alloc: 301989888 data_used: 11497472
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:13.097317+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149872640 unmapped: 20152320 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4a6b000/0x0/0x1bfc00000, data 0x4733e40/0x48c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4a54000/0x0/0x1bfc00000, data 0x474ae4d/0x48da000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:14.097468+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149962752 unmapped: 20062208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4a51000/0x0/0x1bfc00000, data 0x474dac0/0x48dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:15.097637+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149987328 unmapped: 20037632 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:16.097798+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150151168 unmapped: 19873792 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:17.097917+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150151168 unmapped: 19873792 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2273179 data_alloc: 301989888 data_used: 11497472
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:18.098078+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150151168 unmapped: 19873792 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4a0f000/0x0/0x1bfc00000, data 0x478f193/0x491f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:19.098278+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151412736 unmapped: 18612224 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:20.098457+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151494656 unmapped: 18530304 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.732328415s of 10.002904892s, submitted: 58
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:21.098673+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151470080 unmapped: 18554880 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:22.098866+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151379968 unmapped: 18644992 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2281717 data_alloc: 301989888 data_used: 11497472
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:23.099043+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151379968 unmapped: 18644992 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:24.099184+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151379968 unmapped: 18644992 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4977000/0x0/0x1bfc00000, data 0x4826c84/0x49b6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:25.099382+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151535616 unmapped: 18489344 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 244 heartbeat osd_stat(store_statfs(0x1b493b000/0x0/0x1bfc00000, data 0x4863b3d/0x49f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:26.099532+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151961600 unmapped: 18063360 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:27.099682+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151969792 unmapped: 18055168 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2282061 data_alloc: 301989888 data_used: 11509760
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:28.099878+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151429120 unmapped: 18595840 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53d8fa800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:29.100043+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151429120 unmapped: 18595840 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:30.100267+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151470080 unmapped: 18554880 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 244 heartbeat osd_stat(store_statfs(0x1b48fb000/0x0/0x1bfc00000, data 0x48a23ea/0x4a33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.618723869s of 10.006350517s, submitted: 99
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:31.100497+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151470080 unmapped: 18554880 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53dcd1400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 244 ms_handle_reset con 0x55c53dcd1400 session 0x55c53a183c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538b96c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 245 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:32.100638+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152535040 unmapped: 17489920 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53d2b9800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 245 ms_handle_reset con 0x55c53d2b9800 session 0x55c53b3c2f00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00e400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 246 ms_handle_reset con 0x55c53c00e400 session 0x55c53ae072c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 246 ms_handle_reset con 0x55c538b96c00 session 0x55c53b3c2b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2306165 data_alloc: 301989888 data_used: 11526144
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:33.100804+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153591808 unmapped: 16433152 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 246 heartbeat osd_stat(store_statfs(0x1b48c4000/0x0/0x1bfc00000, data 0x48d0599/0x4a68000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:34.100968+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00e400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152305664 unmapped: 17719296 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:35.101123+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152395776 unmapped: 17629184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f6400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 247 ms_handle_reset con 0x55c5384f6400 session 0x55c53880d4a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 247 handle_osd_map epochs [247,248], i have 247, src has [1,248]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:36.101244+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152395776 unmapped: 17629184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 248 ms_handle_reset con 0x55c5384f7800 session 0x55c53b0f3e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 248 ms_handle_reset con 0x55c53c00e400 session 0x55c53b749860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b4890000/0x0/0x1bfc00000, data 0x490381b/0x4a9d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:37.101459+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152395776 unmapped: 17629184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2311225 data_alloc: 301989888 data_used: 11542528
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:38.101587+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152395776 unmapped: 17629184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53d2b9800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b486f000/0x0/0x1bfc00000, data 0x492266c/0x4abe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:39.101746+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 248 handle_osd_map epochs [248,249], i have 248, src has [1,249]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150847488 unmapped: 19177472 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 249 ms_handle_reset con 0x55c53d2b9800 session 0x55c53ad263c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:40.101917+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 53
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150888448 unmapped: 19136512 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.616605759s of 10.001934052s, submitted: 103
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f6400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:41.102108+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152182784 unmapped: 17842176 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538b96c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 250 ms_handle_reset con 0x55c538b96c00 session 0x55c539707e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00e400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:42.102260+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152281088 unmapped: 17743872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 251 ms_handle_reset con 0x55c53c00e400 session 0x55c53b2ea780
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 251 ms_handle_reset con 0x55c5384f7800 session 0x55c53a7103c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2339103 data_alloc: 301989888 data_used: 11583488
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:43.102576+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152608768 unmapped: 17416192 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 251 heartbeat osd_stat(store_statfs(0x1b47de000/0x0/0x1bfc00000, data 0x49add23/0x4b50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:44.102724+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152674304 unmapped: 17350656 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53dcd1400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:45.102905+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152682496 unmapped: 17342464 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b4795000/0x0/0x1bfc00000, data 0x49f449d/0x4b99000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 252 ms_handle_reset con 0x55c53dcd1400 session 0x55c53ad26d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:46.103080+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152887296 unmapped: 17137664 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:47.103242+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152911872 unmapped: 17113088 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2355982 data_alloc: 301989888 data_used: 11595776
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:48.103378+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 253 ms_handle_reset con 0x55c53c00fc00 session 0x55c53ad26960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152911872 unmapped: 17113088 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 253 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:49.103559+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153608192 unmapped: 16416768 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 254 ms_handle_reset con 0x55c53d8fa800 session 0x55c53a6643c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 254 heartbeat osd_stat(store_statfs(0x1b473e000/0x0/0x1bfc00000, data 0x4a47174/0x4bef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:50.103733+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153616384 unmapped: 16408576 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 254 ms_handle_reset con 0x55c53b56f400 session 0x55c53d89fa40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.375536919s of 10.000599861s, submitted: 226
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:51.103945+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149716992 unmapped: 20307968 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 255 ms_handle_reset con 0x55c53c00fc00 session 0x55c53880d2c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:52.104076+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149987328 unmapped: 20037632 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2147940 data_alloc: 285212672 data_used: 2547712
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:53.104267+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149987328 unmapped: 20037632 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:54.104432+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151068672 unmapped: 18956288 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 257 heartbeat osd_stat(store_statfs(0x1b5bf8000/0x0/0x1bfc00000, data 0x358ee2f/0x3735000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:55.104617+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 151068672 unmapped: 18956288 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 257 heartbeat osd_stat(store_statfs(0x1b5bf5000/0x0/0x1bfc00000, data 0x3591416/0x3737000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 257 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 257 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 257 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:56.104792+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150183936 unmapped: 19841024 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 259 heartbeat osd_stat(store_statfs(0x1b5ba3000/0x0/0x1bfc00000, data 0x35e31a5/0x378a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:57.104992+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149110784 unmapped: 20914176 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 260 ms_handle_reset con 0x55c5384f7800 session 0x55c53b748960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c538b96c00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:58.105270+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2156204 data_alloc: 285212672 data_used: 2560000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 260 ms_handle_reset con 0x55c538b96c00 session 0x55c53b0f32c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 149266432 unmapped: 20758528 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 260 heartbeat osd_stat(store_statfs(0x1b5b9e000/0x0/0x1bfc00000, data 0x35e7f58/0x378f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:59.105411+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150315008 unmapped: 19709952 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:00.105577+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150315008 unmapped: 19709952 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:01.105814+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150315008 unmapped: 19709952 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 260 heartbeat osd_stat(store_statfs(0x1b5b9f000/0x0/0x1bfc00000, data 0x35e8015/0x378d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:02.105948+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150315008 unmapped: 19709952 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 260 handle_osd_map epochs [260,261], i have 260, src has [1,261]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.374423027s of 12.215964317s, submitted: 321
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:03.106118+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2158334 data_alloc: 285212672 data_used: 2555904
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150323200 unmapped: 19701760 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:04.106286+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 262 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150331392 unmapped: 19693568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:05.106426+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 262 ms_handle_reset con 0x55c5384f7800 session 0x55c53aab05a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150331392 unmapped: 19693568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:06.106620+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b5b97000/0x0/0x1bfc00000, data 0x35ecae3/0x3796000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b56f400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150339584 unmapped: 19685376 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 263 ms_handle_reset con 0x55c53b56f400 session 0x55c5389141e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:07.106765+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150364160 unmapped: 19660800 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 264 ms_handle_reset con 0x55c53c00fc00 session 0x55c53d89f0e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:08.106901+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2175758 data_alloc: 285212672 data_used: 2560000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150388736 unmapped: 19636224 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53d8fa800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 264 handle_osd_map epochs [264,265], i have 264, src has [1,265]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 265 ms_handle_reset con 0x55c53d8fa800 session 0x55c53b544000
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:09.107139+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150405120 unmapped: 19619840 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:10.107425+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150405120 unmapped: 19619840 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00e400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 265 ms_handle_reset con 0x55c53c00e400 session 0x55c53a664d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 266 heartbeat osd_stat(store_statfs(0x1b5b86000/0x0/0x1bfc00000, data 0x35f3c5f/0x37a5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 266 ms_handle_reset con 0x55c5384f7800 session 0x55c5396434a0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:11.107643+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150413312 unmapped: 19611648 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b56f400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 266 ms_handle_reset con 0x55c53b56f400 session 0x55c538914d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 267 ms_handle_reset con 0x55c53c00fc00 session 0x55c53b4cf2c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:12.107810+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150437888 unmapped: 19587072 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53d8fa800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.376722336s of 10.003239632s, submitted: 177
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 267 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 267 ms_handle_reset con 0x55c53d8fa800 session 0x55c539354960
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53dcd1400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:13.107983+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2186073 data_alloc: 285212672 data_used: 2584576
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150454272 unmapped: 19570688 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 269 ms_handle_reset con 0x55c53dcd1400 session 0x55c5393732c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:14.108246+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150462464 unmapped: 19562496 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c5384f7800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 269 ms_handle_reset con 0x55c5384f7800 session 0x55c53bb5bc20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53b56f400
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:15.108402+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150478848 unmapped: 19546112 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 269 ms_handle_reset con 0x55c53b56f400 session 0x55c53a663c20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:16.108538+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150487040 unmapped: 19537920 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 269 heartbeat osd_stat(store_statfs(0x1b5b7c000/0x0/0x1bfc00000, data 0x35fd145/0x37b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:17.108697+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150487040 unmapped: 19537920 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 269 handle_osd_map epochs [269,270], i have 269, src has [1,270]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 270 heartbeat osd_stat(store_statfs(0x1b5b77000/0x0/0x1bfc00000, data 0x35ff57a/0x37b6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:18.108873+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2191700 data_alloc: 285212672 data_used: 2596864
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150487040 unmapped: 19537920 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 270 heartbeat osd_stat(store_statfs(0x1b5b76000/0x0/0x1bfc00000, data 0x35ff615/0x37b7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:19.109096+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150487040 unmapped: 19537920 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:20.109286+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150487040 unmapped: 19537920 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:21.109466+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 270 heartbeat osd_stat(store_statfs(0x1b5b76000/0x0/0x1bfc00000, data 0x35ff6b0/0x37b8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:22.109650+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 270 handle_osd_map epochs [270,271], i have 270, src has [1,271]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.689367294s of 10.014828682s, submitted: 96
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:23.109810+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2198206 data_alloc: 285212672 data_used: 2609152
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:24.109962+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b5b71000/0x0/0x1bfc00000, data 0x3601ac9/0x37bc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:25.110133+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b5b72000/0x0/0x1bfc00000, data 0x3601af8/0x37bb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:26.110337+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:27.110452+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:28.110625+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2195624 data_alloc: 285212672 data_used: 2609152
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:29.110793+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b5b75000/0x0/0x1bfc00000, data 0x3601b56/0x37b9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:30.110987+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:31.111242+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:32.111428+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b5b75000/0x0/0x1bfc00000, data 0x3601b56/0x37b9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:33.111604+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2195624 data_alloc: 285212672 data_used: 2609152
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150495232 unmapped: 19529728 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:34.111743+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.405589104s of 11.490046501s, submitted: 26
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:35.111942+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:36.112126+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b5b75000/0x0/0x1bfc00000, data 0x3601c20/0x37b9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:37.112259+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:38.112393+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2194758 data_alloc: 285212672 data_used: 2605056
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:39.112583+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:40.112738+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b5b76000/0x0/0x1bfc00000, data 0x3601c53/0x37b8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:41.112946+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:42.113093+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 150503424 unmapped: 19521536 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 ms_handle_reset con 0x55c5384f6400 session 0x55c5389163c0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:43.113309+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2193716 data_alloc: 285212672 data_used: 2605056
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b5b77000/0x0/0x1bfc00000, data 0x3601c1d/0x37b7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152018944 unmapped: 18006016 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b5b77000/0x0/0x1bfc00000, data 0x3601c1d/0x37b7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 54
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:44.113498+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.899184227s of 10.007975578s, submitted: 316
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:45.113655+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:46.113798+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 272 heartbeat osd_stat(store_statfs(0x1b5b76000/0x0/0x1bfc00000, data 0x3601d82/0x37b8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:47.113948+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:48.114090+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2199510 data_alloc: 285212672 data_used: 2617344
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:49.114273+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:50.114468+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:51.114674+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 272 heartbeat osd_stat(store_statfs(0x1b5b71000/0x0/0x1bfc00000, data 0x360434e/0x37bc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 272 heartbeat osd_stat(store_statfs(0x1b5b71000/0x0/0x1bfc00000, data 0x360434e/0x37bc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:52.114826+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 272 handle_osd_map epochs [272,273], i have 272, src has [1,273]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:53.114967+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2205806 data_alloc: 285212672 data_used: 2629632
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152010752 unmapped: 18014208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:54.115156+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.774814606s of 10.008799553s, submitted: 74
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152018944 unmapped: 18006016 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:55.115284+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152018944 unmapped: 18006016 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b5b6b000/0x0/0x1bfc00000, data 0x36069a8/0x37c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:56.115495+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152027136 unmapped: 17997824 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b5b6a000/0x0/0x1bfc00000, data 0x3606aa2/0x37c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:57.115646+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152043520 unmapped: 17981440 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:58.115814+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2206390 data_alloc: 285212672 data_used: 2629632
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152043520 unmapped: 17981440 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:59.116003+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152043520 unmapped: 17981440 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:00.116176+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152059904 unmapped: 17965056 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:01.116441+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152092672 unmapped: 17932288 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b5b6c000/0x0/0x1bfc00000, data 0x3606c03/0x37c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:02.116711+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152092672 unmapped: 17932288 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:03.116866+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2206052 data_alloc: 285212672 data_used: 2629632
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152092672 unmapped: 17932288 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:04.117058+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152117248 unmapped: 17907712 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:05.117285+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.056751251s of 11.250020027s, submitted: 36
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152117248 unmapped: 17907712 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:06.117442+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152117248 unmapped: 17907712 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b5b6a000/0x0/0x1bfc00000, data 0x3606cfa/0x37c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:07.117622+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152117248 unmapped: 17907712 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:08.117782+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b5b69000/0x0/0x1bfc00000, data 0x3606c98/0x37c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2209316 data_alloc: 285212672 data_used: 2629632
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152117248 unmapped: 17907712 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:09.117959+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152125440 unmapped: 17899520 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:10.118284+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152125440 unmapped: 17899520 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:11.118492+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152125440 unmapped: 17899520 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:12.118686+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152125440 unmapped: 17899520 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:13.118862+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2208406 data_alloc: 285212672 data_used: 2629632
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152125440 unmapped: 17899520 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b5b68000/0x0/0x1bfc00000, data 0x3606d6b/0x37c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:14.119477+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152125440 unmapped: 17899520 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:15.119718+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.819792747s of 10.001510620s, submitted: 38
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152125440 unmapped: 17899520 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:16.119933+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152125440 unmapped: 17899520 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b5b6c000/0x0/0x1bfc00000, data 0x3606d3d/0x37c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:17.120132+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152133632 unmapped: 17891328 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:18.120312+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2212246 data_alloc: 285212672 data_used: 2629632
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152141824 unmapped: 17883136 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:19.120483+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152141824 unmapped: 17883136 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b576a000/0x0/0x1bfc00000, data 0x3606f0b/0x37c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:20.120612+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152141824 unmapped: 17883136 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:21.120896+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152141824 unmapped: 17883136 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:22.121097+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152141824 unmapped: 17883136 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:23.121292+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2209536 data_alloc: 285212672 data_used: 2629632
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152141824 unmapped: 17883136 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:24.121430+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152141824 unmapped: 17883136 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b576b000/0x0/0x1bfc00000, data 0x3606ffa/0x37c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:25.121634+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.846505165s of 10.007899284s, submitted: 31
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152150016 unmapped: 17874944 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 273 handle_osd_map epochs [273,274], i have 273, src has [1,274]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:26.121816+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152174592 unmapped: 17850368 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:27.122001+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152174592 unmapped: 17850368 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:28.122242+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b5767000/0x0/0x1bfc00000, data 0x3609562/0x37c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2215442 data_alloc: 285212672 data_used: 2641920
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152174592 unmapped: 17850368 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:29.122467+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b5766000/0x0/0x1bfc00000, data 0x36095fd/0x37c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152174592 unmapped: 17850368 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:30.123520+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152174592 unmapped: 17850368 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:31.123817+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152174592 unmapped: 17850368 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:32.123997+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152174592 unmapped: 17850368 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:33.124270+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 275 heartbeat osd_stat(store_statfs(0x1b5767000/0x0/0x1bfc00000, data 0x3609733/0x37c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2223250 data_alloc: 285212672 data_used: 2654208
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152190976 unmapped: 17833984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:34.124446+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152190976 unmapped: 17833984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:35.124620+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152190976 unmapped: 17833984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:36.124793+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 275 heartbeat osd_stat(store_statfs(0x1b5762000/0x0/0x1bfc00000, data 0x360bb17/0x37cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152190976 unmapped: 17833984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:37.124964+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.460308075s of 11.774886131s, submitted: 85
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152190976 unmapped: 17833984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:38.125134+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2223400 data_alloc: 285212672 data_used: 2654208
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152199168 unmapped: 17825792 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:39.125304+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152199168 unmapped: 17825792 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:40.125459+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 275 heartbeat osd_stat(store_statfs(0x1b575f000/0x0/0x1bfc00000, data 0x360bdaf/0x37cd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152199168 unmapped: 17825792 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:41.125647+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152256512 unmapped: 17768448 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:42.125804+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152256512 unmapped: 17768448 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:43.125958+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2228764 data_alloc: 285212672 data_used: 2666496
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152256512 unmapped: 17768448 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:44.126136+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152256512 unmapped: 17768448 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 276 heartbeat osd_stat(store_statfs(0x1b575d000/0x0/0x1bfc00000, data 0x360e2da/0x37cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:45.126323+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152281088 unmapped: 17743872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:46.126520+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152281088 unmapped: 17743872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:47.126714+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 276 heartbeat osd_stat(store_statfs(0x1b5760000/0x0/0x1bfc00000, data 0x360e2d7/0x37cd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 152281088 unmapped: 17743872 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.353332520s of 10.622464180s, submitted: 79
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 276 handle_osd_map epochs [277,277], i have 276, src has [1,277]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:48.126873+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2230354 data_alloc: 285212672 data_used: 2678784
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153337856 unmapped: 16687104 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:49.127062+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153346048 unmapped: 16678912 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:50.127300+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 277 heartbeat osd_stat(store_statfs(0x1b575b000/0x0/0x1bfc00000, data 0x361078b/0x37d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153346048 unmapped: 16678912 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:51.127516+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153354240 unmapped: 16670720 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 277 heartbeat osd_stat(store_statfs(0x1b575a000/0x0/0x1bfc00000, data 0x36108e6/0x37d2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:52.127708+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153354240 unmapped: 16670720 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:53.128062+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2231354 data_alloc: 285212672 data_used: 2678784
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153354240 unmapped: 16670720 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:54.128253+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153354240 unmapped: 16670720 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:55.128395+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153354240 unmapped: 16670720 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:56.128588+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153387008 unmapped: 16637952 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 278 heartbeat osd_stat(store_statfs(0x1b5757000/0x0/0x1bfc00000, data 0x3612f1c/0x37d6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:57.128819+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153387008 unmapped: 16637952 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.765386581s of 10.005261421s, submitted: 90
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:58.129028+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2237186 data_alloc: 285212672 data_used: 2691072
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153387008 unmapped: 16637952 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:59.129274+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153387008 unmapped: 16637952 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:00.129445+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 278 heartbeat osd_stat(store_statfs(0x1b5755000/0x0/0x1bfc00000, data 0x361311c/0x37d7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153370624 unmapped: 16654336 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:01.129742+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153378816 unmapped: 16646144 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:02.129881+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153378816 unmapped: 16646144 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 278 handle_osd_map epochs [278,279], i have 278, src has [1,279]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:03.130071+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2242660 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153395200 unmapped: 16629760 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:04.130255+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153403392 unmapped: 16621568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5750000/0x0/0x1bfc00000, data 0x3615664/0x37dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:05.130441+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153403392 unmapped: 16621568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:06.130618+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153403392 unmapped: 16621568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:07.130831+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153403392 unmapped: 16621568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.831203461s of 10.115341187s, submitted: 59
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:08.130969+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2244274 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153403392 unmapped: 16621568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:09.131145+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153403392 unmapped: 16621568 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:10.131323+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5750000/0x0/0x1bfc00000, data 0x36156c6/0x37dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153411584 unmapped: 16613376 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:11.131496+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153411584 unmapped: 16613376 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:12.131654+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153419776 unmapped: 16605184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:13.131892+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2241900 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153419776 unmapped: 16605184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:14.132069+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153419776 unmapped: 16605184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5752000/0x0/0x1bfc00000, data 0x361572e/0x37db000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:15.132270+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153419776 unmapped: 16605184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:16.132492+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153419776 unmapped: 16605184 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:17.132671+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153427968 unmapped: 16596992 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:18.132850+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.108919144s of 10.290252686s, submitted: 34
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2243972 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153427968 unmapped: 16596992 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:19.133063+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153427968 unmapped: 16596992 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:20.133239+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153444352 unmapped: 16580608 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5754000/0x0/0x1bfc00000, data 0x36157f8/0x37da000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:21.133430+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153452544 unmapped: 16572416 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:22.133785+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153452544 unmapped: 16572416 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5752000/0x0/0x1bfc00000, data 0x3615861/0x37db000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:23.133963+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2244890 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153452544 unmapped: 16572416 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:24.134162+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5751000/0x0/0x1bfc00000, data 0x36158d5/0x37dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153452544 unmapped: 16572416 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:25.134328+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153452544 unmapped: 16572416 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5751000/0x0/0x1bfc00000, data 0x3615904/0x37dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:26.134483+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153468928 unmapped: 16556032 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:27.134594+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153468928 unmapped: 16556032 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:28.134777+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2245810 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153477120 unmapped: 16547840 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:29.134937+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153477120 unmapped: 16547840 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:30.135123+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.834587097s of 12.008486748s, submitted: 28
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5754000/0x0/0x1bfc00000, data 0x361595a/0x37da000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 154525696 unmapped: 15499264 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:31.135366+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153485312 unmapped: 16539648 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:32.135533+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153485312 unmapped: 16539648 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:33.135694+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5750000/0x0/0x1bfc00000, data 0x3615b29/0x37dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2248398 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153485312 unmapped: 16539648 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:34.135871+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153665536 unmapped: 16359424 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:35.136030+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153772032 unmapped: 16252928 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:36.136252+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5705000/0x0/0x1bfc00000, data 0x365f137/0x3826000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 153796608 unmapped: 16228352 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:37.136449+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 154312704 unmapped: 15712256 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:38.136628+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2275064 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 154738688 unmapped: 15286272 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:39.136951+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 154738688 unmapped: 15286272 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:40.137347+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.852190018s of 10.176259041s, submitted: 81
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 154910720 unmapped: 15114240 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:41.137673+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b55de000/0x0/0x1bfc00000, data 0x3785dc9/0x394e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 155402240 unmapped: 14622720 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:42.137864+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 155402240 unmapped: 14622720 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:43.138038+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2269770 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 154976256 unmapped: 15048704 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:44.138181+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b553d000/0x0/0x1bfc00000, data 0x38260a6/0x39ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 156286976 unmapped: 13737984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:45.138389+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 156286976 unmapped: 13737984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:46.138560+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 156286976 unmapped: 13737984 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:47.138719+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5503000/0x0/0x1bfc00000, data 0x3863658/0x3a2a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 157581312 unmapped: 12443648 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:48.138896+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2293712 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 157581312 unmapped: 12443648 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:49.139188+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 158007296 unmapped: 12017664 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:50.139458+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.553035736s of 10.004553795s, submitted: 109
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 157777920 unmapped: 12247040 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:51.140440+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 159129600 unmapped: 10895360 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:52.140601+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 159178752 unmapped: 10846208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:53.140779+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5416000/0x0/0x1bfc00000, data 0x3951321/0x3b18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2293578 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 159178752 unmapped: 10846208 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:54.141017+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 159285248 unmapped: 10739712 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b423e000/0x0/0x1bfc00000, data 0x39886db/0x3b4f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:55.141235+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 159285248 unmapped: 10739712 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:56.141408+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 159285248 unmapped: 10739712 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:57.141609+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 160366592 unmapped: 9658368 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b41f1000/0x0/0x1bfc00000, data 0x39d7c91/0x3b9d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:58.141810+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2308726 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 160669696 unmapped: 9355264 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:59.142010+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 160677888 unmapped: 9347072 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:00.142155+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.539963722s of 10.002388000s, submitted: 106
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 160677888 unmapped: 9347072 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:01.142368+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 160686080 unmapped: 9338880 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:02.142567+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b413c000/0x0/0x1bfc00000, data 0x3a8c723/0x3c51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 160686080 unmapped: 9338880 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:03.142786+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2316380 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 160694272 unmapped: 9330688 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:04.142948+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 163184640 unmapped: 6840320 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:05.143135+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 163315712 unmapped: 6709248 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:06.143332+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2f3f000/0x0/0x1bfc00000, data 0x3ae77b6/0x3cad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2f20000/0x0/0x1bfc00000, data 0x3b0648a/0x3ccc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 163323904 unmapped: 6701056 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:07.143500+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 162709504 unmapped: 7315456 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:08.143649+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2323514 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 162709504 unmapped: 7315456 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:09.143839+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 162709504 unmapped: 7315456 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:10.144035+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.748250008s of 10.002716064s, submitted: 61
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2ee5000/0x0/0x1bfc00000, data 0x3b439ae/0x3d08000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 163102720 unmapped: 6922240 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:11.144272+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164175872 unmapped: 5849088 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:12.144459+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164175872 unmapped: 5849088 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2e82000/0x0/0x1bfc00000, data 0x3ba4b0f/0x3d6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:13.144636+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2331588 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164454400 unmapped: 5570560 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:14.144811+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2e83000/0x0/0x1bfc00000, data 0x3ba4aad/0x3d69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164323328 unmapped: 5701632 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:15.144961+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164323328 unmapped: 5701632 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:16.145111+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164552704 unmapped: 5472256 heap: 170024960 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2e3b000/0x0/0x1bfc00000, data 0x3bed5cf/0x3db2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:17.145303+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164585472 unmapped: 6488064 heap: 171073536 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:18.145450+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2340266 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164585472 unmapped: 6488064 heap: 171073536 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:19.145623+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2de8000/0x0/0x1bfc00000, data 0x3c3d335/0x3e03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164806656 unmapped: 6266880 heap: 171073536 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:20.145816+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.737001419s of 10.001988411s, submitted: 65
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2de8000/0x0/0x1bfc00000, data 0x3c3d335/0x3e03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,1])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164904960 unmapped: 6168576 heap: 171073536 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:21.146187+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 164904960 unmapped: 6168576 heap: 171073536 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:22.147301+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165109760 unmapped: 5963776 heap: 171073536 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:23.147514+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2342070 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165109760 unmapped: 5963776 heap: 171073536 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:24.147771+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165134336 unmapped: 6987776 heap: 172122112 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:25.147948+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2d4a000/0x0/0x1bfc00000, data 0x3cdb855/0x3ea2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165363712 unmapped: 6758400 heap: 172122112 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:26.148187+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165363712 unmapped: 6758400 heap: 172122112 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:27.148455+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165380096 unmapped: 6742016 heap: 172122112 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:28.148616+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2350180 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165560320 unmapped: 6561792 heap: 172122112 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:29.148798+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165560320 unmapped: 6561792 heap: 172122112 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:30.148993+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.765431404s of 10.002627373s, submitted: 57
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 166780928 unmapped: 5341184 heap: 172122112 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:31.149336+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2caf000/0x0/0x1bfc00000, data 0x3d79ea4/0x3f3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165568512 unmapped: 7602176 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:32.149557+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165576704 unmapped: 7593984 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:33.149699+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2352590 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2cb0000/0x0/0x1bfc00000, data 0x3d79e43/0x3f3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165576704 unmapped: 7593984 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:34.149840+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165871616 unmapped: 7299072 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:35.150048+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165871616 unmapped: 7299072 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:36.150298+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 165871616 unmapped: 7299072 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:37.150486+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 166068224 unmapped: 7102464 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:38.150629+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2364270 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 166068224 unmapped: 7102464 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:39.150812+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c14000/0x0/0x1bfc00000, data 0x3e127a0/0x3fd8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 166068224 unmapped: 7102464 heap: 173170688 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:40.151018+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.731727600s of 10.004139900s, submitted: 59
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167337984 unmapped: 6881280 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:41.151233+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c12000/0x0/0x1bfc00000, data 0x3e1603a/0x3fdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167337984 unmapped: 6881280 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:42.151366+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167337984 unmapped: 6881280 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:43.151554+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2358886 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167337984 unmapped: 6881280 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:44.151819+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c10000/0x0/0x1bfc00000, data 0x3e16130/0x3fdc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167362560 unmapped: 6856704 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:45.151972+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167362560 unmapped: 6856704 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:46.152177+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c10000/0x0/0x1bfc00000, data 0x3e16130/0x3fdc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167362560 unmapped: 6856704 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:47.152500+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167362560 unmapped: 6856704 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:48.152693+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c13000/0x0/0x1bfc00000, data 0x3e160cf/0x3fdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2359308 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167370752 unmapped: 6848512 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:49.152884+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167370752 unmapped: 6848512 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:50.153080+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c13000/0x0/0x1bfc00000, data 0x3e160cf/0x3fdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:51.153290+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167370752 unmapped: 6848512 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.378890991s of 11.480527878s, submitted: 18
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:52.153468+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:53.153788+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c12000/0x0/0x1bfc00000, data 0x3e16124/0x3fdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2359260 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:54.153976+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:55.154293+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:56.154474+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c12000/0x0/0x1bfc00000, data 0x3e160cf/0x3fdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:57.154670+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167354368 unmapped: 6864896 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:58.154853+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167354368 unmapped: 6864896 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c12000/0x0/0x1bfc00000, data 0x3e160cf/0x3fdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2359436 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:59.155056+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167354368 unmapped: 6864896 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:00.155492+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167354368 unmapped: 6864896 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:01.155773+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167354368 unmapped: 6864896 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:02.155924+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167354368 unmapped: 6864896 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.301274300s of 10.400232315s, submitted: 16
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c12000/0x0/0x1bfc00000, data 0x3e160a0/0x3fdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:03.156049+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167362560 unmapped: 6856704 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2358618 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:04.156299+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167362560 unmapped: 6856704 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:05.156514+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167362560 unmapped: 6856704 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c14000/0x0/0x1bfc00000, data 0x3e16108/0x3fda000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:06.156707+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167370752 unmapped: 6848512 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 ms_handle_reset con 0x55c53a715c00 session 0x55c53b191860
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53c00fc00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:07.156877+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167370752 unmapped: 6848512 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:08.157034+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2357944 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:09.157244+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c15000/0x0/0x1bfc00000, data 0x3e161a2/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:10.157418+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:11.157676+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:12.157905+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167378944 unmapped: 6840320 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c15000/0x0/0x1bfc00000, data 0x3e161a2/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:13.158145+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2357944 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:14.158280+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:15.158472+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:16.158673+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:17.158843+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:18.158987+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c15000/0x0/0x1bfc00000, data 0x3e161a2/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 ms_handle_reset con 0x55c5384cbc00 session 0x55c53b1901e0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53d8fa800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2357944 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:19.159238+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:20.159462+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167387136 unmapped: 6832128 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:21.159688+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 6823936 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:22.159874+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 6823936 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c15000/0x0/0x1bfc00000, data 0x3e161a2/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:23.160034+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 6823936 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2357944 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:24.160168+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 6823936 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:25.160323+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 6823936 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c15000/0x0/0x1bfc00000, data 0x3e161a2/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:26.160549+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 6823936 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:27.160759+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 6823936 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c15000/0x0/0x1bfc00000, data 0x3e161a2/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:28.160964+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 25.717948914s of 25.779832840s, submitted: 12
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 6823936 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:29.161127+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2359344 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 6815744 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:30.161323+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 6815744 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:31.161607+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 6815744 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:32.161808+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 6815744 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:33.161981+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 6815744 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c14000/0x0/0x1bfc00000, data 0x3e1623d/0x3fda000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:34.162139+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2359344 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 6815744 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:35.162297+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 6815744 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c14000/0x0/0x1bfc00000, data 0x3e1623d/0x3fda000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:36.162460+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 6815744 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:37.162645+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167411712 unmapped: 6807552 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:38.162803+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167411712 unmapped: 6807552 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:39.162993+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2358862 data_alloc: 285212672 data_used: 2703360
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167411712 unmapped: 6807552 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:40.163169+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167411712 unmapped: 6807552 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b2c15000/0x0/0x1bfc00000, data 0x3e1626c/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.284474373s of 12.324039459s, submitted: 8
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:41.163372+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 279 handle_osd_map epochs [280,280], i have 279, src has [1,280]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167428096 unmapped: 6791168 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:42.163545+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167428096 unmapped: 6791168 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:43.163714+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167428096 unmapped: 6791168 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.2 total, 600.0 interval
                                                          Cumulative writes: 22K writes, 86K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 22K writes, 7663 syncs, 2.96 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 12K writes, 47K keys, 12K commit groups, 1.0 writes per commit group, ingest: 38.78 MB, 0.06 MB/s
                                                          Interval WAL: 12K writes, 4934 syncs, 2.55 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:44.163862+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2363048 data_alloc: 285212672 data_used: 2715648
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167428096 unmapped: 6791168 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 280 heartbeat osd_stat(store_statfs(0x1b2c10000/0x0/0x1bfc00000, data 0x3e188d2/0x3fdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:45.164067+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167436288 unmapped: 6782976 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:46.164288+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167436288 unmapped: 6782976 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:47.164448+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167436288 unmapped: 6782976 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 280 handle_osd_map epochs [281,281], i have 280, src has [1,281]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:48.164611+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167444480 unmapped: 6774784 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:49.164809+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2365682 data_alloc: 285212672 data_used: 2715648
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167444480 unmapped: 6774784 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b2c0c000/0x0/0x1bfc00000, data 0x3e1aceb/0x3fe1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:50.164994+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167444480 unmapped: 6774784 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:51.165231+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167444480 unmapped: 6774784 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:52.165395+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167444480 unmapped: 6774784 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:53.165594+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:54.165759+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2365682 data_alloc: 285212672 data_used: 2715648
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:55.165904+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b2c0c000/0x0/0x1bfc00000, data 0x3e1aceb/0x3fe1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:56.166048+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:57.166264+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:58.166447+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:59.166663+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b2c0c000/0x0/0x1bfc00000, data 0x3e1aceb/0x3fe1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2365682 data_alloc: 285212672 data_used: 2715648
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:00.166836+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b2c0c000/0x0/0x1bfc00000, data 0x3e1aceb/0x3fe1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:01.167026+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:02.167178+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:03.167360+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b2c0c000/0x0/0x1bfc00000, data 0x3e1aceb/0x3fe1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:04.167516+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2365682 data_alloc: 285212672 data_used: 2715648
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:05.167753+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:06.167947+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b2c0c000/0x0/0x1bfc00000, data 0x3e1aceb/0x3fe1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:07.168055+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:08.168178+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b2c0c000/0x0/0x1bfc00000, data 0x3e1aceb/0x3fe1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 6766592 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:09.168410+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2365682 data_alloc: 285212672 data_used: 2715648
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b2c0c000/0x0/0x1bfc00000, data 0x3e1aceb/0x3fe1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:10.168612+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:11.169177+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:12.169569+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167460864 unmapped: 6758400 heap: 174219264 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 32.483470917s of 32.654731750s, submitted: 66
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c53a99ec00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:13.169709+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 175874048 unmapped: 6742016 heap: 182616064 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:14.169852+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2467534 data_alloc: 285212672 data_used: 2715648
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167936000 unmapped: 17825792 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:15.170002+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 281 handle_osd_map epochs [282,282], i have 281, src has [1,282]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 282 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 282 ms_handle_reset con 0x55c53a99ec00 session 0x55c539372d20
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 282 heartbeat osd_stat(store_statfs(0x1b1f9b000/0x0/0x1bfc00000, data 0x4a8ad1e/0x4c53000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167952384 unmapped: 17809408 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: handle_auth_request added challenge on 0x55c539288800
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:16.170153+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _renew_subs
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 282 handle_osd_map epochs [283,283], i have 282, src has [1,283]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 282 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167583744 unmapped: 18178048 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 283 ms_handle_reset con 0x55c539288800 session 0x55c538916b40
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:17.170325+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167591936 unmapped: 18169856 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:18.170453+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167591936 unmapped: 18169856 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:19.170646+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2378361 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167591936 unmapped: 18169856 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 283 heartbeat osd_stat(store_statfs(0x1b2c02000/0x0/0x1bfc00000, data 0x3e1f7a5/0x3fe9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:20.170870+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167591936 unmapped: 18169856 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:21.171052+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167591936 unmapped: 18169856 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:22.171186+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167591936 unmapped: 18169856 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:23.171384+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167591936 unmapped: 18169856 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.198460579s of 10.569525719s, submitted: 80
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:24.171591+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2380387 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167624704 unmapped: 18137088 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:25.171785+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167632896 unmapped: 18128896 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c00000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:26.171957+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c00000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167632896 unmapped: 18128896 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:27.172114+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167632896 unmapped: 18128896 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:28.172358+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167632896 unmapped: 18128896 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:29.172505+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2380387 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c00000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167632896 unmapped: 18128896 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:30.172661+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167632896 unmapped: 18128896 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:31.172863+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167632896 unmapped: 18128896 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:32.173021+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c00000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167632896 unmapped: 18128896 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:33.173243+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c00000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167641088 unmapped: 18120704 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:34.173419+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2380387 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167641088 unmapped: 18120704 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:35.173586+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167641088 unmapped: 18120704 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:36.173770+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167641088 unmapped: 18120704 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:37.173952+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167641088 unmapped: 18120704 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c00000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:38.174112+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167641088 unmapped: 18120704 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:39.174283+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2380387 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167641088 unmapped: 18120704 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:40.174400+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167641088 unmapped: 18120704 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c00000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:41.174594+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 167649280 unmapped: 18112512 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 18.193515778s of 18.214683533s, submitted: 16
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 ms_handle_reset con 0x55c53a99fc00 session 0x55c538917e00
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:42.174751+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170139648 unmapped: 15622144 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:43.174924+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170139648 unmapped: 15622144 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Got map version 55
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:44.175114+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:45.175272+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:46.175420+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:47.175616+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:48.175801+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:49.175967+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:50.176120+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:51.176348+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:52.176537+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:53.176774+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:54.176963+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:55.177118+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:56.177327+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170196992 unmapped: 15564800 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:57.177488+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170196992 unmapped: 15564800 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:58.177674+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170205184 unmapped: 15556608 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:59.177934+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170205184 unmapped: 15556608 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:00.178127+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170205184 unmapped: 15556608 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:01.178394+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170205184 unmapped: 15556608 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:02.178603+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170205184 unmapped: 15556608 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:03.178828+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170205184 unmapped: 15556608 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:04.179003+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170221568 unmapped: 15540224 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:05.180116+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170229760 unmapped: 15532032 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:06.180577+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170229760 unmapped: 15532032 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:07.181119+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170229760 unmapped: 15532032 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:08.181335+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170229760 unmapped: 15532032 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:09.181938+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170229760 unmapped: 15532032 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:10.182258+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170229760 unmapped: 15532032 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:11.182586+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170229760 unmapped: 15532032 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:12.182728+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170246144 unmapped: 15515648 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:13.182946+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170254336 unmapped: 15507456 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:14.183233+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170254336 unmapped: 15507456 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:15.183513+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170254336 unmapped: 15507456 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:16.183723+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170254336 unmapped: 15507456 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:17.184585+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170254336 unmapped: 15507456 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:18.184945+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170254336 unmapped: 15507456 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:19.185150+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170254336 unmapped: 15507456 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:20.185396+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170262528 unmapped: 15499264 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:21.185621+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170262528 unmapped: 15499264 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:22.185878+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170262528 unmapped: 15499264 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:23.186053+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170262528 unmapped: 15499264 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:24.186254+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170262528 unmapped: 15499264 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:25.186388+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170262528 unmapped: 15499264 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:26.186546+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170262528 unmapped: 15499264 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:27.186676+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170262528 unmapped: 15499264 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:28.186854+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170270720 unmapped: 15491072 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:29.187044+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170270720 unmapped: 15491072 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:30.187244+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170270720 unmapped: 15491072 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:31.191364+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170270720 unmapped: 15491072 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:32.191530+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170270720 unmapped: 15491072 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:33.191684+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170270720 unmapped: 15491072 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:34.191906+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170270720 unmapped: 15491072 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:35.192063+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170270720 unmapped: 15491072 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:36.192399+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170278912 unmapped: 15482880 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:37.192624+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170278912 unmapped: 15482880 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:38.192779+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170278912 unmapped: 15482880 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:39.192949+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170278912 unmapped: 15482880 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:40.193139+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170278912 unmapped: 15482880 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:41.193334+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170278912 unmapped: 15482880 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:42.193494+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170278912 unmapped: 15482880 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:43.193634+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170278912 unmapped: 15482880 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:44.193804+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170287104 unmapped: 15474688 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:45.193969+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170287104 unmapped: 15474688 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:46.194246+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170287104 unmapped: 15474688 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:47.194423+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170287104 unmapped: 15474688 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:48.194571+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170295296 unmapped: 15466496 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:49.194749+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170295296 unmapped: 15466496 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:50.194956+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170295296 unmapped: 15466496 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:51.195158+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170295296 unmapped: 15466496 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:52.195397+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170303488 unmapped: 15458304 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:53.195591+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170303488 unmapped: 15458304 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:54.195791+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170303488 unmapped: 15458304 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:55.195970+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170303488 unmapped: 15458304 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:56.196127+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170303488 unmapped: 15458304 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:57.196340+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170303488 unmapped: 15458304 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:58.196573+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170303488 unmapped: 15458304 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:59.196771+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170303488 unmapped: 15458304 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:00.196912+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170311680 unmapped: 15450112 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:01.197171+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170311680 unmapped: 15450112 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:02.197278+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170311680 unmapped: 15450112 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:03.197425+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170311680 unmapped: 15450112 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:04.197683+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170311680 unmapped: 15450112 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:05.197857+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170311680 unmapped: 15450112 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:06.198099+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170311680 unmapped: 15450112 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:07.198272+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170311680 unmapped: 15450112 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:08.198468+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170319872 unmapped: 15441920 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:09.198636+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170319872 unmapped: 15441920 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:10.198786+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170319872 unmapped: 15441920 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:11.199026+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170319872 unmapped: 15441920 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:12.199225+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170319872 unmapped: 15441920 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:13.199429+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170319872 unmapped: 15441920 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:14.199735+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170319872 unmapped: 15441920 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:15.205677+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170319872 unmapped: 15441920 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:16.205851+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170328064 unmapped: 15433728 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:17.205990+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170328064 unmapped: 15433728 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:18.206135+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170328064 unmapped: 15433728 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:19.206284+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170328064 unmapped: 15433728 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:20.206430+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170328064 unmapped: 15433728 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:21.206629+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170328064 unmapped: 15433728 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:22.206776+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170328064 unmapped: 15433728 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:23.206905+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170328064 unmapped: 15433728 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:24.207032+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: bluestore.MempoolThread(0x55c536fd3b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2379507 data_alloc: 285212672 data_used: 2727936
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170344448 unmapped: 15417344 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:25.214998+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b2c01000/0x0/0x1bfc00000, data 0x3e21bbe/0x3fed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170475520 unmapped: 15286272 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:26.215213+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'config diff' '{prefix=config diff}'
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'config show' '{prefix=config show}'
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'counter dump' '{prefix=counter dump}'
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'counter schema' '{prefix=counter schema}'
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170164224 unmapped: 15597568 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:27.215523+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: prioritycache tune_memory target: 5709082009 mapped: 170180608 unmapped: 15581184 heap: 185761792 old mem: 4047413338 new mem: 4047413338
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: tick
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_tickets
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:28.215739+0000)
Dec 06 10:32:59 np0005548788.localdomain ceph-osd[32690]: do_command 'log dump' '{prefix=log dump}'
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1645509840' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.59269 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.69476 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.49875 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.59275 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.69488 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.49890 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.59293 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3924164660' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2460720482' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4179676265' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4060345088' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/842483459' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:59 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1645509840' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1470188650' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain crontab[335062]: (root) LIST (root)
Dec 06 10:33:00 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:00.218 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3825277840' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.69503 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.49905 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.59308 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.69521 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.49914 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: pgmap v831: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.59329 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.69536 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3258516296' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.59350 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.49932 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/330610004' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1470188650' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4104655568' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2196284679' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3825277840' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1440614881' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2183890531' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 06 10:33:00 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2272977300' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 06 10:33:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 06 10:33:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2736339179' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:01 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 06 10:33:01 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1780513470' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.004 281009 DEBUG oslo_service.periodic_task [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.298 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.299 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.299 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.299 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Auditing locally available compute resources for np0005548788.localdomain (node: np0005548788.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.300 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.69551 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.69566 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.59356 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.69575 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.59377 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.49959 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2272977300' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/305260225' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/887149509' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2736339179' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3153028272' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3333518524' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/545032456' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3265217251' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.805 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.951 281009 WARNING nova.virt.libvirt.driver [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.952 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Hypervisor/Node resource view: name=np0005548788.localdomain free_ram=11143MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.953 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:33:02 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:02.953 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 06 10:33:02 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/471931038' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.009 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.011 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Final resource view: name=np0005548788.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.024 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2266612871' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.154 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.
Dec 06 10:33:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.
Dec 06 10:33:03 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.
Dec 06 10:33:03 np0005548788.localdomain systemd[1]: tmp-crun.NyUxNk.mount: Deactivated successfully.
Dec 06 10:33:03 np0005548788.localdomain podman[335514]: 2025-12-06 10:33:03.259132004 +0000 UTC m=+0.085264431 container health_status b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:33:03 np0005548788.localdomain podman[335514]: 2025-12-06 10:33:03.298603991 +0000 UTC m=+0.124736448 container exec_died b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:33:03 np0005548788.localdomain systemd[1]: b76843c85ec79f6d680dda4fe1ded7091ebe7ea4172d2654451c922bca1e9426.service: Deactivated successfully.
Dec 06 10:33:03 np0005548788.localdomain podman[335515]: 2025-12-06 10:33:03.336292372 +0000 UTC m=+0.155143934 container health_status b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:33:03 np0005548788.localdomain podman[335515]: 2025-12-06 10:33:03.347510568 +0000 UTC m=+0.166362150 container exec_died b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, container_name=openstack_network_exporter, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.69590 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.59407 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.49983 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: pgmap v832: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.69614 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1780513470' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2229859225' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2327447688' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2107682123' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3153028272' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3333518524' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2479036' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/545032456' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/466198046' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3265217251' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1245768976' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3564364230' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/471931038' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/329508633' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2266612871' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1424714830' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1424359202' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3066090828' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain systemd[1]: b830c0159c7f576c0a1ac96aaa63a8f39f84661ed9e18f2496d017ecadf813c3.service: Deactivated successfully.
Dec 06 10:33:03 np0005548788.localdomain podman[335513]: 2025-12-06 10:33:03.439737172 +0000 UTC m=+0.270823192 container health_status 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:33:03 np0005548788.localdomain podman[335513]: 2025-12-06 10:33:03.453380922 +0000 UTC m=+0.284466942 container exec_died 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/294876320' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain systemd[1]: 2c8648ac72d40e80dbad911b75197b729f74ec45995146a8d3a7438b369de9ab.service: Deactivated successfully.
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.473 281009 DEBUG oslo_concurrency.processutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.477 281009 DEBUG nova.compute.provider_tree [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed in ProviderTree for provider: 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.507 281009 DEBUG nova.scheduler.client.report [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Inventory has not changed for provider 7413251f-98bc-4150-b1b6-b77ff1bcb5f1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.509 281009 DEBUG nova.compute.resource_tracker [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Compute_service record updated for np0005548788.localdomain:np0005548788.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:33:03 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:03.510 281009 DEBUG oslo_concurrency.lockutils [None req-902bbf6e-fa3c-432e-90e3-6ef49aa50c22 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1935362394' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:31.766487+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94994432 unmapped: 3538944 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:32.766675+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 ms_handle_refused con 0x5648d7a9c800 session 0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94994432 unmapped: 3538944 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:33.766875+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94994432 unmapped: 3538944 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:34.767041+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94994432 unmapped: 3538944 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1086941 data_alloc: 301989888 data_used: 10199040
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:35.767300+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94994432 unmapped: 3538944 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 37
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.103:6800/180363885,v1:172.18.0.103:6801/180363885]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:36.767479+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:37.767757+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:38.768027+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5621 writes, 24K keys, 5621 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5621 writes, 913 syncs, 6.16 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 203 writes, 505 keys, 203 commit groups, 1.0 writes per commit group, ingest: 0.50 MB, 0.00 MB/s
                                                          Interval WAL: 203 writes, 91 syncs, 2.23 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:39.768302+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1086941 data_alloc: 301989888 data_used: 10199040
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:40.768463+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:41.768675+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:42.768911+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:43.769128+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:44.769385+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1086941 data_alloc: 301989888 data_used: 10199040
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:45.769604+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 ms_handle_refused con 0x5648d7a9c800 session 0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:46.769788+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:47.769976+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:48.770147+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:49.770339+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1086941 data_alloc: 301989888 data_used: 10199040
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:50.770511+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:51.770705+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:52.770903+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:53.771080+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:54.771232+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1086941 data_alloc: 301989888 data_used: 10199040
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:55.771494+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:56.771687+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:57.771881+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:58.772045+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:59.772266+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1086941 data_alloc: 301989888 data_used: 10199040
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:00.772403+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 ms_handle_refused con 0x5648d7a9c800 session 0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:01.772602+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:02.772797+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:03.773012+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:04.773237+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b80f2000/0x0/0x1bfc00000, data 0x3911559/0x399c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1086941 data_alloc: 301989888 data_used: 10199040
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:05.773475+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:06.773634+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94797824 unmapped: 3735552 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:07.773803+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 48.412761688s of 48.484767914s, submitted: 17
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 38
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now 
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc reconnect Terminating session with v2:172.18.0.103:6800/180363885
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc reconnect No active mgr available yet
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 ms_handle_reset con 0x5648d43a5800 session 0x5648d71b70e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 94994432 unmapped: 3538944 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:08.773918+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 39
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2148019987,v1:172.18.0.106:6811/2148019987]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/2148019987,v1:172.18.0.106:6811/2148019987]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: get_auth_request con 0x5648d7a87800 auth_method 0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_configure stats_period=5
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ed000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95223808 unmapped: 3309568 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:09.774036+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95240192 unmapped: 3293184 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:10.774267+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2148019987,v1:172.18.0.106:6811/2148019987]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95248384 unmapped: 3284992 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:11.774489+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95248384 unmapped: 3284992 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:12.774655+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95248384 unmapped: 3284992 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:13.774819+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 41
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2148019987,v1:172.18.0.106:6811/2148019987]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:14.774973+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:15.775144+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:16.775315+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:17.775455+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:18.775535+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:19.775681+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:20.775883+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:21.776096+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:22.776289+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:23.776495+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:24.776670+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:25.776891+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:26.777081+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:27.777247+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:28.777470+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:29.777660+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:30.777853+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:31.778037+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:32.778287+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:33.778565+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:34.778748+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:35.778983+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:36.779170+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:37.779358+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:38.779539+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:39.779705+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:40.779863+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:41.780006+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:42.780143+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:43.780324+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:44.780522+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:45.780752+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:46.780901+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:47.781078+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:48.781251+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:49.781438+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:50.781610+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:51.781781+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:52.781978+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:53.782159+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:54.782332+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:55.782525+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:56.782716+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:57.782866+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:58.783053+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:59.783259+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:00.783448+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:01.783627+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:02.783791+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:03.783957+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:04.784129+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:05.784336+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:06.784513+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:07.784693+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b80ee000/0x0/0x1bfc00000, data 0x3913d9e/0x39a0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:08.784858+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:09.785090+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:10.785268+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95010816 unmapped: 3522560 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1090261 data_alloc: 301989888 data_used: 10211328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 42
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now 
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/2148019987
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc reconnect No active mgr available yet
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 63.747966766s of 63.849689484s, submitted: 18
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:11.785386+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 ms_handle_reset con 0x5648d6c4e400 session 0x5648d58912c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 3506176 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7961800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:12.785516+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 43
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: get_auth_request con 0x5648d7916000 auth_method 0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95182848 unmapped: 3350528 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_configure stats_period=5
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:13.785666+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95199232 unmapped: 3334144 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:14.785806+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95232000 unmapped: 3301376 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 44
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:15.785955+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95232000 unmapped: 3301376 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 45
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:16.786116+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:17.786254+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:18.786418+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:19.786588+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:20.786733+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:21.786868+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:22.787053+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:23.788075+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:24.788241+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:25.788440+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:26.788559+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:27.788725+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:28.788927+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:29.789125+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:30.789310+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:31.789545+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:32.789771+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:33.789995+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:34.790236+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:35.790479+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:36.790663+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:37.790836+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:38.791008+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:39.791242+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:40.791458+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:41.791671+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:42.791866+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:43.792077+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:44.792260+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:45.792546+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:46.792807+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:47.793051+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:48.793240+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:49.793429+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:50.793591+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:51.793758+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:52.793923+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:53.794107+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:54.794309+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:55.794598+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:56.794803+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:57.794977+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:58.795141+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:59.795326+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:00.795465+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:01.795658+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:02.795904+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:03.796128+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:04.796315+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:05.796518+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:06.796673+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:07.796821+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:08.796981+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:09.797181+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:10.797316+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:11.797493+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:12.797677+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:13.797881+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:14.798137+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:15.798455+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:16.798628+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:17.798820+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:18.799027+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:19.799548+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:20.799709+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:21.799929+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:22.800277+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:23.800675+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:24.800971+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:25.801944+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:26.802270+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:27.802450+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:28.802635+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:29.802827+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:30.803032+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:31.803251+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:32.803462+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:33.803678+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:34.803929+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:35.804108+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:36.804279+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:37.804520+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:38.804783+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:39.805097+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:40.805349+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 95436800 unmapped: 3096576 heap: 98533376 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093645 data_alloc: 285212672 data_used: 10223616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 89.555000305s of 89.621978760s, submitted: 18
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b8000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b80e9000/0x0/0x1bfc00000, data 0x391673b/0x39a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:41.805546+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 104718336 unmapped: 15851520 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b70e7000/0x0/0x1bfc00000, data 0x4916888/0x49a7000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:42.805718+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 46
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96575488 unmapped: 23994368 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 96 ms_handle_reset con 0x5648d85b8000 session 0x5648d6a15e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:43.805944+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96600064 unmapped: 23969792 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:44.806109+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 96 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 97 ms_handle_reset con 0x5648d43a5800 session 0x5648d6cb5860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:45.806287+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1252182 data_alloc: 285212672 data_used: 10248192
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:46.806456+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:47.806603+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6c6b000/0x0/0x1bfc00000, data 0x4d8b357/0x4e22000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:48.806849+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:49.806985+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:50.807130+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1252182 data_alloc: 285212672 data_used: 10248192
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:51.807332+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6c6b000/0x0/0x1bfc00000, data 0x4d8b357/0x4e22000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:52.807524+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:53.807692+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:54.807883+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6c6b000/0x0/0x1bfc00000, data 0x4d8b357/0x4e22000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:55.808107+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1252182 data_alloc: 285212672 data_used: 10248192
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:56.808316+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:57.808490+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6c6b000/0x0/0x1bfc00000, data 0x4d8b357/0x4e22000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:58.808655+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:59.808846+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:00.808999+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1252182 data_alloc: 285212672 data_used: 10248192
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:01.809249+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:02.809398+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:03.809622+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6c6b000/0x0/0x1bfc00000, data 0x4d8b357/0x4e22000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:04.809827+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:05.810168+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1252182 data_alloc: 285212672 data_used: 10248192
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:06.810421+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:07.810752+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:08.810986+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b6c6b000/0x0/0x1bfc00000, data 0x4d8b357/0x4e22000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96649216 unmapped: 23920640 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.609174728s of 27.921756744s, submitted: 46
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:09.811169+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 98 ms_handle_reset con 0x5648d6c4e400 session 0x5648d6cb5680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96657408 unmapped: 23912448 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:10.811368+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96673792 unmapped: 23896064 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1260309 data_alloc: 285212672 data_used: 10260480
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:11.811541+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a87800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96690176 unmapped: 23879680 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:12.811752+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96690176 unmapped: 23879680 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 99 ms_handle_reset con 0x5648d7a87800 session 0x5648d6cb50e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:13.811945+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96722944 unmapped: 23846912 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:14.812106+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 99 heartbeat osd_stat(store_statfs(0x1b6c63000/0x0/0x1bfc00000, data 0x4d8fe11/0x4e2a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96722944 unmapped: 23846912 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:15.812397+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96722944 unmapped: 23846912 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1261630 data_alloc: 285212672 data_used: 10272768
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:16.812606+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96722944 unmapped: 23846912 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:17.812798+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 99 heartbeat osd_stat(store_statfs(0x1b6c63000/0x0/0x1bfc00000, data 0x4d8fe11/0x4e2a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96722944 unmapped: 23846912 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:18.812972+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96722944 unmapped: 23846912 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:19.813266+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96722944 unmapped: 23846912 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.034015656s of 11.215066910s, submitted: 50
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:20.813462+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 100 ms_handle_reset con 0x5648d7a9c800 session 0x5648d71b6780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96780288 unmapped: 23789568 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1265656 data_alloc: 285212672 data_used: 10285056
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:21.813642+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96780288 unmapped: 23789568 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 101 heartbeat osd_stat(store_statfs(0x1b6c5a000/0x0/0x1bfc00000, data 0x4d94779/0x4e32000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:22.813836+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96813056 unmapped: 23756800 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 101 heartbeat osd_stat(store_statfs(0x1b6c5c000/0x0/0x1bfc00000, data 0x4d94779/0x4e32000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:23.814008+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96813056 unmapped: 23756800 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:24.814153+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96813056 unmapped: 23756800 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 101 heartbeat osd_stat(store_statfs(0x1b6c5c000/0x0/0x1bfc00000, data 0x4d94779/0x4e32000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:25.814352+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96813056 unmapped: 23756800 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1267778 data_alloc: 285212672 data_used: 10285056
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:26.814487+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b9000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96813056 unmapped: 23756800 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:27.814679+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 101 handle_osd_map epochs [101,102], i have 101, src has [1,102]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 102 ms_handle_reset con 0x5648d85b9000 session 0x5648d71b6960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:28.814817+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 102 heartbeat osd_stat(store_statfs(0x1b6c57000/0x0/0x1bfc00000, data 0x4d96d00/0x4e36000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:29.815003+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:30.815138+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 102 heartbeat osd_stat(store_statfs(0x1b6c57000/0x0/0x1bfc00000, data 0x4d96d00/0x4e36000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1271980 data_alloc: 285212672 data_used: 10297344
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:31.815261+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:32.815536+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 102 heartbeat osd_stat(store_statfs(0x1b6c57000/0x0/0x1bfc00000, data 0x4d96d00/0x4e36000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:33.815684+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:34.816073+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:35.816463+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 96829440 unmapped: 23740416 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1271980 data_alloc: 285212672 data_used: 10297344
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:36.816662+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.505769730s of 16.644773483s, submitted: 44
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97894400 unmapped: 22675456 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:37.816839+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b6c54000/0x0/0x1bfc00000, data 0x4d99119/0x4e3a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97935360 unmapped: 22634496 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:38.817024+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b6c54000/0x0/0x1bfc00000, data 0x4d99119/0x4e3a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97935360 unmapped: 22634496 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:39.817281+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97935360 unmapped: 22634496 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:40.817478+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97935360 unmapped: 22634496 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1274102 data_alloc: 285212672 data_used: 10297344
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:41.817692+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b6c54000/0x0/0x1bfc00000, data 0x4d99119/0x4e3a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97935360 unmapped: 22634496 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b6c54000/0x0/0x1bfc00000, data 0x4d99119/0x4e3a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:42.817878+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97935360 unmapped: 22634496 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:43.818115+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97935360 unmapped: 22634496 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:44.818251+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 97935360 unmapped: 22634496 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a95860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:45.818489+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d6c4e400 session 0x5648d6c2ab40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 105250816 unmapped: 15319040 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1294582 data_alloc: 301989888 data_used: 18685952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:46.818712+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a87800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.782546043s of 10.040440559s, submitted: 52
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d7a87800 session 0x5648d6a7e5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 106766336 unmapped: 13803520 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:47.818866+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 106766336 unmapped: 13803520 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b5c01000/0x0/0x1bfc00000, data 0x5deb129/0x5e8d000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:48.819025+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 106766336 unmapped: 13803520 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:49.819169+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 106930176 unmapped: 13639680 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:50.819374+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 107618304 unmapped: 12951552 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b5bdc000/0x0/0x1bfc00000, data 0x5e0f139/0x5eb2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1453229 data_alloc: 301989888 data_used: 21831680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:51.819523+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b5bdc000/0x0/0x1bfc00000, data 0x5e0f139/0x5eb2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 109248512 unmapped: 11321344 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:52.819689+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 110608384 unmapped: 9961472 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:53.819804+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 110608384 unmapped: 9961472 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:54.819927+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 110608384 unmapped: 9961472 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:55.820184+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 110608384 unmapped: 9961472 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1464589 data_alloc: 301989888 data_used: 23445504
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:56.820384+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b5bdc000/0x0/0x1bfc00000, data 0x5e0f139/0x5eb2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 110608384 unmapped: 9961472 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:57.820554+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 110608384 unmapped: 9961472 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:58.820707+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 110608384 unmapped: 9961472 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b5bdc000/0x0/0x1bfc00000, data 0x5e0f139/0x5eb2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:59.820855+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 110608384 unmapped: 9961472 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b5bdc000/0x0/0x1bfc00000, data 0x5e0f139/0x5eb2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:00.821007+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.841262817s of 13.888993263s, submitted: 11
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 117956608 unmapped: 2613248 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1555597 data_alloc: 301989888 data_used: 23613440
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:01.821157+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 116817920 unmapped: 3751936 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:02.821295+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 116899840 unmapped: 3670016 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:03.821437+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 117194752 unmapped: 3375104 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:04.821598+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d6c4e000 session 0x5648d6ab5860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d6c4e800 session 0x5648d6ab43c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d43a5800 session 0x5648d6ab5680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 117620736 unmapped: 2949120 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:05.821807+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d6c4e000 session 0x5648d6f1ad20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 116629504 unmapped: 3940352 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d7a9c800 session 0x5648d6ab4f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b4f42000/0x0/0x1bfc00000, data 0x6aa7139/0x6b4a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1571583 data_alloc: 301989888 data_used: 24133632
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:06.821982+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d6c4e400 session 0x5648d6c8ba40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d6c4e800 session 0x5648d6f5a3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 116310016 unmapped: 4259840 heap: 120569856 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:07.822147+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a874a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d6c4e000 session 0x5648d6c8b860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d7a9c800 session 0x5648d6c8b680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a87800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d7a87800 session 0x5648d6c8a1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9d800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 ms_handle_reset con 0x5648d7a9d800 session 0x5648d6c8b4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 116121600 unmapped: 12918784 heap: 129040384 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:08.822345+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 104 ms_handle_reset con 0x5648d43a5800 session 0x5648d6c8ad20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 120815616 unmapped: 8224768 heap: 129040384 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:09.822502+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 104 ms_handle_reset con 0x5648d6c4e000 session 0x5648d6c8ab40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a87800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 123060224 unmapped: 15654912 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:10.822664+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 104 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.078203201s of 10.001610756s, submitted: 203
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 105 ms_handle_reset con 0x5648d7a9c800 session 0x5648d5078780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 105 ms_handle_reset con 0x5648d7a87800 session 0x5648d6f1b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9d800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 123994112 unmapped: 14721024 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1909140 data_alloc: 301989888 data_used: 26017792
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:11.822842+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 106 ms_handle_reset con 0x5648d7a9d800 session 0x5648d69ccd20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 106 ms_handle_reset con 0x5648d43a5800 session 0x5648d766c780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 124051456 unmapped: 14663680 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 106 heartbeat osd_stat(store_statfs(0x1b2984000/0x0/0x1bfc00000, data 0x905c1d4/0x9108000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 106 ms_handle_reset con 0x5648d6c4e000 session 0x5648d6be3a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a87800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 106 ms_handle_reset con 0x5648d7a87800 session 0x5648d6c283c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:12.822986+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 124051456 unmapped: 14663680 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:13.823149+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 106 ms_handle_reset con 0x5648d7a9c800 session 0x5648d6f1bc20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 116211712 unmapped: 22503424 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:14.823291+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 116211712 unmapped: 22503424 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:15.823516+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 116244480 unmapped: 22470656 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1644243 data_alloc: 301989888 data_used: 19951616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:16.823684+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b4691000/0x0/0x1bfc00000, data 0x73505cd/0x73fc000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 117071872 unmapped: 21643264 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:17.823840+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 117137408 unmapped: 21577728 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9d800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 107 ms_handle_reset con 0x5648d7a9d800 session 0x5648d6a94b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:18.823973+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 124051456 unmapped: 14663680 heap: 138715136 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:19.824097+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b1f42000/0x0/0x1bfc00000, data 0x9a9e63f/0x9b4c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 107 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a95680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 127131648 unmapped: 23175168 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:20.824255+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 127131648 unmapped: 23175168 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2012423 data_alloc: 301989888 data_used: 25030656
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:21.824425+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.139199257s of 10.833148956s, submitted: 168
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 107 ms_handle_reset con 0x5648d6c4e000 session 0x5648d6ab41e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a87800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 127205376 unmapped: 23101440 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:22.824609+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 124887040 unmapped: 25419776 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 108 ms_handle_reset con 0x5648d7a9c800 session 0x5648d6cb8d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:23.824813+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 108 heartbeat osd_stat(store_statfs(0x1b3d0a000/0x0/0x1bfc00000, data 0x7cd3be9/0x7d84000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 118472704 unmapped: 31834112 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:24.824963+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 118472704 unmapped: 31834112 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:25.825232+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 118472704 unmapped: 31834112 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:26.831825+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1787747 data_alloc: 301989888 data_used: 22970368
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 120545280 unmapped: 29761536 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:27.832037+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 124870656 unmapped: 25436160 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:28.832170+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 108 ms_handle_reset con 0x5648d6d16000 session 0x5648d6a154a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 108 ms_handle_reset con 0x5648d6d16400 session 0x5648d6cb4000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 108 heartbeat osd_stat(store_statfs(0x1b3260000/0x0/0x1bfc00000, data 0x8777be9/0x8828000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125009920 unmapped: 25296896 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:29.832362+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125747200 unmapped: 24559616 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 108 heartbeat osd_stat(store_statfs(0x1b322c000/0x0/0x1bfc00000, data 0x87b1be9/0x8862000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:30.832562+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125747200 unmapped: 24559616 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:31.832713+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1895755 data_alloc: 301989888 data_used: 23875584
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.454057693s of 10.169719696s, submitted: 217
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125796352 unmapped: 24510464 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:32.832851+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125886464 unmapped: 24420352 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:33.832987+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125190144 unmapped: 25116672 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:34.833126+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125403136 unmapped: 24903680 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:35.833299+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 109 heartbeat osd_stat(store_statfs(0x1b31d7000/0x0/0x1bfc00000, data 0x8805002/0x88b7000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 109 heartbeat osd_stat(store_statfs(0x1b31b3000/0x0/0x1bfc00000, data 0x8829002/0x88db000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125435904 unmapped: 24870912 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:36.833451+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1903001 data_alloc: 301989888 data_used: 24387584
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125435904 unmapped: 24870912 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:37.833603+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 109 ms_handle_reset con 0x5648d6d16800 session 0x5648d6f5a1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 109 ms_handle_reset con 0x5648d7a87800 session 0x5648d6cb85a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125435904 unmapped: 24870912 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:38.833747+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 109 handle_osd_map epochs [109,110], i have 109, src has [1,110]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 132415488 unmapped: 17891328 heap: 150306816 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:39.834405+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 110 ms_handle_reset con 0x5648d6d16400 session 0x5648d9ffc5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 135610368 unmapped: 22880256 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:40.834535+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 111 ms_handle_reset con 0x5648d43a5800 session 0x5648d6c8b2c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 111 heartbeat osd_stat(store_statfs(0x1b1a53000/0x0/0x1bfc00000, data 0x9f855a7/0xa03b000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 135602176 unmapped: 22888448 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets getting new tickets!
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:41.834790+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _finish_auth 0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:41.835776+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2122478 data_alloc: 301989888 data_used: 26574848
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.414688110s of 10.015502930s, submitted: 138
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 111 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 112 heartbeat osd_stat(store_statfs(0x1b1a4e000/0x0/0x1bfc00000, data 0x9f87b2e/0xa03f000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 135618560 unmapped: 22872064 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:42.834962+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 112 heartbeat osd_stat(store_statfs(0x1b1a48000/0x0/0x1bfc00000, data 0x9f8a0df/0xa044000, compress 0x0/0x0/0x0, omap 0x648, meta 0x416f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6c4e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 135667712 unmapped: 22822912 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:43.835109+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 112 ms_handle_reset con 0x5648d6c4e000 session 0x5648d6c294a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 112 ms_handle_reset con 0x5648d43a5800 session 0x5648d6be34a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125337600 unmapped: 33153024 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:44.835326+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125345792 unmapped: 33144832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:45.835527+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 125345792 unmapped: 33144832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:46.835723+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1938539 data_alloc: 301989888 data_used: 19283968
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 124624896 unmapped: 33865728 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:47.835883+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 113 heartbeat osd_stat(store_statfs(0x1b2af8000/0x0/0x1bfc00000, data 0x8ada527/0x8b95000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,1,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 113 ms_handle_reset con 0x5648d6d16400 session 0x5648d69ee1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 132530176 unmapped: 25960448 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:48.836005+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 132702208 unmapped: 25788416 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:49.836148+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 132710400 unmapped: 25780224 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:50.836299+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a87800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 113 handle_osd_map epochs [113,114], i have 113, src has [1,114]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 ms_handle_reset con 0x5648d6d16800 session 0x5648d6f1b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 129441792 unmapped: 29048832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:51.836437+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1999901 data_alloc: 301989888 data_used: 19390464
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.897445679s of 10.107122421s, submitted: 290
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 ms_handle_reset con 0x5648d6d16000 session 0x5648d4369a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 124977152 unmapped: 33513472 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 ms_handle_reset con 0x5648d7a87800 session 0x5648d6c8a1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:52.836613+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 ms_handle_reset con 0x5648d6d16c00 session 0x5648d6c28d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 ms_handle_reset con 0x5648d6d17000 session 0x5648d6c285a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 heartbeat osd_stat(store_statfs(0x1b1495000/0x0/0x1bfc00000, data 0xa13d9fa/0xa1f9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 ms_handle_reset con 0x5648d43a5800 session 0x5648d6cc0000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 121208832 unmapped: 37281792 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 ms_handle_reset con 0x5648d6d16000 session 0x5648d6ab52c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:53.836747+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 heartbeat osd_stat(store_statfs(0x1b1495000/0x0/0x1bfc00000, data 0xa13d9fa/0xa1f9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 121241600 unmapped: 37249024 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:54.836895+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 121339904 unmapped: 37150720 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:55.837089+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 121339904 unmapped: 37150720 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:56.837240+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1770624 data_alloc: 301989888 data_used: 14155776
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 114 handle_osd_map epochs [114,115], i have 114, src has [1,115]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 121257984 unmapped: 37232640 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:57.837396+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b41a9000/0x0/0x1bfc00000, data 0x742cd0f/0x74e4000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 121257984 unmapped: 37232640 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:58.837520+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 124305408 unmapped: 34185216 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:59.837651+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 128278528 unmapped: 30212096 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:00.837766+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133840896 unmapped: 24649728 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:01.837962+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1908576 data_alloc: 318767104 data_used: 30646272
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133840896 unmapped: 24649728 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:02.838105+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133840896 unmapped: 24649728 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:03.838282+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b41a8000/0x0/0x1bfc00000, data 0x742cd32/0x74e5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b41a8000/0x0/0x1bfc00000, data 0x742cd32/0x74e5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133922816 unmapped: 24567808 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:04.838405+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.503170967s of 13.190760612s, submitted: 193
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133922816 unmapped: 24567808 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:05.838598+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 ms_handle_reset con 0x5648d6d16400 session 0x5648d6a94b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133922816 unmapped: 24567808 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:06.838739+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1907648 data_alloc: 318767104 data_used: 30650368
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133922816 unmapped: 24567808 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:07.838932+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b41a9000/0x0/0x1bfc00000, data 0x742cd32/0x74e5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133931008 unmapped: 24559616 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:08.839114+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b41a9000/0x0/0x1bfc00000, data 0x742cd32/0x74e5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b3fac000/0x0/0x1bfc00000, data 0x7629d32/0x76e2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [0,0,0,7,4])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144515072 unmapped: 13975552 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:09.839244+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b3fac000/0x0/0x1bfc00000, data 0x7629d32/0x76e2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x456f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 146276352 unmapped: 12214272 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:10.839373+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 146694144 unmapped: 11796480 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:11.839495+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2236392 data_alloc: 318767104 data_used: 31997952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:12.839637+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147922944 unmapped: 10567680 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:13.839781+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147922944 unmapped: 10567680 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:14.839946+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147922944 unmapped: 10567680 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:15.840321+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147922944 unmapped: 10567680 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b050f000/0x0/0x1bfc00000, data 0x9f20d32/0x9fd9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:16.840464+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147922944 unmapped: 10567680 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.398570061s of 11.564742088s, submitted: 410
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2246154 data_alloc: 318767104 data_used: 32002048
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 ms_handle_reset con 0x5648d6d16800 session 0x5648d9d863c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:17.840790+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148209664 unmapped: 10280960 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 ms_handle_reset con 0x5648d43a5800 session 0x5648d9ffde00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 ms_handle_reset con 0x5648d7a9c800 session 0x5648d6a14b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:18.840934+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136855552 unmapped: 21635072 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 ms_handle_reset con 0x5648d6d17400 session 0x5648d766c5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:19.841055+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:20.841261+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:21.841400+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:22.841546+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:23.841708+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:24.841844+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:25.842082+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:26.842285+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:27.842506+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:28.842655+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:29.842861+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:30.843048+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:31.843253+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:32.843413+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:33.843556+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:34.843707+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:35.843923+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:36.844080+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:37.844304+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:38.844474+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:39.845138+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:40.845327+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:41.845488+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:42.845627+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:43.845919+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:44.846081+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:45.846388+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:46.846605+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:47.846828+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:48.847038+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:49.847268+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:50.847500+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:51.847716+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:52.847890+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:53.848136+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:54.848368+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:55.848640+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:56.848831+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:57.848994+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:58.849173+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133472256 unmapped: 25018368 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:59.849402+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:00.849605+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:01.849791+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:02.850008+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:03.850346+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:04.850538+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:05.850769+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:06.850990+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:07.851187+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:08.851920+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:09.852089+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:10.852286+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:11.852421+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:12.852582+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:13.852808+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:14.853046+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:15.853281+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133505024 unmapped: 24985600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:16.853435+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133513216 unmapped: 24977408 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496118 data_alloc: 301989888 data_used: 13549568
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b530d000/0x0/0x1bfc00000, data 0x4db4c9d/0x4e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:17.853633+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133513216 unmapped: 24977408 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:18.853777+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133513216 unmapped: 24977408 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:19.853950+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133513216 unmapped: 24977408 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:20.854084+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133513216 unmapped: 24977408 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 64.205314636s of 64.729576111s, submitted: 161
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:21.854251+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 115 handle_osd_map epochs [115,116], i have 115, src has [1,116]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133537792 unmapped: 24952832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1503897 data_alloc: 301989888 data_used: 13561856
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 116 ms_handle_reset con 0x5648d6d17800 session 0x5648d766d860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 116 heartbeat osd_stat(store_statfs(0x1b567e000/0x0/0x1bfc00000, data 0x4db798a/0x4e70000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:22.854408+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133521408 unmapped: 24969216 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 116 handle_osd_map epochs [116,117], i have 116, src has [1,117]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 117 ms_handle_reset con 0x5648d43a5800 session 0x5648dabdfe00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:23.854536+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133578752 unmapped: 24911872 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 117 handle_osd_map epochs [117,118], i have 117, src has [1,118]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:24.854696+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133677056 unmapped: 24813568 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 119 ms_handle_reset con 0x5648d6d16800 session 0x5648d5d765a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:25.854873+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133644288 unmapped: 24846336 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 120 ms_handle_reset con 0x5648d6d17400 session 0x5648d6cb8d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:26.855065+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 120 handle_osd_map epochs [119,120], i have 120, src has [1,120]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133693440 unmapped: 24797184 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1536316 data_alloc: 301989888 data_used: 13586432
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:27.855265+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 121 ms_handle_reset con 0x5648d6d17800 session 0x5648d6c8a3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133701632 unmapped: 24788992 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 121 heartbeat osd_stat(store_statfs(0x1b565d000/0x0/0x1bfc00000, data 0x4dc4600/0x4e8d000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:28.855458+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133718016 unmapped: 24772608 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:29.855624+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133742592 unmapped: 24748032 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:30.855748+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133742592 unmapped: 24748032 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 122 heartbeat osd_stat(store_statfs(0x1b565b000/0x0/0x1bfc00000, data 0x4dc7793/0x4e92000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:31.855923+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.015241623s of 10.534132957s, submitted: 128
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133750784 unmapped: 24739840 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1547070 data_alloc: 301989888 data_used: 13598720
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 123 ms_handle_reset con 0x5648d7a9c800 session 0x5648d6a14000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:32.856146+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b5658000/0x0/0x1bfc00000, data 0x4dc90e6/0x4e95000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133718016 unmapped: 24772608 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:33.856342+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 124 ms_handle_reset con 0x5648d6d16800 session 0x5648d71b7a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133808128 unmapped: 24682496 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 124 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a15c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 124 ms_handle_reset con 0x5648d6d17400 session 0x5648d6a14780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:34.856529+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 124 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6f5b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133824512 unmapped: 24666112 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7e4c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:35.856712+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 124 handle_osd_map epochs [124,125], i have 124, src has [1,125]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 125 ms_handle_reset con 0x5648d7e4c800 session 0x5648d6cb5860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 133939200 unmapped: 24551424 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 125 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a7f4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 125 ms_handle_reset con 0x5648d6d17800 session 0x5648d92d0f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 125 ms_handle_reset con 0x5648d6d16800 session 0x5648d9ffcb40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:36.856872+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 126 handle_osd_map epochs [126,126], i have 126, src has [1,126]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136159232 unmapped: 22331392 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1561094 data_alloc: 301989888 data_used: 13615104
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:37.857017+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 126 handle_osd_map epochs [126,127], i have 126, src has [1,127]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136167424 unmapped: 22323200 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 127 ms_handle_reset con 0x5648d6d17400 session 0x5648d6a95680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b564e000/0x0/0x1bfc00000, data 0x4dcf4e7/0x4ea0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x570f9b8), peers [0,1,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7e4c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 127 ms_handle_reset con 0x5648d6d17c00 session 0x5648d9d865a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:38.857265+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136355840 unmapped: 22134784 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 128 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a15680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 128 ms_handle_reset con 0x5648d7e4c800 session 0x5648d766c1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:39.857438+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136495104 unmapped: 21995520 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 129 ms_handle_reset con 0x5648d6d16800 session 0x5648d9d86960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:40.857581+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136577024 unmapped: 21913600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b5249000/0x0/0x1bfc00000, data 0x4dd616d/0x4ea4000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:41.857782+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136577024 unmapped: 21913600 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1562007 data_alloc: 301989888 data_used: 13623296
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b524a000/0x0/0x1bfc00000, data 0x4dd59b3/0x4ea2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.752894402s of 10.150030136s, submitted: 433
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 129 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 129 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:42.858002+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136609792 unmapped: 21880832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:43.858192+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136609792 unmapped: 21880832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:44.858379+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 130 heartbeat osd_stat(store_statfs(0x1b5245000/0x0/0x1bfc00000, data 0x4dd7e28/0x4ea6000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136257536 unmapped: 22233088 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:45.858538+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136257536 unmapped: 22233088 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:46.858681+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1567835 data_alloc: 301989888 data_used: 13623296
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:47.858875+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:48.859032+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:49.859159+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:50.859344+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 131 heartbeat osd_stat(store_statfs(0x1b5243000/0x0/0x1bfc00000, data 0x4dda281/0x4eaa000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:51.859522+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1567835 data_alloc: 301989888 data_used: 13623296
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:52.859689+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:53.859887+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:54.860074+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:55.860337+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136298496 unmapped: 22192128 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:56.860607+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 131 heartbeat osd_stat(store_statfs(0x1b5243000/0x0/0x1bfc00000, data 0x4dda281/0x4eaa000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136306688 unmapped: 22183936 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1567835 data_alloc: 301989888 data_used: 13623296
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:57.860787+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136306688 unmapped: 22183936 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:58.860988+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136306688 unmapped: 22183936 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:59.861130+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136306688 unmapped: 22183936 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:00.861279+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136306688 unmapped: 22183936 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:01.861474+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136306688 unmapped: 22183936 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 131 heartbeat osd_stat(store_statfs(0x1b5243000/0x0/0x1bfc00000, data 0x4dda281/0x4eaa000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1567835 data_alloc: 301989888 data_used: 13623296
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:02.861632+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136314880 unmapped: 22175744 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:03.861756+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136314880 unmapped: 22175744 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 131 heartbeat osd_stat(store_statfs(0x1b5243000/0x0/0x1bfc00000, data 0x4dda281/0x4eaa000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:04.861955+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 131 heartbeat osd_stat(store_statfs(0x1b5243000/0x0/0x1bfc00000, data 0x4dda281/0x4eaa000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136323072 unmapped: 22167552 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:05.862158+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136323072 unmapped: 22167552 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:06.862348+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 131 handle_osd_map epochs [132,133], i have 131, src has [1,133]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.914072037s of 24.940404892s, submitted: 39
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136380416 unmapped: 22110208 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1572959 data_alloc: 301989888 data_used: 13623296
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 133 ms_handle_reset con 0x5648d6d17400 session 0x5648d6cb50e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:07.863727+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136388608 unmapped: 22102016 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:08.863928+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 134 ms_handle_reset con 0x5648d6d17800 session 0x5648d6c29860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136388608 unmapped: 22102016 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:09.864333+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136388608 unmapped: 22102016 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:10.864714+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 134 ms_handle_reset con 0x5648d43a5800 session 0x5648d6c294a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 134 heartbeat osd_stat(store_statfs(0x1b5234000/0x0/0x1bfc00000, data 0x4de135e/0x4eb9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 134 handle_osd_map epochs [134,135], i have 134, src has [1,135]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136396800 unmapped: 22093824 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 135 ms_handle_reset con 0x5648d6d16800 session 0x5648d9136b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:11.865078+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 136 ms_handle_reset con 0x5648d6d17400 session 0x5648d6a14b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136421376 unmapped: 22069248 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1590333 data_alloc: 301989888 data_used: 13660160
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 136 handle_osd_map epochs [135,136], i have 136, src has [1,136]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:12.865257+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 136 ms_handle_reset con 0x5648d6d17800 session 0x5648d6a14960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136470528 unmapped: 22020096 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:13.865439+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136470528 unmapped: 22020096 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:14.865820+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136470528 unmapped: 22020096 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:15.867102+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 136 heartbeat osd_stat(store_statfs(0x1b522e000/0x0/0x1bfc00000, data 0x4de5c56/0x4ebe000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136470528 unmapped: 22020096 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:16.867923+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136478720 unmapped: 22011904 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1591925 data_alloc: 301989888 data_used: 13656064
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:17.868568+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136478720 unmapped: 22011904 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:18.869169+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136478720 unmapped: 22011904 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:19.869606+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136478720 unmapped: 22011904 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:20.870017+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136478720 unmapped: 22011904 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.046366692s of 14.464730263s, submitted: 142
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:21.870220+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 137 heartbeat osd_stat(store_statfs(0x1b522b000/0x0/0x1bfc00000, data 0x4de806f/0x4ec2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136478720 unmapped: 22011904 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1594655 data_alloc: 301989888 data_used: 13656064
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:22.870370+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136503296 unmapped: 21987328 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:23.870683+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136503296 unmapped: 21987328 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7e4c800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 138 ms_handle_reset con 0x5648d7e4c800 session 0x5648d6a150e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:24.870863+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 138 heartbeat osd_stat(store_statfs(0x1b5225000/0x0/0x1bfc00000, data 0x4dea614/0x4ec8000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136503296 unmapped: 21987328 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:25.871121+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 138 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a15860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136536064 unmapped: 21954560 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:26.871306+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 138 ms_handle_reset con 0x5648d6d16800 session 0x5648d766c000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 138 heartbeat osd_stat(store_statfs(0x1b5226000/0x0/0x1bfc00000, data 0x4dea614/0x4ec8000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136536064 unmapped: 21954560 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1599524 data_alloc: 301989888 data_used: 13668352
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:27.871470+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136552448 unmapped: 21938176 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:28.871639+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136560640 unmapped: 21929984 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:29.871802+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136560640 unmapped: 21929984 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 139 ms_handle_reset con 0x5648d6d17400 session 0x5648d766d860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:30.871999+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136601600 unmapped: 21889024 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:31.872169+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136609792 unmapped: 21880832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1602251 data_alloc: 301989888 data_used: 13680640
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 139 heartbeat osd_stat(store_statfs(0x1b5224000/0x0/0x1bfc00000, data 0x4decb29/0x4eca000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:32.872347+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136609792 unmapped: 21880832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:33.872632+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 139 heartbeat osd_stat(store_statfs(0x1b5224000/0x0/0x1bfc00000, data 0x4decb29/0x4eca000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136609792 unmapped: 21880832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:34.872832+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136609792 unmapped: 21880832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 139 heartbeat osd_stat(store_statfs(0x1b5224000/0x0/0x1bfc00000, data 0x4decb29/0x4eca000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:35.873076+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136609792 unmapped: 21880832 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:36.873257+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.336238861s of 15.615715027s, submitted: 85
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136667136 unmapped: 21823488 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1606453 data_alloc: 301989888 data_used: 13692928
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:37.873464+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 140 heartbeat osd_stat(store_statfs(0x1b521f000/0x0/0x1bfc00000, data 0x4deef42/0x4ece000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136667136 unmapped: 21823488 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:38.873818+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136667136 unmapped: 21823488 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:39.874011+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136667136 unmapped: 21823488 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:40.874259+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136691712 unmapped: 21798912 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:41.874423+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 142 ms_handle_reset con 0x5648d6d17800 session 0x5648d6a86b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136781824 unmapped: 21708800 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1619959 data_alloc: 301989888 data_used: 13705216
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b5213000/0x0/0x1bfc00000, data 0x4df3a6c/0x4ed8000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:42.874618+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 142 ms_handle_reset con 0x5648d6d17c00 session 0x5648d69cc1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 142 ms_handle_reset con 0x5648d43a5800 session 0x5648d4369a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136790016 unmapped: 21700608 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b5211000/0x0/0x1bfc00000, data 0x4df3ade/0x4eda000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:43.874806+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 143 ms_handle_reset con 0x5648d6d17400 session 0x5648d6cb4780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136814592 unmapped: 21676032 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 143 ms_handle_reset con 0x5648d6d16800 session 0x5648dabde3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:44.875096+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 143 ms_handle_reset con 0x5648d6d17800 session 0x5648d6a7e5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136888320 unmapped: 21602304 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 144 ms_handle_reset con 0x5648d6d17c00 session 0x5648d69efa40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:45.875285+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 ms_handle_reset con 0x5648d43a5800 session 0x5648d6cb8d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 136937472 unmapped: 21553152 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:46.875484+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 ms_handle_reset con 0x5648d6d16800 session 0x5648d6a86000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 137003008 unmapped: 21487616 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1628941 data_alloc: 301989888 data_used: 13705216
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:47.875715+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.011445045s of 10.515679359s, submitted: 157
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 ms_handle_reset con 0x5648d6d17400 session 0x5648d69f6d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 137011200 unmapped: 21479424 heap: 158490624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:48.875967+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 heartbeat osd_stat(store_statfs(0x1b5209000/0x0/0x1bfc00000, data 0x4dfaa9b/0x4ee5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 137216000 unmapped: 38068224 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:49.876150+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 ms_handle_reset con 0x5648d6d17c00 session 0x5648d5079680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d4927400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 ms_handle_reset con 0x5648d4927400 session 0x5648d5079a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 145735680 unmapped: 29548544 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 heartbeat osd_stat(store_statfs(0x1b3207000/0x0/0x1bfc00000, data 0x6dfaace/0x6ee7000, compress 0x0/0x0/0x0, omap 0x648, meta 0x5b0f9b8), peers [0,1,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:50.876286+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 ms_handle_reset con 0x5648d43a5800 session 0x5648dabde1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 146 ms_handle_reset con 0x5648d6d16800 session 0x5648dabdfe00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139436032 unmapped: 35848192 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:51.876631+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 147 ms_handle_reset con 0x5648d6d17400 session 0x5648d6cb34a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139067392 unmapped: 36216832 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2096845 data_alloc: 301989888 data_used: 13717504
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8923000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:52.876756+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 147 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6a7f4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 147 ms_handle_reset con 0x5648d8923000 session 0x5648dabde780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 148 ms_handle_reset con 0x5648d8922c00 session 0x5648d6ab5c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 148 ms_handle_reset con 0x5648d6d17800 session 0x5648d6a15680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139059200 unmapped: 36225024 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:53.877088+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8923000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 148 ms_handle_reset con 0x5648d8923000 session 0x5648d6c2ba40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 148 ms_handle_reset con 0x5648d6d16800 session 0x5648d766d4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139100160 unmapped: 36184064 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 149 heartbeat osd_stat(store_statfs(0x1af856000/0x0/0x1bfc00000, data 0x9604042/0x96f5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:54.877249+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 149 ms_handle_reset con 0x5648d43a5800 session 0x5648d766c3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139173888 unmapped: 36110336 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:55.877432+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 150 ms_handle_reset con 0x5648d43a5800 session 0x5648d5d76780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139264000 unmapped: 36020224 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:56.878256+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 151 ms_handle_reset con 0x5648d6d16800 session 0x5648d9ffcb40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139329536 unmapped: 35954688 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1680525 data_alloc: 301989888 data_used: 13746176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:57.878504+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.695060730s of 10.232881546s, submitted: 376
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139362304 unmapped: 35921920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:58.878788+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 152 heartbeat osd_stat(store_statfs(0x1b404c000/0x0/0x1bfc00000, data 0x4e0b0aa/0x4efe000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139362304 unmapped: 35921920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:59.879144+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139362304 unmapped: 35921920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:00.879362+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 152 heartbeat osd_stat(store_statfs(0x1b404c000/0x0/0x1bfc00000, data 0x4e0b0aa/0x4efe000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139362304 unmapped: 35921920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:01.879629+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139403264 unmapped: 35880960 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1680321 data_alloc: 301989888 data_used: 13746176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:02.879851+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 153 heartbeat osd_stat(store_statfs(0x1b404b000/0x0/0x1bfc00000, data 0x4e0d51b/0x4f02000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139403264 unmapped: 35880960 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:03.880032+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139403264 unmapped: 35880960 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:04.880237+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139403264 unmapped: 35880960 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 153 heartbeat osd_stat(store_statfs(0x1b404b000/0x0/0x1bfc00000, data 0x4e0d51b/0x4f02000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:05.880609+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 153 ms_handle_reset con 0x5648d6d17800 session 0x5648d9137c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139403264 unmapped: 35880960 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:06.880800+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139403264 unmapped: 35880960 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1684858 data_alloc: 301989888 data_used: 13746176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:07.881066+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.858315468s of 10.046402931s, submitted: 69
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 154 ms_handle_reset con 0x5648d8922c00 session 0x5648d9136b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139411456 unmapped: 35872768 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:08.881219+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139411456 unmapped: 35872768 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8923000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 154 ms_handle_reset con 0x5648d8923000 session 0x5648d9137860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:09.881354+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 154 heartbeat osd_stat(store_statfs(0x1b4044000/0x0/0x1bfc00000, data 0x4e0fb22/0x4f09000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139436032 unmapped: 35848192 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 154 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 155 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a863c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:10.881506+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 155 ms_handle_reset con 0x5648d6d16800 session 0x5648d6ab4d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 139452416 unmapped: 35831808 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 155 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:11.881640+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 156 ms_handle_reset con 0x5648d6d17800 session 0x5648d57b54a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 156 ms_handle_reset con 0x5648d8922c00 session 0x5648d92d0f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 140640256 unmapped: 34643968 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1697369 data_alloc: 301989888 data_used: 13758464
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:12.881812+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 140656640 unmapped: 34627584 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:13.882030+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 140656640 unmapped: 34627584 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:14.882190+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 140656640 unmapped: 34627584 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 156 heartbeat osd_stat(store_statfs(0x1b403d000/0x0/0x1bfc00000, data 0x4e1456c/0x4f0e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:15.882597+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 140656640 unmapped: 34627584 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:16.883085+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 156 handle_osd_map epochs [156,157], i have 156, src has [1,157]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 140673024 unmapped: 34611200 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1697209 data_alloc: 301989888 data_used: 13758464
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:17.890553+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d6d17400 session 0x5648d6c2a5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 140673024 unmapped: 34611200 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:18.890984+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.252877235s of 10.754374504s, submitted: 183
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 47
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 141049856 unmapped: 34234368 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:19.891394+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b4034000/0x0/0x1bfc00000, data 0x4e1b7db/0x4f17000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 141049856 unmapped: 34234368 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:20.891597+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 141049856 unmapped: 34234368 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:21.891872+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 141049856 unmapped: 34234368 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1703883 data_alloc: 301989888 data_used: 13758464
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:22.942604+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 141049856 unmapped: 34234368 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:23.942826+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b4032000/0x0/0x1bfc00000, data 0x4e1ca40/0x4f19000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 141099008 unmapped: 34185216 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:24.943013+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b4030000/0x0/0x1bfc00000, data 0x4e22089/0x4f1e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 48
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 140984320 unmapped: 34299904 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:25.943260+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d43a5800 session 0x5648d765e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d6d17800 session 0x5648d766c000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d6d16800 session 0x5648d765e3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 142770176 unmapped: 32514048 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:26.943421+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d8922c00 session 0x5648d766de00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 142770176 unmapped: 32514048 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1698744 data_alloc: 301989888 data_used: 16576512
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:27.943577+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 142770176 unmapped: 32514048 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:28.943770+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b402f000/0x0/0x1bfc00000, data 0x4e23d49/0x4f1f000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 142770176 unmapped: 32514048 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:29.943911+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.245515823s of 11.509392738s, submitted: 59
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d6d17c00 session 0x5648d765e960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 142802944 unmapped: 32481280 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:30.944074+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 142802944 unmapped: 32481280 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:31.944272+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b402c000/0x0/0x1bfc00000, data 0x4e253f5/0x4f22000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d43a5800 session 0x5648d765eb40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 142819328 unmapped: 32464896 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1703112 data_alloc: 301989888 data_used: 16576512
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:32.944444+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b402c000/0x0/0x1bfc00000, data 0x4e253f5/0x4f22000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6caf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d6d16800 session 0x5648d765ef00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d6d17800 session 0x5648d765f0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144146432 unmapped: 31137792 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:33.944579+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6f1a1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d8922c00 session 0x5648d765f4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 ms_handle_reset con 0x5648d43a5800 session 0x5648d765f680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144400384 unmapped: 30883840 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:34.944723+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144343040 unmapped: 30941184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 ms_handle_reset con 0x5648d6d16800 session 0x5648d765fa40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:35.945289+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 ms_handle_reset con 0x5648d6d17800 session 0x5648d765fc20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6cb92c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144359424 unmapped: 30924800 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:36.945506+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8923800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 ms_handle_reset con 0x5648d8923800 session 0x5648d6cb8b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144359424 unmapped: 30924800 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1804620 data_alloc: 301989888 data_used: 16588800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:37.945785+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144359424 unmapped: 30924800 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:38.946030+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 heartbeat osd_stat(store_statfs(0x1b320f000/0x0/0x1bfc00000, data 0x583dda7/0x593e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x70af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 ms_handle_reset con 0x5648d43a5800 session 0x5648d6cb83c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 ms_handle_reset con 0x5648d6d17800 session 0x5648d6f1b680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 159 ms_handle_reset con 0x5648d6d16800 session 0x5648d6cb85a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144433152 unmapped: 30851072 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:39.946267+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.321061134s of 10.001596451s, submitted: 156
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 159 ms_handle_reset con 0x5648d6d17c00 session 0x5648d9ffc5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8923c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 159 ms_handle_reset con 0x5648d8923c00 session 0x5648d69f61e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144457728 unmapped: 30826496 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:40.946486+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 159 heartbeat osd_stat(store_statfs(0x1b320b000/0x0/0x1bfc00000, data 0x584032c/0x5942000, compress 0x0/0x0/0x0, omap 0x648, meta 0x70af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 159 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a863c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144490496 unmapped: 30793728 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:41.946674+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 160 ms_handle_reset con 0x5648d6d16800 session 0x5648d765e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144523264 unmapped: 30760960 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1815565 data_alloc: 301989888 data_used: 16617472
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:42.946873+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 160 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 handle_osd_map epochs [160,161], i have 161, src has [1,161]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144531456 unmapped: 30752768 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:43.947063+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a80000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6f1ad20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d7a80000 session 0x5648d6aabe00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d6d17800 session 0x5648d57b54a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d85b6000 session 0x5648d71b61e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144605184 unmapped: 30679040 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:44.947274+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d43a5800 session 0x5648d9137c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144629760 unmapped: 30654464 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:45.947449+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 heartbeat osd_stat(store_statfs(0x1b3202000/0x0/0x1bfc00000, data 0x5844eb6/0x594c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x70af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144654336 unmapped: 30629888 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:46.947625+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144654336 unmapped: 30629888 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1818037 data_alloc: 301989888 data_used: 16613376
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:47.947808+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d6d16800 session 0x5648d6f1a3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144662528 unmapped: 30621696 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:48.947916+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d6d17800 session 0x5648d5d76780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144678912 unmapped: 30605312 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:49.948079+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 heartbeat osd_stat(store_statfs(0x1b3202000/0x0/0x1bfc00000, data 0x5844e77/0x594c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x70af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.708637238s of 10.090385437s, submitted: 97
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6a14d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d43a5800 session 0x5648d6ab45a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144711680 unmapped: 30572544 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:50.948185+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d6d16800 session 0x5648d6c8a5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d6d17800 session 0x5648d6c8be00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 144695296 unmapped: 30588928 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:51.948369+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 ms_handle_reset con 0x5648d85b6000 session 0x5648d6c8b680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a80000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 145784832 unmapped: 29499392 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1828431 data_alloc: 301989888 data_used: 16625664
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:52.948512+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 162 ms_handle_reset con 0x5648d7a80000 session 0x5648d6c8b4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 145793024 unmapped: 29491200 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:53.948637+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 145793024 unmapped: 29491200 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:54.948762+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 162 handle_osd_map epochs [162,163], i have 162, src has [1,163]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b31f9000/0x0/0x1bfc00000, data 0x58497c3/0x5954000, compress 0x0/0x0/0x0, omap 0x648, meta 0x70af9b8), peers [0,1,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d43a5800 session 0x5648d6c2a5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 145817600 unmapped: 29466624 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:55.948924+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d6d16800 session 0x5648d6ab4d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 145842176 unmapped: 29442048 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:56.949075+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41d6000/0x0/0x1bfc00000, data 0x58498a7/0x5958000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d6d17800 session 0x5648d6ab5a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d85b6000 session 0x5648d71b6960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 146923520 unmapped: 28360704 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1840107 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:57.949238+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d85b6400 session 0x5648d6f1a3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 146980864 unmapped: 28303360 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:58.949376+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d43a5800 session 0x5648d57b54a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d16800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:59.949514+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147013632 unmapped: 28270592 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41da000/0x0/0x1bfc00000, data 0x58497c3/0x5954000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:00.949654+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147013632 unmapped: 28270592 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:01.949787+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147013632 unmapped: 28270592 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.899544716s of 11.571134567s, submitted: 169
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d6d17800 session 0x5648d6a150e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d85b6000 session 0x5648d69ef0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:02.950005+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147021824 unmapped: 28262400 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1837152 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8669c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41d5000/0x0/0x1bfc00000, data 0x584eb7b/0x5959000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d8669c00 session 0x5648d71b7a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:03.950182+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147030016 unmapped: 28254208 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:04.950381+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147030016 unmapped: 28254208 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:05.950557+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147054592 unmapped: 28229632 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d70b4800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d70b4800 session 0x5648d69cd4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:06.950732+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147480576 unmapped: 27803648 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41d0000/0x0/0x1bfc00000, data 0x5855618/0x595e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:07.950869+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147480576 unmapped: 27803648 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1932545 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d43a5800 session 0x5648d6aabe00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:08.951000+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147423232 unmapped: 27860992 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:09.951290+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147423232 unmapped: 27860992 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6aab4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b36fc000/0x0/0x1bfc00000, data 0x6324086/0x6431000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d6d17800 session 0x5648d6c2a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:10.951460+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147496960 unmapped: 27787264 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:11.951625+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147398656 unmapped: 27885568 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 ms_handle_reset con 0x5648d85b6000 session 0x5648d69efa40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:12.951829+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147423232 unmapped: 27860992 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1853779 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41b5000/0x0/0x1bfc00000, data 0x586df17/0x5978000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.832743645s of 11.775090218s, submitted: 178
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:13.952026+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147423232 unmapped: 27860992 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:14.952251+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147423232 unmapped: 27860992 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:15.952549+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147406848 unmapped: 27877376 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:16.952744+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:17.952916+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1851207 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41af000/0x0/0x1bfc00000, data 0x58764c5/0x597f000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:18.953087+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:19.953323+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:20.953531+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:21.953767+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:22.953954+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1850119 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41ad000/0x0/0x1bfc00000, data 0x58778fd/0x5981000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:23.954135+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:24.954298+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147415040 unmapped: 27869184 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.513941765s of 11.562663078s, submitted: 14
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:25.954553+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147423232 unmapped: 27860992 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:26.954700+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147423232 unmapped: 27860992 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:27.954863+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147423232 unmapped: 27860992 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1850317 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41ac000/0x0/0x1bfc00000, data 0x587895e/0x5982000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:28.955060+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147431424 unmapped: 27852800 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:29.955236+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147431424 unmapped: 27852800 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:30.955416+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147439616 unmapped: 27844608 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b41a7000/0x0/0x1bfc00000, data 0x587dc8c/0x5987000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:31.955578+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147439616 unmapped: 27844608 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:32.955743+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147439616 unmapped: 27844608 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1850609 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:33.955900+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147439616 unmapped: 27844608 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:34.956112+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147439616 unmapped: 27844608 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.943370819s of 10.005316734s, submitted: 13
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:35.956350+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147439616 unmapped: 27844608 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:36.956517+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b419b000/0x0/0x1bfc00000, data 0x5888ed7/0x5993000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147456000 unmapped: 27828224 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:37.956676+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1854895 data_alloc: 301989888 data_used: 16637952
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147456000 unmapped: 27828224 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b419b000/0x0/0x1bfc00000, data 0x5888ed7/0x5993000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:38.956844+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147456000 unmapped: 27828224 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 heartbeat osd_stat(store_statfs(0x1b419b000/0x0/0x1bfc00000, data 0x5888ed7/0x5993000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 13K writes, 48K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 13K writes, 4278 syncs, 3.11 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 7696 writes, 24K keys, 7696 commit groups, 1.0 writes per commit group, ingest: 21.13 MB, 0.04 MB/s
                                                          Interval WAL: 7696 writes, 3365 syncs, 2.29 writes per sync, written: 0.02 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:39.956996+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 147472384 unmapped: 27811840 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:40.957137+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148586496 unmapped: 26697728 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8669c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 164 ms_handle_reset con 0x5648d8669c00 session 0x5648d96c6780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:41.957273+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 164 heartbeat osd_stat(store_statfs(0x1b4187000/0x0/0x1bfc00000, data 0x589a65d/0x59a7000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148529152 unmapped: 26755072 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:42.957404+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1860812 data_alloc: 301989888 data_used: 16650240
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148529152 unmapped: 26755072 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 164 heartbeat osd_stat(store_statfs(0x1b4181000/0x0/0x1bfc00000, data 0x589ff12/0x59ad000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 164 handle_osd_map epochs [164,165], i have 164, src has [1,165]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 165 ms_handle_reset con 0x5648d43a5800 session 0x5648d96c65a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:43.957566+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148520960 unmapped: 26763264 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:44.957699+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148520960 unmapped: 26763264 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.771335602s of 10.000751495s, submitted: 76
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 165 handle_osd_map epochs [165,166], i have 165, src has [1,166]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 166 ms_handle_reset con 0x5648d6d17800 session 0x5648d96c6f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:45.957873+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148561920 unmapped: 26722304 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 166 ms_handle_reset con 0x5648d6d17c00 session 0x5648d96c72c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 166 heartbeat osd_stat(store_statfs(0x1b4169000/0x0/0x1bfc00000, data 0x58b1ba0/0x59c4000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:46.958007+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148561920 unmapped: 26722304 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 167 ms_handle_reset con 0x5648d85b6000 session 0x5648d96c7680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:47.958148+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1873638 data_alloc: 301989888 data_used: 16678912
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148586496 unmapped: 26697728 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7657400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:48.958367+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 167 heartbeat osd_stat(store_statfs(0x1b415d000/0x0/0x1bfc00000, data 0x58bce13/0x59d0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148602880 unmapped: 26681344 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 167 handle_osd_map epochs [167,168], i have 167, src has [1,168]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 168 ms_handle_reset con 0x5648d7657400 session 0x5648d96c7a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:49.958533+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 168 ms_handle_reset con 0x5648d43a5800 session 0x5648d96c7c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148627456 unmapped: 26656768 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 168 handle_osd_map epochs [168,169], i have 168, src has [1,169]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 169 ms_handle_reset con 0x5648d6d17800 session 0x5648d96c70e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:50.959542+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148733952 unmapped: 26550272 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 169 handle_osd_map epochs [169,170], i have 169, src has [1,170]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 169 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 170 ms_handle_reset con 0x5648d6d17c00 session 0x5648d96c6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 170 heartbeat osd_stat(store_statfs(0x1b4141000/0x0/0x1bfc00000, data 0x58d1350/0x59eb000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:51.960418+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148733952 unmapped: 26550272 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 171 ms_handle_reset con 0x5648d85b6000 session 0x5648d5d765a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7657000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:52.960852+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1888807 data_alloc: 301989888 data_used: 16678912
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148766720 unmapped: 26517504 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 171 handle_osd_map epochs [171,172], i have 171, src has [1,172]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 172 ms_handle_reset con 0x5648d7657000 session 0x5648d96c61e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:53.961172+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148824064 unmapped: 26460160 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:54.961920+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148840448 unmapped: 26443776 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 172 handle_osd_map epochs [172,173], i have 172, src has [1,173]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 173 handle_osd_map epochs [172,173], i have 173, src has [1,173]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.542608261s of 10.003323555s, submitted: 152
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 173 ms_handle_reset con 0x5648d43a5800 session 0x5648d5078780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:55.962240+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 148865024 unmapped: 26419200 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 173 ms_handle_reset con 0x5648d6d17800 session 0x5648d6c8ad20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:56.962402+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 149659648 unmapped: 25624576 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 174 ms_handle_reset con 0x5648d6d17c00 session 0x5648d766c000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 174 heartbeat osd_stat(store_statfs(0x1b4114000/0x0/0x1bfc00000, data 0x58f759f/0x5a19000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:57.962564+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1900851 data_alloc: 301989888 data_used: 16691200
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 149667840 unmapped: 25616384 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:58.962718+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 174 ms_handle_reset con 0x5648d85b6000 session 0x5648d706b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d9e64000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 149725184 unmapped: 25559040 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 175 ms_handle_reset con 0x5648d9e64000 session 0x5648d6c8b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:59.962881+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 149716992 unmapped: 25567232 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 175 ms_handle_reset con 0x5648d43a5800 session 0x5648d6a14780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 175 ms_handle_reset con 0x5648d6d17800 session 0x5648dabde3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:00.963030+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 149741568 unmapped: 25542656 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:01.963242+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 149741568 unmapped: 25542656 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:02.963380+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 176 heartbeat osd_stat(store_statfs(0x1b40fc000/0x0/0x1bfc00000, data 0x5910912/0x5a31000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1905307 data_alloc: 301989888 data_used: 16687104
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150798336 unmapped: 24485888 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:03.963584+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 176 heartbeat osd_stat(store_statfs(0x1b40eb000/0x0/0x1bfc00000, data 0x59216b7/0x5a42000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150806528 unmapped: 24477696 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:04.963750+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150806528 unmapped: 24477696 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 176 heartbeat osd_stat(store_statfs(0x1b40eb000/0x0/0x1bfc00000, data 0x59216b7/0x5a42000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.572480202s of 10.003467560s, submitted: 146
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 49
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:05.963928+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150511616 unmapped: 24772608 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:06.964132+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 176 ms_handle_reset con 0x5648d85b6000 session 0x5648dabdfa40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 176 handle_osd_map epochs [176,177], i have 176, src has [1,177]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150536192 unmapped: 24748032 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:07.964323+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1925481 data_alloc: 301989888 data_used: 16699392
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150536192 unmapped: 24748032 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 177 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6f5b4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:08.964516+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 177 ms_handle_reset con 0x5648d7a9b400 session 0x5648d6ab54a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150552576 unmapped: 24731648 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:09.964707+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150552576 unmapped: 24731648 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 177 heartbeat osd_stat(store_statfs(0x1b40c2000/0x0/0x1bfc00000, data 0x5949016/0x5a6c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 177 handle_osd_map epochs [177,178], i have 177, src has [1,178]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:10.964891+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 178 heartbeat osd_stat(store_statfs(0x1b40a9000/0x0/0x1bfc00000, data 0x595f6fb/0x5a84000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150577152 unmapped: 24707072 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:11.965071+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150577152 unmapped: 24707072 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:12.965236+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1919606 data_alloc: 301989888 data_used: 16711680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151650304 unmapped: 23633920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:13.965459+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150609920 unmapped: 24674304 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:14.965697+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150609920 unmapped: 24674304 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 178 heartbeat osd_stat(store_statfs(0x1b4099000/0x0/0x1bfc00000, data 0x5971446/0x5a95000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:15.965885+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.344779015s of 10.739048004s, submitted: 110
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150618112 unmapped: 24666112 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 178 ms_handle_reset con 0x5648d7a9b400 session 0x5648d6f5ad20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:16.966069+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150585344 unmapped: 24698880 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:17.966276+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d43a5800 session 0x5648d7b64000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1925678 data_alloc: 301989888 data_used: 16723968
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150626304 unmapped: 24657920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:18.966519+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150626304 unmapped: 24657920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b4084000/0x0/0x1bfc00000, data 0x59835ad/0x5aa9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:19.966662+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150626304 unmapped: 24657920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:20.966814+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150634496 unmapped: 24649728 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:21.966973+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b4081000/0x0/0x1bfc00000, data 0x59872ce/0x5aad000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150634496 unmapped: 24649728 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:22.967113+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1926126 data_alloc: 301989888 data_used: 16723968
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150634496 unmapped: 24649728 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:23.967245+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150634496 unmapped: 24649728 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:24.967386+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150659072 unmapped: 24625152 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:25.967585+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150659072 unmapped: 24625152 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.178868294s of 10.391580582s, submitted: 65
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b4070000/0x0/0x1bfc00000, data 0x59968d1/0x5abe000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:26.967747+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 150626304 unmapped: 24657920 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:27.967897+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1931020 data_alloc: 301989888 data_used: 16723968
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151707648 unmapped: 23576576 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:28.968140+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151707648 unmapped: 23576576 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:29.968307+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151707648 unmapped: 23576576 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:30.968506+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151707648 unmapped: 23576576 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:31.968675+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d6d17800 session 0x5648d92d14a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151740416 unmapped: 23543808 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b4070000/0x0/0x1bfc00000, data 0x5996d6f/0x5abe000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:32.968810+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1934619 data_alloc: 301989888 data_used: 16723968
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151740416 unmapped: 23543808 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d96c7e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:33.968981+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151764992 unmapped: 23519232 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:34.969174+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151764992 unmapped: 23519232 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d85b6000 session 0x5648d7b643c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b406b000/0x0/0x1bfc00000, data 0x5996eb5/0x5ac3000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:35.969462+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b4068000/0x0/0x1bfc00000, data 0x599846b/0x5ac6000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151789568 unmapped: 23494656 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b4068000/0x0/0x1bfc00000, data 0x599846b/0x5ac6000, compress 0x0/0x0/0x0, omap 0x648, meta 0x60cf9b8), peers [0,1,3,4,5] op hist [1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.908418655s of 10.109995842s, submitted: 46
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:36.969606+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d43a5800 session 0x5648d7b645a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d6d17800 session 0x5648d765fc20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151830528 unmapped: 23453696 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:37.969737+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6f5b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d7a9b400 session 0x5648d6c8a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d43800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1941374 data_alloc: 301989888 data_used: 16723968
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151912448 unmapped: 23371776 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d42000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d5d43800 session 0x5648d7b64d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d5d42000 session 0x5648d7b64960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:38.970002+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d43800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d5d43800 session 0x5648d6f5b4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151937024 unmapped: 23347200 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b3c68000/0x0/0x1bfc00000, data 0x599e579/0x5ac6000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:39.970173+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b3c69000/0x0/0x1bfc00000, data 0x599e517/0x5ac5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151945216 unmapped: 23339008 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:40.970349+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 151953408 unmapped: 23330816 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:41.970496+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153001984 unmapped: 22282240 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:42.970685+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1939483 data_alloc: 301989888 data_used: 16728064
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153001984 unmapped: 22282240 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:43.970851+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153001984 unmapped: 22282240 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b3c65000/0x0/0x1bfc00000, data 0x59a2560/0x5ac9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:44.971057+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153042944 unmapped: 22241280 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:45.971254+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153042944 unmapped: 22241280 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.525398254s of 10.161723137s, submitted: 149
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d43a5800 session 0x5648d6f5ad20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:46.971410+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b3c53000/0x0/0x1bfc00000, data 0x59b2806/0x5adb000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153051136 unmapped: 22233088 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:47.971584+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d6d17800 session 0x5648d6ab54a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d7a9ac00 session 0x5648dabded20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1942662 data_alloc: 301989888 data_used: 16728064
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153059328 unmapped: 22224896 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:48.971785+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153165824 unmapped: 22118400 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:49.971978+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b3c46000/0x0/0x1bfc00000, data 0x59c0e6a/0x5ae8000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6ab4780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b3c46000/0x0/0x1bfc00000, data 0x59c0e6a/0x5ae8000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 179 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153190400 unmapped: 22093824 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:50.972133+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153206784 unmapped: 22077440 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:51.972323+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153206784 unmapped: 22077440 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 180 ms_handle_reset con 0x5648d43a5800 session 0x5648dabdfa40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:52.972475+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1948079 data_alloc: 301989888 data_used: 16744448
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153231360 unmapped: 22052864 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d42000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 180 ms_handle_reset con 0x5648d5d42000 session 0x5648d6a14d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:53.972655+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153231360 unmapped: 22052864 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:54.972850+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153264128 unmapped: 22020096 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 180 ms_handle_reset con 0x5648d7a9b400 session 0x5648d6a14780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 180 heartbeat osd_stat(store_statfs(0x1b3c35000/0x0/0x1bfc00000, data 0x59cfc5d/0x5af7000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:55.973692+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153264128 unmapped: 22020096 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:56.973950+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 180 handle_osd_map epochs [180,181], i have 180, src has [1,181]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.138516426s of 10.439592361s, submitted: 84
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 181 heartbeat osd_stat(store_statfs(0x1b3c35000/0x0/0x1bfc00000, data 0x59cfc5d/0x5af7000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153272320 unmapped: 22011904 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:57.974161+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 181 ms_handle_reset con 0x5648d8922400 session 0x5648d6c8ad20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1957637 data_alloc: 301989888 data_used: 16756736
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153272320 unmapped: 22011904 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:58.974364+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153272320 unmapped: 22011904 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:59.974550+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153280512 unmapped: 22003712 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 181 ms_handle_reset con 0x5648d8922400 session 0x5648d6c8b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:00.974724+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 181 ms_handle_reset con 0x5648d43a5800 session 0x5648d766d860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153280512 unmapped: 22003712 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 181 heartbeat osd_stat(store_statfs(0x1b3c2a000/0x0/0x1bfc00000, data 0x59daed8/0x5b04000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:01.974901+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d42000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 181 ms_handle_reset con 0x5648d5d42000 session 0x5648d766d2c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153280512 unmapped: 22003712 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 181 heartbeat osd_stat(store_statfs(0x1b3c29000/0x0/0x1bfc00000, data 0x59daee9/0x5b05000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:02.975043+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1961879 data_alloc: 301989888 data_used: 16756736
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153305088 unmapped: 21979136 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:03.975240+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153305088 unmapped: 21979136 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:04.975396+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153329664 unmapped: 21954560 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:05.975560+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 182 handle_osd_map epochs [182,183], i have 182, src has [1,183]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153346048 unmapped: 21938176 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 183 handle_osd_map epochs [182,183], i have 183, src has [1,183]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 183 ms_handle_reset con 0x5648d7a9b400 session 0x5648d6a86000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:06.975711+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.766902924s of 10.194535255s, submitted: 114
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153378816 unmapped: 21905408 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a87800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b8000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 184 ms_handle_reset con 0x5648d7a87800 session 0x5648d6a95680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:07.975882+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 184 heartbeat osd_stat(store_statfs(0x1b3bfb000/0x0/0x1bfc00000, data 0x59ffe1e/0x5b31000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1981661 data_alloc: 301989888 data_used: 16785408
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153419776 unmapped: 21864448 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 185 ms_handle_reset con 0x5648d85b8000 session 0x5648d92d0780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 185 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d766c000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:08.976039+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153452544 unmapped: 21831680 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d43a5800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:09.976247+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153477120 unmapped: 21807104 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 186 handle_osd_map epochs [185,186], i have 186, src has [1,186]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 186 ms_handle_reset con 0x5648d43a5800 session 0x5648dbda61e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:10.976395+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b3bdc000/0x0/0x1bfc00000, data 0x5a1dacc/0x5b51000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153501696 unmapped: 21782528 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:11.976552+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d42000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 186 ms_handle_reset con 0x5648d6d17c00 session 0x5648d69cc3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 187 ms_handle_reset con 0x5648d5d42000 session 0x5648d96283c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d42000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153853952 unmapped: 21430272 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 187 ms_handle_reset con 0x5648d5d42000 session 0x5648d9d86780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:12.976729+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1992021 data_alloc: 301989888 data_used: 16809984
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153894912 unmapped: 21389312 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 50
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:13.976910+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 153952256 unmapped: 21331968 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:14.977104+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155000832 unmapped: 20283392 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:15.977272+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155058176 unmapped: 20226048 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 187 heartbeat osd_stat(store_statfs(0x1b3bc9000/0x0/0x1bfc00000, data 0x5a31d04/0x5b65000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:16.977426+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.497847557s of 10.002332687s, submitted: 339
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155074560 unmapped: 20209664 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 188 heartbeat osd_stat(store_statfs(0x1b3baa000/0x0/0x1bfc00000, data 0x5a4e3e1/0x5b83000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:17.977579+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1996299 data_alloc: 301989888 data_used: 16830464
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155074560 unmapped: 20209664 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:18.977733+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155074560 unmapped: 20209664 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:19.977891+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155074560 unmapped: 20209664 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 189 ms_handle_reset con 0x5648d6d17c00 session 0x5648d9e55680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:20.978084+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 189 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d9e55a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155082752 unmapped: 20201472 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:21.978240+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b8000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155107328 unmapped: 20176896 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 ms_handle_reset con 0x5648d85b8000 session 0x5648d9e55e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:22.978372+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 heartbeat osd_stat(store_statfs(0x1b3b91000/0x0/0x1bfc00000, data 0x5a61c6e/0x5b9c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2008833 data_alloc: 301989888 data_used: 16842752
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155123712 unmapped: 20160512 heap: 175284224 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 ms_handle_reset con 0x5648d7a9b400 session 0x5648d9628b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 heartbeat osd_stat(store_statfs(0x1b3b91000/0x0/0x1bfc00000, data 0x5a61c6e/0x5b9c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:23.978573+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155222016 unmapped: 28459008 heap: 183681024 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 heartbeat osd_stat(store_statfs(0x1b3b91000/0x0/0x1bfc00000, data 0x5a61c6e/0x5b9c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [0,2])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:24.978747+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d42000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 ms_handle_reset con 0x5648d5d42000 session 0x5648d9d87e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6a14f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155246592 unmapped: 28434432 heap: 183681024 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 heartbeat osd_stat(store_statfs(0x1b3389000/0x0/0x1bfc00000, data 0x626bd72/0x63a5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:25.978955+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155213824 unmapped: 36864000 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:26.979127+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 190 handle_osd_map epochs [190,191], i have 190, src has [1,191]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 191 heartbeat osd_stat(store_statfs(0x1b1b78000/0x0/0x1bfc00000, data 0x7a7c5b4/0x7bb6000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.541200638s of 10.117086411s, submitted: 136
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163651584 unmapped: 28426240 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:27.979308+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2454760 data_alloc: 301989888 data_used: 16859136
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155328512 unmapped: 36749312 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:28.979493+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155467776 unmapped: 36610048 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:29.979688+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163913728 unmapped: 28164096 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:30.979882+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 192 heartbeat osd_stat(store_statfs(0x1ae349000/0x0/0x1bfc00000, data 0xb2a6770/0xb3e4000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155639808 unmapped: 36438016 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:31.980061+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164175872 unmapped: 27901952 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:32.980246+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2900616 data_alloc: 301989888 data_used: 16871424
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155852800 unmapped: 36225024 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:33.980434+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 155959296 unmapped: 36118528 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:34.980590+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165617664 unmapped: 26460160 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 192 heartbeat osd_stat(store_statfs(0x1ab31f000/0x0/0x1bfc00000, data 0xe2d156a/0xe40f000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:35.980827+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165855232 unmapped: 26222592 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:36.980994+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 157605888 unmapped: 34471936 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:37.981257+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.357991219s of 10.392288208s, submitted: 108
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3344970 data_alloc: 301989888 data_used: 16883712
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 157679616 unmapped: 34398208 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:38.981461+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166182912 unmapped: 25894912 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:39.981609+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 157851648 unmapped: 34226176 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:40.981820+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 heartbeat osd_stat(store_statfs(0x1a62d8000/0x0/0x1bfc00000, data 0x13318d23/0x13456000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 157917184 unmapped: 34160640 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:41.981979+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 157974528 unmapped: 34103296 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:42.982162+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3679710 data_alloc: 301989888 data_used: 16883712
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 158031872 unmapped: 34045952 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:43.982305+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 heartbeat osd_stat(store_statfs(0x1a4ac0000/0x0/0x1bfc00000, data 0x14b2ff08/0x14c6e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 158269440 unmapped: 33808384 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:44.982477+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 158343168 unmapped: 33734656 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:45.982673+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 158392320 unmapped: 33685504 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:46.982839+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166854656 unmapped: 25223168 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:47.983415+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 heartbeat osd_stat(store_statfs(0x1a12a0000/0x0/0x1bfc00000, data 0x1834e908/0x1848e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.665649414s of 10.526628494s, submitted: 50
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4122558 data_alloc: 301989888 data_used: 16883712
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 158580736 unmapped: 33497088 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:48.983591+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167084032 unmapped: 24993792 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:49.983796+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 159875072 unmapped: 32202752 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:50.983990+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 heartbeat osd_stat(store_statfs(0x19fa5d000/0x0/0x1bfc00000, data 0x19b8e5be/0x19cd1000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 51
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 159940608 unmapped: 32137216 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:51.984163+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 heartbeat osd_stat(store_statfs(0x19f25e000/0x0/0x1bfc00000, data 0x1a38e523/0x1a4d0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 160006144 unmapped: 32071680 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:52.995424+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4353184 data_alloc: 301989888 data_used: 16883712
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 160006144 unmapped: 32071680 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:53.995656+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 160153600 unmapped: 31924224 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:54.995907+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 160161792 unmapped: 31916032 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:55.996153+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 heartbeat osd_stat(store_statfs(0x19d219000/0x0/0x1bfc00000, data 0x1c3d52f4/0x1c515000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 160219136 unmapped: 31858688 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:56.996373+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85c7000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 ms_handle_reset con 0x5648d85c7000 session 0x5648d7898780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 160514048 unmapped: 31563776 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:57.996599+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4682890 data_alloc: 301989888 data_used: 16883712
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.322772026s of 10.250596046s, submitted: 74
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 160645120 unmapped: 31432704 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:58.996775+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 heartbeat osd_stat(store_statfs(0x19b207000/0x0/0x1bfc00000, data 0x1e3e7141/0x1e527000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9cc00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 ms_handle_reset con 0x5648d7a9cc00 session 0x5648d7898960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 169181184 unmapped: 22896640 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:59.996897+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 ms_handle_reset con 0x5648da21a400 session 0x5648dabde960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d5d42000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 heartbeat osd_stat(store_statfs(0x19a9f8000/0x0/0x1bfc00000, data 0x1ebf59f5/0x1ed36000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 ms_handle_reset con 0x5648da21a000 session 0x5648d7898f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9cc00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 194 ms_handle_reset con 0x5648d7a9cc00 session 0x5648d9629860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 169254912 unmapped: 22822912 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:00.997039+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85c7000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 194 ms_handle_reset con 0x5648d85c7000 session 0x5648d6c8b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 160931840 unmapped: 31145984 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648d6d17c00 session 0x5648d78992c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:01.997181+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648d5d42000 session 0x5648d71b6b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 169410560 unmapped: 22667264 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:02.997369+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648d6d17c00 session 0x5648d7899e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9cc00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5323312 data_alloc: 301989888 data_used: 16900096
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 169566208 unmapped: 22511616 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:03.997508+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 heartbeat osd_stat(store_statfs(0x1961d8000/0x0/0x1bfc00000, data 0x23410673/0x23556000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648d7a9cc00 session 0x5648d9d86780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 161251328 unmapped: 30826496 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:04.997698+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85c7000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648d85c7000 session 0x5648dbda7860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172105728 unmapped: 19972096 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:05.997907+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648da21a000 session 0x5648d96294a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648da21a800 session 0x5648dbda6960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648d6d17c00 session 0x5648d9d863c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9cc00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163569664 unmapped: 28508160 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:06.998062+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85c7000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648da21a000 session 0x5648d7899a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648d85c7000 session 0x5648d9e550e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 ms_handle_reset con 0x5648da21ac00 session 0x5648d69cda40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 195 handle_osd_map epochs [195,196], i have 195, src has [1,196]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 196 heartbeat osd_stat(store_statfs(0x192e0c000/0x0/0x1bfc00000, data 0x267d9104/0x26922000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 196 ms_handle_reset con 0x5648d7a9cc00 session 0x5648d9e55a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 196 ms_handle_reset con 0x5648da21b000 session 0x5648d6cb5c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163979264 unmapped: 28098560 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:07.998260+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 196 ms_handle_reset con 0x5648d6d17c00 session 0x5648dda50d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 196 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 197 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6a15e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85c7000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 197 ms_handle_reset con 0x5648d85c7000 session 0x5648d765f2c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5823165 data_alloc: 301989888 data_used: 16916480
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.982829571s of 10.054895401s, submitted: 254
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 197 ms_handle_reset con 0x5648da21a800 session 0x5648d6c8a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164274176 unmapped: 27803648 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:08.998396+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 197 ms_handle_reset con 0x5648da21a800 session 0x5648d6f5ba40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164290560 unmapped: 27787264 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:09.998539+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 197 ms_handle_reset con 0x5648d7a9ac00 session 0x5648dda51a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165404672 unmapped: 26673152 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:10.998702+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 198 ms_handle_reset con 0x5648da21b000 session 0x5648d766cd20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 198 ms_handle_reset con 0x5648d6d17c00 session 0x5648d96c65a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 198 ms_handle_reset con 0x5648da21a000 session 0x5648d6cc05a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 198 heartbeat osd_stat(store_statfs(0x19199a000/0x0/0x1bfc00000, data 0x27c47472/0x27d93000, compress 0x0/0x0/0x0, omap 0x648, meta 0x64cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163463168 unmapped: 28614656 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:11.998862+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 198 ms_handle_reset con 0x5648d6d17c00 session 0x5648d96c7c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163471360 unmapped: 28606464 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:12.999013+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 199 ms_handle_reset con 0x5648da21ac00 session 0x5648d96290e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2180024 data_alloc: 301989888 data_used: 16936960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163487744 unmapped: 28590080 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:13.999173+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 199 handle_osd_map epochs [199,200], i have 199, src has [1,200]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 200 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d78992c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:14.999333+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163192832 unmapped: 28884992 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 200 ms_handle_reset con 0x5648da21a800 session 0x5648d7899a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 200 heartbeat osd_stat(store_statfs(0x1b356e000/0x0/0x1bfc00000, data 0x5c75420/0x5dc0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:15.999583+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162062336 unmapped: 30015488 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 201 ms_handle_reset con 0x5648da21b000 session 0x5648d9e541e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 201 ms_handle_reset con 0x5648d6d17c00 session 0x5648dda51c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:16.999734+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162070528 unmapped: 30007296 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 201 handle_osd_map epochs [201,202], i have 201, src has [1,202]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:17.999903+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162095104 unmapped: 29982720 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 203 ms_handle_reset con 0x5648da21a800 session 0x5648d6f1b4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2201448 data_alloc: 301989888 data_used: 16961536
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:19.000149+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162111488 unmapped: 29966336 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.193119049s of 10.428631783s, submitted: 401
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 203 ms_handle_reset con 0x5648da21ac00 session 0x5648d96285a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 203 handle_osd_map epochs [203,204], i have 203, src has [1,204]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 204 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6a86000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 204 ms_handle_reset con 0x5648da21b000 session 0x5648d7b645a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:20.000297+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162177024 unmapped: 29900800 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b3542000/0x0/0x1bfc00000, data 0x5c98a98/0x5dea000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:21.000510+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162234368 unmapped: 29843456 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 206 handle_osd_map epochs [205,206], i have 206, src has [1,206]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 206 handle_osd_map epochs [205,206], i have 206, src has [1,206]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 206 ms_handle_reset con 0x5648da21a800 session 0x5648d7b64000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:22.000701+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162299904 unmapped: 29777920 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 206 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6a14780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b3535000/0x0/0x1bfc00000, data 0x5c9d78e/0x5df7000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:23.000891+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 207 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6f5b4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162316288 unmapped: 29761536 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 207 ms_handle_reset con 0x5648da21ac00 session 0x5648d6cb8d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2230480 data_alloc: 301989888 data_used: 16973824
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 207 ms_handle_reset con 0x5648da21b000 session 0x5648dbda61e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:24.001131+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162357248 unmapped: 29720576 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 ms_handle_reset con 0x5648da21b400 session 0x5648d96281e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:25.001282+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162340864 unmapped: 29736960 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6a7e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 ms_handle_reset con 0x5648da21a800 session 0x5648d92d1c20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:26.001534+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162357248 unmapped: 29720576 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 ms_handle_reset con 0x5648da21ac00 session 0x5648d96c6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:27.001768+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162390016 unmapped: 29687808 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 ms_handle_reset con 0x5648da21b800 session 0x5648d71b63c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d50781e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 heartbeat osd_stat(store_statfs(0x1b3536000/0x0/0x1bfc00000, data 0x5ca1d42/0x5df8000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 208 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:28.001913+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162422784 unmapped: 29655040 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2226861 data_alloc: 301989888 data_used: 16982016
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:29.002068+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162422784 unmapped: 29655040 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:30.002279+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162422784 unmapped: 29655040 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.155056000s of 11.181994438s, submitted: 322
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:31.002477+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162422784 unmapped: 29655040 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:32.002662+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162447360 unmapped: 29630464 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 209 heartbeat osd_stat(store_statfs(0x1b3532000/0x0/0x1bfc00000, data 0x5ca4221/0x5dfc000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:33.002813+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162480128 unmapped: 29597696 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2239705 data_alloc: 301989888 data_used: 16994304
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:34.002972+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162521088 unmapped: 29556736 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:35.003147+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 162521088 unmapped: 29556736 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 211 handle_osd_map epochs [211,212], i have 211, src has [1,212]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 212 ms_handle_reset con 0x5648da21a800 session 0x5648d6f1a3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 212 heartbeat osd_stat(store_statfs(0x1b3528000/0x0/0x1bfc00000, data 0x5ca8bcf/0x5e05000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:36.003662+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163586048 unmapped: 28491776 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 212 ms_handle_reset con 0x5648da21ac00 session 0x5648d5078f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 213 ms_handle_reset con 0x5648da21b400 session 0x5648d7b652c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21bc00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 213 ms_handle_reset con 0x5648da21bc00 session 0x5648d91372c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:37.003924+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163610624 unmapped: 28467200 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:38.004274+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163635200 unmapped: 28442624 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 214 handle_osd_map epochs [213,214], i have 214, src has [1,214]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2248565 data_alloc: 301989888 data_used: 17006592
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:39.004479+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163643392 unmapped: 28434432 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:40.004673+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 214 heartbeat osd_stat(store_statfs(0x1b351e000/0x0/0x1bfc00000, data 0x5cafab1/0x5e0e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163643392 unmapped: 28434432 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.887663841s of 10.340756416s, submitted: 144
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:41.004857+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163643392 unmapped: 28434432 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:42.005064+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163692544 unmapped: 28385280 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 215 heartbeat osd_stat(store_statfs(0x1b351b000/0x0/0x1bfc00000, data 0x5cb2000/0x5e12000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:43.005274+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163717120 unmapped: 28360704 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b3515000/0x0/0x1bfc00000, data 0x5cb454f/0x5e18000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2262344 data_alloc: 301989888 data_used: 17031168
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:44.005459+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163717120 unmapped: 28360704 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 216 ms_handle_reset con 0x5648d7a9ac00 session 0x5648dbda6000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 216 ms_handle_reset con 0x5648da21a800 session 0x5648d69f72c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:45.005731+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163725312 unmapped: 28352512 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b3514000/0x0/0x1bfc00000, data 0x5cb455f/0x5e19000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:46.005988+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163725312 unmapped: 28352512 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:47.006145+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163725312 unmapped: 28352512 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:48.006371+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163725312 unmapped: 28352512 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 216 ms_handle_reset con 0x5648da21ac00 session 0x5648d9136f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2264571 data_alloc: 301989888 data_used: 17031168
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:49.006526+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163758080 unmapped: 28319744 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 216 ms_handle_reset con 0x5648d930ac00 session 0x5648d9629e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 217 ms_handle_reset con 0x5648da21b400 session 0x5648dbda7e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:50.006691+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163799040 unmapped: 28278784 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 217 handle_osd_map epochs [217,218], i have 217, src has [1,218]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.795913696s of 10.002943039s, submitted: 63
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:51.006889+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b3509000/0x0/0x1bfc00000, data 0x5cb920e/0x5e23000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163848192 unmapped: 28229632 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 218 ms_handle_reset con 0x5648d930b800 session 0x5648d50792c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 218 ms_handle_reset con 0x5648d7a9ac00 session 0x5648dda50b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930a400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 218 ms_handle_reset con 0x5648d930a400 session 0x5648d6f1b680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:52.007086+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163880960 unmapped: 28196864 heap: 192077824 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b350b000/0x0/0x1bfc00000, data 0x5cb920e/0x5e23000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:53.007308+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 176627712 unmapped: 23846912 heap: 200474624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2634317 data_alloc: 301989888 data_used: 17047552
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:54.007521+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 168189952 unmapped: 32284672 heap: 200474624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b010d000/0x0/0x1bfc00000, data 0x90b9163/0x9221000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:55.007665+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 163995648 unmapped: 36478976 heap: 200474624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:56.007857+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164012032 unmapped: 36462592 heap: 200474624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:57.008028+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 168222720 unmapped: 32251904 heap: 200474624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:58.008227+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172474368 unmapped: 28000256 heap: 200474624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:59.008673+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3680321 data_alloc: 301989888 data_used: 17059840
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 heartbeat osd_stat(store_statfs(0x1a6d08000/0x0/0x1bfc00000, data 0x124bb5b4/0x12625000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,1,1,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 180862976 unmapped: 19611648 heap: 200474624 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:00.008822+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164102144 unmapped: 40574976 heap: 204677120 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.181935310s of 10.037582397s, submitted: 215
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:01.008966+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172531712 unmapped: 32145408 heap: 204677120 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648da21a800 session 0x5648d6f5a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 heartbeat osd_stat(store_statfs(0x1a2d09000/0x0/0x1bfc00000, data 0x164bb5b4/0x16625000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:02.009166+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 168361984 unmapped: 36315136 heap: 204677120 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6de9400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648d6de9400 session 0x5648d91370e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:03.009310+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 168370176 unmapped: 36306944 heap: 204677120 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:04.009465+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4812536 data_alloc: 301989888 data_used: 17059840
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 188342272 unmapped: 20537344 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648d7a9ac00 session 0x5648dda503c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:05.009610+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 176840704 unmapped: 32038912 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930a400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648d930a400 session 0x5648d9ffc1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:06.009795+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 heartbeat osd_stat(store_statfs(0x19a506000/0x0/0x1bfc00000, data 0x1ecbb74c/0x1ee28000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 168468480 unmapped: 40411136 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648da21ac00 session 0x5648d766d860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648d930ac00 session 0x5648d6cb83c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648da21a800 session 0x5648d71b7e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648d930b800 session 0x5648d7899e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:07.010020+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165314560 unmapped: 43565056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:08.010131+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6c28b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164904960 unmapped: 43974656 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930a400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 ms_handle_reset con 0x5648d930a400 session 0x5648d9ffcb40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:09.010283+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2395029 data_alloc: 301989888 data_used: 17059840
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164929536 unmapped: 43950080 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 220 ms_handle_reset con 0x5648d930ac00 session 0x5648d6f5a1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:10.010459+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164913152 unmapped: 43966464 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21a800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:11.010611+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 221 handle_osd_map epochs [220,221], i have 221, src has [1,221]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.425664902s of 10.253893852s, submitted: 260
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164929536 unmapped: 43950080 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 221 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 222 ms_handle_reset con 0x5648da21a800 session 0x5648dabdf2c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:12.010794+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 222 heartbeat osd_stat(store_statfs(0x1b34fa000/0x0/0x1bfc00000, data 0x5cc265e/0x5e31000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164937728 unmapped: 43941888 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 222 ms_handle_reset con 0x5648d7a9b400 session 0x5648d9e545a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:13.011029+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 222 ms_handle_reset con 0x5648d8922400 session 0x5648d5d77860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b8000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164864000 unmapped: 44015616 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 52
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 223 ms_handle_reset con 0x5648d85b8000 session 0x5648d71b6960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:14.011491+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2408354 data_alloc: 301989888 data_used: 17084416
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165060608 unmapped: 43819008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:15.011647+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165060608 unmapped: 43819008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:16.011961+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165060608 unmapped: 43819008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b34f8000/0x0/0x1bfc00000, data 0x5cc4c11/0x5e35000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:17.012126+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 224 heartbeat osd_stat(store_statfs(0x1b34f9000/0x0/0x1bfc00000, data 0x5cc4c11/0x5e35000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165060608 unmapped: 43819008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:18.012330+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165060608 unmapped: 43819008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 225 heartbeat osd_stat(store_statfs(0x1b34f4000/0x0/0x1bfc00000, data 0x5cc718c/0x5e39000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:19.012465+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2417147 data_alloc: 301989888 data_used: 17096704
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165060608 unmapped: 43819008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:20.012630+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 225 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d765e3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930a400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 225 ms_handle_reset con 0x5648d930a400 session 0x5648d7b641e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165060608 unmapped: 43819008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 225 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6cc0780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:21.012821+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.844905853s of 10.099040985s, submitted: 305
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165060608 unmapped: 43819008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 226 heartbeat osd_stat(store_statfs(0x1b34ea000/0x0/0x1bfc00000, data 0x5ccbe74/0x5e44000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:22.012969+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165076992 unmapped: 43802624 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 227 handle_osd_map epochs [227,228], i have 227, src has [1,228]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 228 handle_osd_map epochs [227,228], i have 228, src has [1,228]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:23.013143+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165068800 unmapped: 43810816 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 228 ms_handle_reset con 0x5648d7a9b400 session 0x5648d78981e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:24.013287+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2434378 data_alloc: 301989888 data_used: 17121280
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165093376 unmapped: 43786240 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b8000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:25.013430+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 228 ms_handle_reset con 0x5648d85b8000 session 0x5648d96294a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164913152 unmapped: 43966464 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 229 ms_handle_reset con 0x5648d8922400 session 0x5648d6ab54a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 229 heartbeat osd_stat(store_statfs(0x1b2b47000/0x0/0x1bfc00000, data 0x6668df9/0x67e6000, compress 0x0/0x0/0x0, omap 0x648, meta 0x68cf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:26.013645+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164929536 unmapped: 43950080 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 229 ms_handle_reset con 0x5648d930ac00 session 0x5648d765e5a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:27.013844+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164962304 unmapped: 43917312 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 229 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d96c7a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b8000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 230 ms_handle_reset con 0x5648d85b8000 session 0x5648d6a94b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:28.013985+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 230 ms_handle_reset con 0x5648d7a9b400 session 0x5648d6c8ba40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164642816 unmapped: 44236800 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 230 handle_osd_map epochs [230,231], i have 230, src has [1,231]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 231 ms_handle_reset con 0x5648d930b800 session 0x5648d9ffcf00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648da21ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 231 ms_handle_reset con 0x5648d8922400 session 0x5648d9d865a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:29.014176+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2629276 data_alloc: 301989888 data_used: 17145856
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164667392 unmapped: 44212224 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 ms_handle_reset con 0x5648da21ac00 session 0x5648dabde1e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d71b65a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:30.014380+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164691968 unmapped: 44187648 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9b400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 ms_handle_reset con 0x5648d7a9b400 session 0x5648d6be2b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b8000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:31.014535+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 ms_handle_reset con 0x5648d85b8000 session 0x5648d6f5b680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 164749312 unmapped: 44130304 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 ms_handle_reset con 0x5648d930b800 session 0x5648d69f65a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 heartbeat osd_stat(store_statfs(0x1b1c26000/0x0/0x1bfc00000, data 0x718442b/0x7308000, compress 0x0/0x0/0x0, omap 0x648, meta 0x6ccf9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d48400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:32.014689+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.696251869s of 10.818823814s, submitted: 302
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 ms_handle_reset con 0x5648d6d48400 session 0x5648d6f1b680
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 169041920 unmapped: 39837696 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 ms_handle_reset con 0x5648d930b800 session 0x5648d6ab4b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 ms_handle_reset con 0x5648d6d17c00 session 0x5648d9137860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 233 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d5d76780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 233 heartbeat osd_stat(store_statfs(0x1afbf6000/0x0/0x1bfc00000, data 0x8013907/0x8197000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:33.014859+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 233 ms_handle_reset con 0x5648d8922400 session 0x5648d6f1bc20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165904384 unmapped: 42975232 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:34.015032+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2751157 data_alloc: 301989888 data_used: 17158144
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165904384 unmapped: 42975232 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 235 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6f1b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:35.015174+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d48400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 235 ms_handle_reset con 0x5648d6d48400 session 0x5648d6ab4d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165896192 unmapped: 42983424 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 235 heartbeat osd_stat(store_statfs(0x1afbed000/0x0/0x1bfc00000, data 0x8018472/0x819f000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:36.015381+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165896192 unmapped: 42983424 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:37.015524+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165371904 unmapped: 43507712 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 236 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 237 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6ab54a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:38.015676+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165421056 unmapped: 43458560 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 237 heartbeat osd_stat(store_statfs(0x1b13ff000/0x0/0x1bfc00000, data 0x667b815/0x6802000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:39.030424+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2570314 data_alloc: 301989888 data_used: 17166336
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 165421056 unmapped: 43458560 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 238 ms_handle_reset con 0x5648d930b800 session 0x5648d6cc0780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:40.030584+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166535168 unmapped: 42344448 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8922800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 238 ms_handle_reset con 0x5648d8922800 session 0x5648d5d77860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:41.030749+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166535168 unmapped: 42344448 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:42.030917+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.142984390s of 10.106576920s, submitted: 303
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 239 heartbeat osd_stat(store_statfs(0x1b1f1d000/0x0/0x1bfc00000, data 0x5ce7e6e/0x5e70000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166551552 unmapped: 42328064 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:43.031092+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166551552 unmapped: 42328064 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:44.031284+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2499969 data_alloc: 301989888 data_used: 17166336
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166551552 unmapped: 42328064 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:45.031444+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166551552 unmapped: 42328064 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:46.031642+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 241 heartbeat osd_stat(store_statfs(0x1b1f17000/0x0/0x1bfc00000, data 0x5cec7dd/0x5e76000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166395904 unmapped: 42483712 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:47.031831+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 166395904 unmapped: 42483712 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 242 ms_handle_reset con 0x5648d6d17c00 session 0x5648dabdfa40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:48.031970+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d48400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 242 ms_handle_reset con 0x5648d6d48400 session 0x5648dabdf2c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 242 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6c283c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 242 ms_handle_reset con 0x5648d930b800 session 0x5648d6c28960
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d8923400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167452672 unmapped: 41426944 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 242 ms_handle_reset con 0x5648d8923400 session 0x5648d6c29860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:49.032168+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2588253 data_alloc: 301989888 data_used: 17166336
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167993344 unmapped: 40886272 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 242 heartbeat osd_stat(store_statfs(0x1b151c000/0x0/0x1bfc00000, data 0x66e4253/0x6872000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:50.032323+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167993344 unmapped: 40886272 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:51.032986+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167993344 unmapped: 40886272 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:52.033084+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.719498634s of 10.007939339s, submitted: 128
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167788544 unmapped: 41091072 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 242 handle_osd_map epochs [242,243], i have 242, src has [1,243]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:53.033295+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b14ec000/0x0/0x1bfc00000, data 0x67106af/0x68a1000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167788544 unmapped: 41091072 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:54.033476+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2607361 data_alloc: 301989888 data_used: 18227200
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:55.033683+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:56.033896+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:57.034058+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:58.034272+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b14ea000/0x0/0x1bfc00000, data 0x67107e5/0x68a3000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:59.034410+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2609129 data_alloc: 301989888 data_used: 18227200
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:00.034583+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:01.034745+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:02.034921+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.922871590s of 10.008887291s, submitted: 26
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:03.035126+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b14ec000/0x0/0x1bfc00000, data 0x671074a/0x68a2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b14ec000/0x0/0x1bfc00000, data 0x671074a/0x68a2000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:04.035297+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2607335 data_alloc: 301989888 data_used: 18231296
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167796736 unmapped: 41082880 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:05.035455+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 167813120 unmapped: 41066496 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:06.035639+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173375488 unmapped: 35504128 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b063b000/0x0/0x1bfc00000, data 0x75b87e5/0x774b000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:07.035824+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172523520 unmapped: 36356096 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:08.035952+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:09.036140+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2737639 data_alloc: 301989888 data_used: 18243584
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:10.036336+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:11.036555+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:12.036709+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b0594000/0x0/0x1bfc00000, data 0x76639bd/0x77f9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:13.036865+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.496051788s of 11.011401176s, submitted: 115
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:14.037036+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2740147 data_alloc: 301989888 data_used: 18243584
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:15.037149+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:16.037363+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:17.037571+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:18.037786+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b0574000/0x0/0x1bfc00000, data 0x7685922/0x781a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:19.037985+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2739901 data_alloc: 301989888 data_used: 18243584
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172859392 unmapped: 36020224 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b0568000/0x0/0x1bfc00000, data 0x76908dc/0x7825000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:20.038158+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172859392 unmapped: 36020224 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:21.038351+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172859392 unmapped: 36020224 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:22.038504+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172859392 unmapped: 36020224 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b0568000/0x0/0x1bfc00000, data 0x7690815/0x7824000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:23.038707+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b0568000/0x0/0x1bfc00000, data 0x7690815/0x7824000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172875776 unmapped: 36003840 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:24.038846+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2739163 data_alloc: 301989888 data_used: 18243584
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172875776 unmapped: 36003840 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:25.039015+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172875776 unmapped: 36003840 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 243 handle_osd_map epochs [243,244], i have 243, src has [1,244]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.572092056s of 12.660041809s, submitted: 17
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:26.039239+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172875776 unmapped: 36003840 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 244 heartbeat osd_stat(store_statfs(0x1b0565000/0x0/0x1bfc00000, data 0x7692db1/0x7828000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:27.039470+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172892160 unmapped: 35987456 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:28.039638+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172892160 unmapped: 35987456 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d48400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:29.039795+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2743961 data_alloc: 301989888 data_used: 18255872
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172908544 unmapped: 35971072 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:30.039989+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172908544 unmapped: 35971072 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7a9ac00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:31.040132+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 244 ms_handle_reset con 0x5648d930b800 session 0x5648d6f1b4a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 244 heartbeat osd_stat(store_statfs(0x1b055f000/0x0/0x1bfc00000, data 0x7697e7a/0x782e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172941312 unmapped: 35938304 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:32.040310+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 244 handle_osd_map epochs [244,245], i have 244, src has [1,245]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648db42a400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648db42a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 245 ms_handle_reset con 0x5648db42a400 session 0x5648d6cc05a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172957696 unmapped: 35921920 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 246 ms_handle_reset con 0x5648db42a000 session 0x5648d7898d20
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d51c2c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 246 ms_handle_reset con 0x5648d51c2c00 session 0x5648d92d0b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 246 ms_handle_reset con 0x5648d7a9ac00 session 0x5648d6cb2f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:33.040479+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172998656 unmapped: 35880960 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d51c2c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:34.040624+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 246 ms_handle_reset con 0x5648d930b800 session 0x5648d6a14780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2756706 data_alloc: 301989888 data_used: 18268160
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172998656 unmapped: 35880960 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:35.040808+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 246 handle_osd_map epochs [246,247], i have 246, src has [1,247]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648db42a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648db42a400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 247 ms_handle_reset con 0x5648db42a000 session 0x5648d96c74a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173023232 unmapped: 35856384 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:36.041029+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.033048630s of 10.294373512s, submitted: 71
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 248 ms_handle_reset con 0x5648db42a400 session 0x5648d765e3c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 248 ms_handle_reset con 0x5648d51c2c00 session 0x5648dbda6f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173015040 unmapped: 35864576 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b0549000/0x0/0x1bfc00000, data 0x76a1a09/0x7843000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:37.041280+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173096960 unmapped: 35782656 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:38.041433+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d51c2400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173096960 unmapped: 35782656 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:39.041614+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 249 ms_handle_reset con 0x5648d51c2400 session 0x5648d9d86000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2775696 data_alloc: 301989888 data_used: 18296832
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173105152 unmapped: 35774464 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:40.041794+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 53
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d51c2c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173293568 unmapped: 35586048 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:41.041996+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d930b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 249 heartbeat osd_stat(store_statfs(0x1b0544000/0x0/0x1bfc00000, data 0x76a424a/0x7849000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648db42a000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 249 ms_handle_reset con 0x5648db42a000 session 0x5648d6cb4b40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173293568 unmapped: 35586048 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648db42a400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d51c3800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 250 ms_handle_reset con 0x5648db42a400 session 0x5648d69f7860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:42.042128+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173293568 unmapped: 35586048 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 250 handle_osd_map epochs [250,251], i have 250, src has [1,251]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 251 ms_handle_reset con 0x5648d51c3800 session 0x5648d6a87a40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 251 ms_handle_reset con 0x5648d930b800 session 0x5648d71b65a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:43.042314+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173293568 unmapped: 35586048 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:44.042506+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2781288 data_alloc: 301989888 data_used: 18309120
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173293568 unmapped: 35586048 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648dc022c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:45.042689+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173293568 unmapped: 35586048 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 251 handle_osd_map epochs [251,252], i have 251, src has [1,252]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 251 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 252 ms_handle_reset con 0x5648dc022c00 session 0x5648d7898780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:46.042902+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173301760 unmapped: 35577856 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d51c3800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.472881317s of 10.782907486s, submitted: 94
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:47.043071+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b053b000/0x0/0x1bfc00000, data 0x76aaf61/0x7852000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173301760 unmapped: 35577856 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b053c000/0x0/0x1bfc00000, data 0x76aae9a/0x7851000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 252 handle_osd_map epochs [252,253], i have 252, src has [1,253]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:48.043274+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 253 ms_handle_reset con 0x5648d51c3800 session 0x5648d6f1b0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648dadf0c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 253 ms_handle_reset con 0x5648dadf0c00 session 0x5648d9e554a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173301760 unmapped: 35577856 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 253 handle_osd_map epochs [253,254], i have 253, src has [1,254]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:49.043441+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 254 ms_handle_reset con 0x5648d6d48400 session 0x5648d91363c0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2786343 data_alloc: 301989888 data_used: 18305024
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173301760 unmapped: 35577856 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:50.043630+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 254 ms_handle_reset con 0x5648d6d17c00 session 0x5648d6cb9e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173301760 unmapped: 35577856 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648db42b800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 254 handle_osd_map epochs [254,255], i have 254, src has [1,255]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:51.043819+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 255 heartbeat osd_stat(store_statfs(0x1b0537000/0x0/0x1bfc00000, data 0x76af4f1/0x7856000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 255 ms_handle_reset con 0x5648db42b800 session 0x5648d6c2a780
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172163072 unmapped: 36716544 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:52.043959+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172171264 unmapped: 36708352 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:53.044132+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172138496 unmapped: 36741120 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 256 handle_osd_map epochs [256,257], i have 256, src has [1,257]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:54.044302+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2595377 data_alloc: 301989888 data_used: 17252352
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172138496 unmapped: 36741120 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:55.044495+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648dad7c400
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172138496 unmapped: 36741120 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:56.044674+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 258 heartbeat osd_stat(store_statfs(0x1b1ecf000/0x0/0x1bfc00000, data 0x5d14579/0x5ebf000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173285376 unmapped: 35594240 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.164410591s of 10.002748489s, submitted: 272
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:57.044826+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173301760 unmapped: 35577856 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 260 ms_handle_reset con 0x5648dad7c400 session 0x5648d9ffd0e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648dad7c000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:58.044980+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 260 ms_handle_reset con 0x5648dad7c000 session 0x5648d92d10e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173318144 unmapped: 35561472 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:59.045129+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2609715 data_alloc: 301989888 data_used: 17264640
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173318144 unmapped: 35561472 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:00.045275+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173318144 unmapped: 35561472 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:01.045430+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173318144 unmapped: 35561472 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 260 heartbeat osd_stat(store_statfs(0x1b1e7f000/0x0/0x1bfc00000, data 0x5d6148a/0x5f0e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:02.045591+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172589056 unmapped: 36290560 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:03.045725+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172597248 unmapped: 36282368 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 261 handle_osd_map epochs [261,262], i have 261, src has [1,262]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:04.045875+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2614875 data_alloc: 301989888 data_used: 17276928
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172597248 unmapped: 36282368 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:05.046035+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 262 handle_osd_map epochs [261,262], i have 262, src has [1,262]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172785664 unmapped: 36093952 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:06.046266+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 172965888 unmapped: 35913728 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:07.046428+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b1e12000/0x0/0x1bfc00000, data 0x5dcafd5/0x5f7b000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 262 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 262 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.699213982s of 10.159156799s, submitted: 173
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 262 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 263 heartbeat osd_stat(store_statfs(0x1b1e0d000/0x0/0x1bfc00000, data 0x5dcd534/0x5f7f000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173162496 unmapped: 35717120 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:08.046643+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174268416 unmapped: 34611200 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 264 handle_osd_map epochs [264,265], i have 264, src has [1,265]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:09.046804+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2632895 data_alloc: 301989888 data_used: 17276928
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173424640 unmapped: 35454976 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 265 heartbeat osd_stat(store_statfs(0x1b1e03000/0x0/0x1bfc00000, data 0x5dd2054/0x5f88000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:10.046988+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d51c3800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 265 ms_handle_reset con 0x5648d51c3800 session 0x5648d6a7e000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173424640 unmapped: 35454976 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 265 handle_osd_map epochs [265,266], i have 265, src has [1,266]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:11.047150+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17800
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 266 ms_handle_reset con 0x5648d6d17800 session 0x5648d9136f00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173432832 unmapped: 35446784 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:12.047346+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 266 heartbeat osd_stat(store_statfs(0x1b1de6000/0x0/0x1bfc00000, data 0x5defb46/0x5fa7000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 266 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 266 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 267 heartbeat osd_stat(store_statfs(0x1b1de1000/0x0/0x1bfc00000, data 0x5df20f9/0x5fab000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173449216 unmapped: 35430400 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d6d17c00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 267 ms_handle_reset con 0x5648d6d17c00 session 0x5648d7b654a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:13.047560+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173580288 unmapped: 35299328 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:14.047758+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 269 handle_osd_map epochs [268,269], i have 269, src has [1,269]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2644065 data_alloc: 301989888 data_used: 17289216
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 173588480 unmapped: 35291136 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:15.047929+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174637056 unmapped: 34242560 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:16.048119+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174825472 unmapped: 34054144 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:17.048277+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174825472 unmapped: 34054144 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.509747505s of 10.819299698s, submitted: 100
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:18.048453+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 270 heartbeat osd_stat(store_statfs(0x1b1d76000/0x0/0x1bfc00000, data 0x5e59dd5/0x6017000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174850048 unmapped: 34029568 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:19.048585+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2650367 data_alloc: 301989888 data_used: 17289216
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174907392 unmapped: 33972224 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:20.048768+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174907392 unmapped: 33972224 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:21.048890+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174907392 unmapped: 33972224 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:22.049082+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 270 heartbeat osd_stat(store_statfs(0x1b1d29000/0x0/0x1bfc00000, data 0x5ea7d5c/0x6065000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175038464 unmapped: 33841152 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:23.049361+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175046656 unmapped: 33832960 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:24.049551+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2657981 data_alloc: 301989888 data_used: 17301504
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175046656 unmapped: 33832960 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b1cfb000/0x0/0x1bfc00000, data 0x5ed2f01/0x6092000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:25.049730+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175546368 unmapped: 33333248 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:26.049944+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175546368 unmapped: 33333248 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:27.050104+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175554560 unmapped: 33325056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:28.050297+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.350198746s of 10.538918495s, submitted: 68
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175677440 unmapped: 33202176 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:29.050454+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2663841 data_alloc: 301989888 data_used: 17301504
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175677440 unmapped: 33202176 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:30.050643+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175628288 unmapped: 33251328 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b1c86000/0x0/0x1bfc00000, data 0x5f48478/0x6108000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:31.050836+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175628288 unmapped: 33251328 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:32.051022+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174735360 unmapped: 34144256 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:33.051159+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174735360 unmapped: 34144256 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b1c45000/0x0/0x1bfc00000, data 0x5f89178/0x6149000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:34.051303+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670131 data_alloc: 301989888 data_used: 17301504
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 174735360 unmapped: 34144256 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:35.051458+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175882240 unmapped: 32997376 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:36.051674+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175882240 unmapped: 32997376 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:37.051814+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175890432 unmapped: 32989184 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:38.051987+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b1bd2000/0x0/0x1bfc00000, data 0x5ffb735/0x61bc000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 175890432 unmapped: 32989184 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.197720528s of 10.368879318s, submitted: 35
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:39.052157+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2680911 data_alloc: 301989888 data_used: 17301504
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 176103424 unmapped: 32776192 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:40.052282+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 176316416 unmapped: 32563200 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:41.052482+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 176316416 unmapped: 32563200 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:42.052629+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 176447488 unmapped: 32432128 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 271 ms_handle_reset con 0x5648d51c2c00 session 0x5648d9629e00
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:43.052832+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b1b5a000/0x0/0x1bfc00000, data 0x607181b/0x6234000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 271 heartbeat osd_stat(store_statfs(0x1b1b56000/0x0/0x1bfc00000, data 0x60758ce/0x6238000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 177889280 unmapped: 30990336 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 54
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:44.052990+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2681897 data_alloc: 301989888 data_used: 17301504
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 177864704 unmapped: 31014912 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:45.053153+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 177938432 unmapped: 30941184 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:46.053319+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 177954816 unmapped: 30924800 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:47.053544+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 272 heartbeat osd_stat(store_statfs(0x1b1b13000/0x0/0x1bfc00000, data 0x60b7bd1/0x627a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x7e6f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 177954816 unmapped: 30924800 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:48.053687+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 177954816 unmapped: 30924800 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:49.053859+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.101608276s of 10.365187645s, submitted: 264
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 272 heartbeat osd_stat(store_statfs(0x1b16ce000/0x0/0x1bfc00000, data 0x60fcec7/0x62c0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x826f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2693071 data_alloc: 301989888 data_used: 17317888
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 178094080 unmapped: 30785536 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/321501752' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 272 heartbeat osd_stat(store_statfs(0x1b16ce000/0x0/0x1bfc00000, data 0x60fcec7/0x62c0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x826f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:50.053996+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 179314688 unmapped: 29564928 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:51.054171+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 272 heartbeat osd_stat(store_statfs(0x1b16b8000/0x0/0x1bfc00000, data 0x6112b60/0x62d6000, compress 0x0/0x0/0x0, omap 0x648, meta 0x826f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 178446336 unmapped: 30433280 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:52.054344+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 178675712 unmapped: 30203904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:53.054587+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 178683904 unmapped: 30195712 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:54.054789+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2706533 data_alloc: 301989888 data_used: 17330176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 178683904 unmapped: 30195712 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:55.054954+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 178683904 unmapped: 30195712 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:56.055146+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 178683904 unmapped: 30195712 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b15e1000/0x0/0x1bfc00000, data 0x61e9184/0x63ad000, compress 0x0/0x0/0x0, omap 0x648, meta 0x826f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:57.055362+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 178683904 unmapped: 30195712 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:58.055553+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 179011584 unmapped: 29868032 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:59.055806+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.752937317s of 10.021323204s, submitted: 61
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2715909 data_alloc: 301989888 data_used: 17330176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 181379072 unmapped: 27500544 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:00.056002+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 181518336 unmapped: 27361280 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:01.056225+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b03a3000/0x0/0x1bfc00000, data 0x6288933/0x644b000, compress 0x0/0x0/0x0, omap 0x648, meta 0x940f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 180797440 unmapped: 28082176 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:02.056497+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 180903936 unmapped: 27975680 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:03.056687+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 180953088 unmapped: 27926528 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b0359000/0x0/0x1bfc00000, data 0x62cfb3f/0x6494000, compress 0x0/0x0/0x0, omap 0x648, meta 0x940f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:04.056861+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2729475 data_alloc: 301989888 data_used: 17330176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 181198848 unmapped: 27680768 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:05.057052+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 182304768 unmapped: 26574848 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:06.057261+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b030f000/0x0/0x1bfc00000, data 0x631b74a/0x64df000, compress 0x0/0x0/0x0, omap 0x648, meta 0x940f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b02e9000/0x0/0x1bfc00000, data 0x634164b/0x6505000, compress 0x0/0x0/0x0, omap 0x648, meta 0x940f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 182304768 unmapped: 26574848 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:07.057448+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 182468608 unmapped: 26411008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:08.057629+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 182468608 unmapped: 26411008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:09.057767+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.729795456s of 10.076586723s, submitted: 73
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2732421 data_alloc: 301989888 data_used: 17330176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 182468608 unmapped: 26411008 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:10.057939+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 184016896 unmapped: 24862720 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:11.058100+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b024d000/0x0/0x1bfc00000, data 0x63dded8/0x65a1000, compress 0x0/0x0/0x0, omap 0x648, meta 0x940f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 184295424 unmapped: 24584192 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:12.058301+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 184295424 unmapped: 24584192 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:13.058494+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 183508992 unmapped: 25370624 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:14.058691+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2746027 data_alloc: 301989888 data_used: 17330176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 183582720 unmapped: 25296896 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:15.058861+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 183590912 unmapped: 25288704 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:16.059088+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 183754752 unmapped: 25124864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:17.059304+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b0190000/0x0/0x1bfc00000, data 0x649acb8/0x665e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x940f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 184811520 unmapped: 24068096 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:18.059553+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 184811520 unmapped: 24068096 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:19.059787+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2754385 data_alloc: 301989888 data_used: 17330176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 185016320 unmapped: 23863296 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:20.059967+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.880260468s of 11.232035637s, submitted: 78
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 183754752 unmapped: 25124864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:21.060151+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 183861248 unmapped: 25018368 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:22.060355+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 184090624 unmapped: 24788992 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:23.060517+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b009f000/0x0/0x1bfc00000, data 0x658c591/0x674f000, compress 0x0/0x0/0x0, omap 0x648, meta 0x940f9b8), peers [0,1,3,4,5] op hist [2])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 185253888 unmapped: 23625728 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:24.060716+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2763767 data_alloc: 301989888 data_used: 17330176
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 185253888 unmapped: 23625728 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:25.060874+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b1056000/0x0/0x1bfc00000, data 0x65d662f/0x6798000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 185532416 unmapped: 23347200 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:26.061073+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 185532416 unmapped: 23347200 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:27.061275+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 185638912 unmapped: 23240704 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:28.061476+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b1014000/0x0/0x1bfc00000, data 0x6615d8d/0x67d9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 185999360 unmapped: 22880256 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:29.061692+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2768005 data_alloc: 301989888 data_used: 17342464
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 186007552 unmapped: 22872064 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:30.061991+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.690424919s of 10.002165794s, submitted: 82
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 186023936 unmapped: 22855680 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:31.062182+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 187072512 unmapped: 21807104 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:32.062380+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 187072512 unmapped: 21807104 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:33.062636+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 187080704 unmapped: 21798912 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:34.062774+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 275 heartbeat osd_stat(store_statfs(0x1b0f4a000/0x0/0x1bfc00000, data 0x66dcf2f/0x68a3000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2786099 data_alloc: 301989888 data_used: 17354752
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 186703872 unmapped: 22175744 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:35.062922+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 186900480 unmapped: 21979136 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:36.063151+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 186900480 unmapped: 21979136 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:37.063317+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 275 heartbeat osd_stat(store_statfs(0x1b0ec8000/0x0/0x1bfc00000, data 0x675fd76/0x6926000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 186089472 unmapped: 22790144 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:38.063490+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 186089472 unmapped: 22790144 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:39.063675+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2799377 data_alloc: 301989888 data_used: 17354752
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 187211776 unmapped: 21667840 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:40.063828+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 275 heartbeat osd_stat(store_statfs(0x1b0e44000/0x0/0x1bfc00000, data 0x67e43a1/0x69aa000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.643847466s of 10.003645897s, submitted: 83
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 275 handle_osd_map epochs [275,276], i have 275, src has [1,276]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 187613184 unmapped: 21266432 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:41.063983+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 187891712 unmapped: 20987904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:42.064159+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 187924480 unmapped: 20955136 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:43.064310+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 276 heartbeat osd_stat(store_statfs(0x1b0da3000/0x0/0x1bfc00000, data 0x68819dd/0x6a4a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 188170240 unmapped: 20709376 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:44.064510+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2814499 data_alloc: 301989888 data_used: 17367040
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 186687488 unmapped: 22192128 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:45.064634+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 276 heartbeat osd_stat(store_statfs(0x1b0d01000/0x0/0x1bfc00000, data 0x6921f7d/0x6aec000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 187842560 unmapped: 21037056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:46.064874+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 188252160 unmapped: 20627456 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:47.065080+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 276 handle_osd_map epochs [276,277], i have 276, src has [1,277]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 188252160 unmapped: 20627456 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:48.065261+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 277 heartbeat osd_stat(store_statfs(0x1b0cfd000/0x0/0x1bfc00000, data 0x69263f5/0x6af0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 188366848 unmapped: 20512768 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:49.065463+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2829923 data_alloc: 301989888 data_used: 17379328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 188465152 unmapped: 20414464 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:50.065640+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.587804794s of 10.003247261s, submitted: 110
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 188473344 unmapped: 20406272 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:51.065827+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 277 heartbeat osd_stat(store_statfs(0x1b0c96000/0x0/0x1bfc00000, data 0x698d1d7/0x6b58000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189587456 unmapped: 19292160 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:52.066088+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189587456 unmapped: 19292160 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:53.066301+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189587456 unmapped: 19292160 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:54.066477+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 277 heartbeat osd_stat(store_statfs(0x1b0c24000/0x0/0x1bfc00000, data 0x69fec9e/0x6bca000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [0,0,0,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2842281 data_alloc: 301989888 data_used: 17379328
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189620224 unmapped: 19259392 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:55.066673+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 277 handle_osd_map epochs [277,278], i have 277, src has [1,278]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189685760 unmapped: 19193856 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:56.066912+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189702144 unmapped: 19177472 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:57.067092+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189816832 unmapped: 19062784 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:58.067279+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189292544 unmapped: 19587072 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:59.067489+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2854995 data_alloc: 301989888 data_used: 17391616
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 189292544 unmapped: 19587072 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:00.067639+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.618044853s of 10.007405281s, submitted: 89
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 278 heartbeat osd_stat(store_statfs(0x1b0b28000/0x0/0x1bfc00000, data 0x6afa7c0/0x6cc5000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:01.067811+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 190341120 unmapped: 18538496 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 278 heartbeat osd_stat(store_statfs(0x1b0af5000/0x0/0x1bfc00000, data 0x6b2ebed/0x6cf9000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:02.067962+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 191488000 unmapped: 17391616 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 278 heartbeat osd_stat(store_statfs(0x1b0ab1000/0x0/0x1bfc00000, data 0x6b72c9f/0x6d3d000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 278 handle_osd_map epochs [279,279], i have 278, src has [1,279]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:03.068141+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 191488000 unmapped: 17391616 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:04.068348+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 191488000 unmapped: 17391616 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2871537 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:05.068545+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 192249856 unmapped: 16629760 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:06.068804+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 192323584 unmapped: 16556032 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:07.068966+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 192323584 unmapped: 16556032 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:08.069129+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b0983000/0x0/0x1bfc00000, data 0x6c9c6c1/0x6e6a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 192651264 unmapped: 16228352 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:09.069287+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 192684032 unmapped: 16195584 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2880673 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:10.069508+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 192684032 unmapped: 16195584 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.597756386s of 10.000662804s, submitted: 90
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b0912000/0x0/0x1bfc00000, data 0x6d0e735/0x6edc000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:11.069683+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 193888256 unmapped: 14991360 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:12.069863+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 193896448 unmapped: 14983168 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:13.070108+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 193896448 unmapped: 14983168 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:14.070314+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 194428928 unmapped: 14450688 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b08c0000/0x0/0x1bfc00000, data 0x6d5f421/0x6f2d000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2899901 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b0851000/0x0/0x1bfc00000, data 0x6dce63a/0x6f9d000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:15.070520+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 194109440 unmapped: 14770176 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:16.070788+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195371008 unmapped: 13508608 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:17.070974+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195723264 unmapped: 13156352 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:18.071150+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195518464 unmapped: 13361152 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:19.071360+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b0785000/0x0/0x1bfc00000, data 0x6e97fa7/0x7067000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195518464 unmapped: 13361152 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2910897 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:20.071559+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195870720 unmapped: 13008896 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b0785000/0x0/0x1bfc00000, data 0x6e97fa7/0x7067000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.620576859s of 10.000597000s, submitted: 86
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:21.071746+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 196698112 unmapped: 12181504 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:22.071954+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 196706304 unmapped: 12173312 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:23.072175+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 197181440 unmapped: 11698176 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:24.072401+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195346432 unmapped: 13533184 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2923369 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:25.072593+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195411968 unmapped: 13467648 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:26.072791+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b066e000/0x0/0x1bfc00000, data 0x6fb0e06/0x7180000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195592192 unmapped: 13287424 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:27.072929+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 195592192 unmapped: 13287424 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:28.073141+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 196673536 unmapped: 12206080 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:29.073339+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 197025792 unmapped: 11853824 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2932487 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:30.073563+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 197025792 unmapped: 11853824 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.618735313s of 10.003035545s, submitted: 82
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b05d7000/0x0/0x1bfc00000, data 0x704892b/0x7217000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:31.073756+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 197795840 unmapped: 11083776 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:32.073944+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198246400 unmapped: 10633216 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:33.074098+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198246400 unmapped: 10633216 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:34.074294+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 197885952 unmapped: 10993664 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b0506000/0x0/0x1bfc00000, data 0x711a87d/0x72e8000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b0506000/0x0/0x1bfc00000, data 0x711a87d/0x72e8000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2935623 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:35.074538+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:36.074769+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:37.074975+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:38.075163+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:39.075500+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934933 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:40.075803+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:41.076033+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:42.076303+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:43.076495+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:44.076650+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.703669548s of 13.882040977s, submitted: 40
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2935565 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:45.076839+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:46.077068+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04d9000/0x0/0x1bfc00000, data 0x714769c/0x7314000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:47.077283+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:48.077520+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04d9000/0x0/0x1bfc00000, data 0x714769c/0x7314000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:49.077746+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04da000/0x0/0x1bfc00000, data 0x714769a/0x7314000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:50.077896+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2935613 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:51.078094+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:52.078282+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:53.078479+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:54.078698+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04d9000/0x0/0x1bfc00000, data 0x714769d/0x7314000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:55.078899+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2936339 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:56.079132+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.882432938s of 11.947295189s, submitted: 12
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04da000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:57.079337+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04da000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198082560 unmapped: 10797056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:58.079569+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:59.079756+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:00.079931+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934571 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:01.080133+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:02.080359+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:03.080581+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:04.080791+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:05.081018+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934747 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:06.081266+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:07.081439+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:08.081625+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198090752 unmapped: 10788864 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:09.081796+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198098944 unmapped: 10780672 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:10.081954+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934747 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198098944 unmapped: 10780672 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:11.082133+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198098944 unmapped: 10780672 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:12.082576+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198098944 unmapped: 10780672 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:13.082847+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198098944 unmapped: 10780672 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:14.083014+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198098944 unmapped: 10780672 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:15.083247+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934747 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198098944 unmapped: 10780672 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:16.083429+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198098944 unmapped: 10780672 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:17.083595+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198107136 unmapped: 10772480 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:18.083751+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198107136 unmapped: 10772480 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:19.083932+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198107136 unmapped: 10772480 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:20.084126+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934747 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198107136 unmapped: 10772480 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.131034851s of 24.149797440s, submitted: 3
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:21.084412+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198107136 unmapped: 10772480 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:22.084708+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198107136 unmapped: 10772480 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:23.084892+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198107136 unmapped: 10772480 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:24.085082+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198107136 unmapped: 10772480 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:25.085312+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934763 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198115328 unmapped: 10764288 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:26.085542+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198115328 unmapped: 10764288 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:27.085777+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198115328 unmapped: 10764288 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:28.085959+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198115328 unmapped: 10764288 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:29.086148+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198115328 unmapped: 10764288 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:30.086330+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934763 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198115328 unmapped: 10764288 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04db000/0x0/0x1bfc00000, data 0x71475d4/0x7313000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.552414894s of 10.561663628s, submitted: 1
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:31.086483+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198115328 unmapped: 10764288 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:32.086651+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198115328 unmapped: 10764288 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:33.086833+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04d9000/0x0/0x1bfc00000, data 0x714769d/0x7314000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198131712 unmapped: 10747904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:34.087016+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198131712 unmapped: 10747904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04da000/0x0/0x1bfc00000, data 0x714769b/0x7314000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:35.087270+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2936339 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198131712 unmapped: 10747904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:36.087495+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04da000/0x0/0x1bfc00000, data 0x714769b/0x7314000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198131712 unmapped: 10747904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:37.087723+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b04da000/0x0/0x1bfc00000, data 0x714769b/0x7314000, compress 0x0/0x0/0x0, omap 0x648, meta 0x840f9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198131712 unmapped: 10747904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:38.087930+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198131712 unmapped: 10747904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:39.088118+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198131712 unmapped: 10747904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:40.088306+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2934747 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 198131712 unmapped: 10747904 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:41.088475+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200253440 unmapped: 8626176 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:42.088685+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200253440 unmapped: 8626176 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af2fb000/0x0/0x1bfc00000, data 0x718752a/0x7353000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:43.088883+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200261632 unmapped: 8617984 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:44.089085+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.175740242s of 13.276747704s, submitted: 18
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200376320 unmapped: 8503296 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:45.089262+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2948703 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200376320 unmapped: 8503296 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:46.089499+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200376320 unmapped: 8503296 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:47.089728+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af29e000/0x0/0x1bfc00000, data 0x71e3ae5/0x73b0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200597504 unmapped: 8282112 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:48.090046+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200597504 unmapped: 8282112 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:49.090285+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af29e000/0x0/0x1bfc00000, data 0x71e3ae5/0x73b0000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200605696 unmapped: 8273920 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:50.090458+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2948255 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200605696 unmapped: 8273920 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:51.090636+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200605696 unmapped: 8273920 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:52.090801+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af268000/0x0/0x1bfc00000, data 0x7219163/0x73e6000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 200613888 unmapped: 8265728 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:53.090951+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 199983104 unmapped: 8896512 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:54.091142+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 199983104 unmapped: 8896512 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:55.091285+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2957459 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.933197975s of 11.092078209s, submitted: 31
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201154560 unmapped: 7725056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:56.091519+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201154560 unmapped: 7725056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:57.091725+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201154560 unmapped: 7725056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:58.091891+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af1c7000/0x0/0x1bfc00000, data 0x72b8ac1/0x7487000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201154560 unmapped: 7725056 heap: 208879616 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:59.092102+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201170944 unmapped: 8757248 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:00.092307+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2965663 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201277440 unmapped: 8650752 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:01.092483+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201277440 unmapped: 8650752 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:02.092660+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201850880 unmapped: 8077312 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af143000/0x0/0x1bfc00000, data 0x733d34a/0x750b000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:03.092827+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201850880 unmapped: 8077312 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:04.093039+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201850880 unmapped: 8077312 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:05.093250+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2964187 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.890022278s of 10.071253777s, submitted: 36
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 201990144 unmapped: 7938048 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:06.093460+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 202113024 unmapped: 7815168 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:07.093661+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 202113024 unmapped: 7815168 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:08.093836+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 ms_handle_reset con 0x5648d85cc400 session 0x5648d71b7860
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d7e4d000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 202113024 unmapped: 7815168 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af101000/0x0/0x1bfc00000, data 0x73822b8/0x754d000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:09.094056+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 202113024 unmapped: 7815168 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:10.094295+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2966993 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203243520 unmapped: 6684672 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:11.094562+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203243520 unmapped: 6684672 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:12.094775+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203243520 unmapped: 6684672 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0e0000/0x0/0x1bfc00000, data 0x73a2b1f/0x756e000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:13.094968+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203251712 unmapped: 6676480 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:14.095271+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203251712 unmapped: 6676480 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:15.095445+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2967973 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.944179535s of 10.001319885s, submitted: 12
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203251712 unmapped: 6676480 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0d4000/0x0/0x1bfc00000, data 0x73ae9d6/0x757a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:16.095684+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203251712 unmapped: 6676480 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:17.095849+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203251712 unmapped: 6676480 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:18.096039+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203251712 unmapped: 6676480 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:19.096305+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203251712 unmapped: 6676480 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0d4000/0x0/0x1bfc00000, data 0x73ae9d6/0x757a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:20.096511+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2969293 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203202560 unmapped: 6725632 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:21.096708+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0d4000/0x0/0x1bfc00000, data 0x73ae9d6/0x757a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203218944 unmapped: 6709248 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:22.096918+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203218944 unmapped: 6709248 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:23.097097+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203218944 unmapped: 6709248 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:24.097299+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0d4000/0x0/0x1bfc00000, data 0x73ae9d6/0x757a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203218944 unmapped: 6709248 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:25.097568+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0d4000/0x0/0x1bfc00000, data 0x73ae9d6/0x757a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2969293 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203218944 unmapped: 6709248 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:26.097804+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203218944 unmapped: 6709248 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:27.098005+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203218944 unmapped: 6709248 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:28.098243+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.622063637s of 12.631659508s, submitted: 1
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203309056 unmapped: 6619136 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:29.098423+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203317248 unmapped: 6610944 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:30.098585+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2970125 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203317248 unmapped: 6610944 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0b9000/0x0/0x1bfc00000, data 0x73ca1d2/0x7595000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:31.098767+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203317248 unmapped: 6610944 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:32.098947+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0b2000/0x0/0x1bfc00000, data 0x73d0f44/0x759c000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af0a0000/0x0/0x1bfc00000, data 0x73e343a/0x75ae000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203317248 unmapped: 6610944 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:33.099290+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203317248 unmapped: 6610944 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:34.099484+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203317248 unmapped: 6610944 heap: 209928192 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:35.099680+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af07d000/0x0/0x1bfc00000, data 0x7403e48/0x75d1000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [1,1])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2980677 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203505664 unmapped: 7471104 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:36.099924+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203522048 unmapped: 7454720 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:37.100178+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203522048 unmapped: 7454720 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af051000/0x0/0x1bfc00000, data 0x74302f2/0x75fd000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:38.100417+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 203579392 unmapped: 7397376 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.477046967s of 10.595286369s, submitted: 23
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:39.100645+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 heartbeat osd_stat(store_statfs(0x1af014000/0x0/0x1bfc00000, data 0x746dd12/0x763a000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 24K writes, 91K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.01 MB/s
                                                          Cumulative WAL: 24K writes, 8729 syncs, 2.79 writes per sync, written: 0.08 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 42K keys, 11K commit groups, 1.0 writes per commit group, ingest: 41.50 MB, 0.07 MB/s
                                                          Interval WAL: 11K writes, 4451 syncs, 2.48 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204038144 unmapped: 6938624 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:40.100823+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2979481 data_alloc: 301989888 data_used: 17403904
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 279 handle_osd_map epochs [279,280], i have 279, src has [1,280]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204062720 unmapped: 6914048 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:41.101057+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204079104 unmapped: 6897664 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:42.101275+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 280 heartbeat osd_stat(store_statfs(0x1af006000/0x0/0x1bfc00000, data 0x747b456/0x7648000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204079104 unmapped: 6897664 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:43.101462+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204079104 unmapped: 6897664 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:44.101625+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 280 heartbeat osd_stat(store_statfs(0x1af006000/0x0/0x1bfc00000, data 0x747b456/0x7648000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204095488 unmapped: 6881280 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:45.101811+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2984393 data_alloc: 301989888 data_used: 17420288
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 6758400 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:46.102034+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 6758400 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:47.102239+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 6758400 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:48.102448+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _renew_subs
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 280 handle_osd_map epochs [281,281], i have 280, src has [1,281]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204226560 unmapped: 6750208 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:49.102640+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 heartbeat osd_stat(store_statfs(0x1aefd4000/0x0/0x1bfc00000, data 0x74aa3da/0x7679000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204226560 unmapped: 6750208 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:50.102878+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2986051 data_alloc: 301989888 data_used: 17432576
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204242944 unmapped: 6733824 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:51.103256+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204242944 unmapped: 6733824 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:52.103448+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204251136 unmapped: 6725632 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:53.103681+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204267520 unmapped: 6709248 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:54.103899+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 heartbeat osd_stat(store_statfs(0x1aefd4000/0x0/0x1bfc00000, data 0x74aa3da/0x7679000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204267520 unmapped: 6709248 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:55.104086+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2986051 data_alloc: 301989888 data_used: 17432576
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204267520 unmapped: 6709248 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:56.104314+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204267520 unmapped: 6709248 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:57.104502+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 heartbeat osd_stat(store_statfs(0x1aefd4000/0x0/0x1bfc00000, data 0x74aa3da/0x7679000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204267520 unmapped: 6709248 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:58.104722+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 heartbeat osd_stat(store_statfs(0x1aefd4000/0x0/0x1bfc00000, data 0x74aa3da/0x7679000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204267520 unmapped: 6709248 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:59.104978+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204267520 unmapped: 6709248 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:00.105182+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2986051 data_alloc: 301989888 data_used: 17432576
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204283904 unmapped: 6692864 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:01.105403+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204283904 unmapped: 6692864 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:02.105592+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204283904 unmapped: 6692864 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:03.105767+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204283904 unmapped: 6692864 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 heartbeat osd_stat(store_statfs(0x1aefd4000/0x0/0x1bfc00000, data 0x74aa3da/0x7679000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:04.105988+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204283904 unmapped: 6692864 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:05.106183+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2986051 data_alloc: 301989888 data_used: 17432576
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204292096 unmapped: 6684672 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:06.106518+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204292096 unmapped: 6684672 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:07.106718+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204292096 unmapped: 6684672 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:08.106904+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 heartbeat osd_stat(store_statfs(0x1aefd4000/0x0/0x1bfc00000, data 0x74aa3da/0x7679000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204292096 unmapped: 6684672 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:09.107080+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204308480 unmapped: 6668288 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:10.107290+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2986051 data_alloc: 301989888 data_used: 17432576
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204308480 unmapped: 6668288 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:11.107471+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204308480 unmapped: 6668288 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:12.107682+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 heartbeat osd_stat(store_statfs(0x1aefd4000/0x0/0x1bfc00000, data 0x74aa3da/0x7679000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204308480 unmapped: 6668288 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:13.107819+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.216300964s of 34.416736603s, submitted: 53
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204308480 unmapped: 6668288 heap: 210976768 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:14.108037+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205103104 unmapped: 17416192 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:15.108252+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 heartbeat osd_stat(store_statfs(0x1ae7d4000/0x0/0x1bfc00000, data 0x7caa3da/0x7e79000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 281 handle_osd_map epochs [281,282], i have 281, src has [1,282]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 282 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3046037 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: handle_auth_request added challenge on 0x5648d85b8000
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205103104 unmapped: 17416192 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:16.108465+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 282 handle_osd_map epochs [282,283], i have 282, src has [1,283]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 283 heartbeat osd_stat(store_statfs(0x1ae7cf000/0x0/0x1bfc00000, data 0x7cac90d/0x7e7d000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 283 ms_handle_reset con 0x5648d85b8000 session 0x5648dbda61e0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 283 heartbeat osd_stat(store_statfs(0x1ae7cf000/0x0/0x1bfc00000, data 0x7cac90d/0x7e7d000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204144640 unmapped: 18374656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:17.108639+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204144640 unmapped: 18374656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 283 heartbeat osd_stat(store_statfs(0x1aefca000/0x0/0x1bfc00000, data 0x74aee94/0x7681000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:18.108792+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204144640 unmapped: 18374656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:19.108965+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204144640 unmapped: 18374656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:20.109085+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2995373 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204144640 unmapped: 18374656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:21.109251+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204144640 unmapped: 18374656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:22.109417+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204144640 unmapped: 18374656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 283 handle_osd_map epochs [283,284], i have 283, src has [1,284]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:23.109558+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc8000/0x0/0x1bfc00000, data 0x74b12ad/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.421961784s of 10.588351250s, submitted: 39
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204169216 unmapped: 18350080 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:24.109758+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204177408 unmapped: 18341888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:25.109928+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996649 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204177408 unmapped: 18341888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc8000/0x0/0x1bfc00000, data 0x74b12ad/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:26.110153+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204193792 unmapped: 18325504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:27.110304+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204193792 unmapped: 18325504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:28.110477+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204193792 unmapped: 18325504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:29.110659+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204193792 unmapped: 18325504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:30.110846+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc8000/0x0/0x1bfc00000, data 0x74b12ad/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996649 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204193792 unmapped: 18325504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:31.110994+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204193792 unmapped: 18325504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:32.111132+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204210176 unmapped: 18309120 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:33.111256+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 18300928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:34.111572+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 18300928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:35.111815+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996649 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 18300928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc8000/0x0/0x1bfc00000, data 0x74b12ad/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:36.111990+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 18300928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:37.112223+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 18300928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:38.112454+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc8000/0x0/0x1bfc00000, data 0x74b12ad/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc8000/0x0/0x1bfc00000, data 0x74b12ad/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 18300928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:39.112586+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204218368 unmapped: 18300928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:40.112716+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996649 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204226560 unmapped: 18292736 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:41.112866+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204234752 unmapped: 18284544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.182542801s of 18.193357468s, submitted: 8
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 ms_handle_reset con 0x5648d6d16800 session 0x5648d6cb85a0
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:42.113086+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205045760 unmapped: 17473536 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:43.113283+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205045760 unmapped: 17473536 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Got map version 55
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:44.113441+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205053952 unmapped: 17465344 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:45.113594+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205053952 unmapped: 17465344 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:46.113768+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205062144 unmapped: 17457152 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:47.113905+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205062144 unmapped: 17457152 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:48.114046+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205086720 unmapped: 17432576 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:49.114177+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205086720 unmapped: 17432576 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:50.114398+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205086720 unmapped: 17432576 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:51.114543+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205086720 unmapped: 17432576 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:52.114628+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205094912 unmapped: 17424384 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:53.114751+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205094912 unmapped: 17424384 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:54.114905+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205094912 unmapped: 17424384 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:55.115247+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205094912 unmapped: 17424384 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:56.115478+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205111296 unmapped: 17408000 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:57.115647+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205127680 unmapped: 17391616 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:58.115910+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205127680 unmapped: 17391616 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:59.116124+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205127680 unmapped: 17391616 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:00.116303+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205127680 unmapped: 17391616 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:01.116612+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205127680 unmapped: 17391616 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:02.116834+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205127680 unmapped: 17391616 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:03.118438+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205127680 unmapped: 17391616 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:04.120694+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205135872 unmapped: 17383424 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:05.121317+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205152256 unmapped: 17367040 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:06.123553+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:07.124108+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205152256 unmapped: 17367040 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:08.124275+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205152256 unmapped: 17367040 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:09.124466+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205152256 unmapped: 17367040 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:10.125632+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205152256 unmapped: 17367040 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:11.126032+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205160448 unmapped: 17358848 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:12.126672+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205160448 unmapped: 17358848 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:13.127086+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205168640 unmapped: 17350656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:14.127265+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205168640 unmapped: 17350656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:15.127615+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205168640 unmapped: 17350656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:16.127830+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205168640 unmapped: 17350656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:17.128382+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205168640 unmapped: 17350656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:18.128644+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205168640 unmapped: 17350656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:19.128889+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205168640 unmapped: 17350656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:20.129070+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205168640 unmapped: 17350656 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:21.129345+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205176832 unmapped: 17342464 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:22.129712+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205176832 unmapped: 17342464 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:23.129854+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205176832 unmapped: 17342464 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:24.129999+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205185024 unmapped: 17334272 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:25.130152+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205185024 unmapped: 17334272 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:26.130475+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205185024 unmapped: 17334272 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:27.130819+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205185024 unmapped: 17334272 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:28.131130+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205185024 unmapped: 17334272 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:29.131329+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:30.131644+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:31.131795+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:32.131923+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:33.132242+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:34.132600+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:35.132786+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:36.133031+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:37.133219+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:38.133444+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:39.133626+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:40.133805+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:41.133965+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:42.134167+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:43.134336+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:44.134537+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205201408 unmapped: 17317888 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:45.134734+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205209600 unmapped: 17309696 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:46.134911+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205209600 unmapped: 17309696 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:47.135083+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205209600 unmapped: 17309696 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:48.135290+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205209600 unmapped: 17309696 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:49.135460+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:50.135603+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:51.138600+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:52.138790+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:53.138922+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:54.139091+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:55.139275+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:56.139478+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:57.139677+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205217792 unmapped: 17301504 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:58.139829+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205225984 unmapped: 17293312 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:59.140006+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205225984 unmapped: 17293312 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:00.140246+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205225984 unmapped: 17293312 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:01.140531+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205242368 unmapped: 17276928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:02.140726+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205242368 unmapped: 17276928 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:03.140914+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205250560 unmapped: 17268736 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:04.141050+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205250560 unmapped: 17268736 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:05.141307+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205250560 unmapped: 17268736 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:06.141645+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205250560 unmapped: 17268736 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:07.141905+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205250560 unmapped: 17268736 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:08.142066+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205250560 unmapped: 17268736 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:09.142299+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205258752 unmapped: 17260544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:10.142459+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205258752 unmapped: 17260544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:11.142669+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205258752 unmapped: 17260544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:12.142863+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205258752 unmapped: 17260544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:13.143027+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205258752 unmapped: 17260544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:14.143289+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205258752 unmapped: 17260544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:15.144474+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205258752 unmapped: 17260544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:16.144664+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205258752 unmapped: 17260544 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:17.144791+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205283328 unmapped: 17235968 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:18.144933+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205283328 unmapped: 17235968 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:19.145092+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205283328 unmapped: 17235968 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:20.145269+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205283328 unmapped: 17235968 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:21.145458+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205283328 unmapped: 17235968 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:22.145621+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205283328 unmapped: 17235968 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:23.145727+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205283328 unmapped: 17235968 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:24.145897+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205283328 unmapped: 17235968 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:25.146018+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205299712 unmapped: 17219584 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:26.146177+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205299712 unmapped: 17219584 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:27.146305+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205299712 unmapped: 17219584 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:28.146466+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205299712 unmapped: 17219584 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:29.146587+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205299712 unmapped: 17219584 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:30.146712+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205307904 unmapped: 17211392 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'config diff' '{prefix=config diff}'
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:31.146838+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: bluestore.MempoolThread(0x5648d3639b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2996169 data_alloc: 301989888 data_used: 17444864
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'config show' '{prefix=config show}'
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'counter dump' '{prefix=counter dump}'
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205340672 unmapped: 17178624 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'counter schema' '{prefix=counter schema}'
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:32.146995+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 205004800 unmapped: 17514496 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: osd.2 284 heartbeat osd_stat(store_statfs(0x1aefc9000/0x0/0x1bfc00000, data 0x74b14c0/0x7685000, compress 0x0/0x0/0x0, omap 0x648, meta 0x95af9b8), peers [0,1,3,4,5] op hist [])
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: tick
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:33.147125+0000)
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: prioritycache tune_memory target: 5709082009 mapped: 204767232 unmapped: 17752064 heap: 222519296 old mem: 4047413338 new mem: 4047413338
Dec 06 10:33:03 np0005548788.localdomain ceph-osd[31731]: do_command 'log dump' '{prefix=log dump}'
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 06 10:33:03 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/908175058' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/294876320' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2422302273' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1710489525' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: pgmap v833: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/4062777244' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1935362394' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2953662664' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1342647983' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2985416432' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/321501752' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3932188867' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/908175058' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3193103153' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2348448191' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1689303103' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1182164294' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/77878866' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2370934995' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain rsyslogd[760]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2574211342' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 06 10:33:04 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4154228913' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1107750515' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:05.256 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/77878866' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2370934995' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/730220607' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3268197787' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/946269881' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/4280751732' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2574211342' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/4154228913' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2389094995' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1798206855' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1230936092' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1464918040' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 06 10:33:05 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1107750515' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.50115 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3451307951' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.69737 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.59539 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.59533 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: pgmap v834: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.50127 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.50124 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.69758 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/322497280' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 06 10:33:06 np0005548788.localdomain ceph-mon[293643]: from='client.59560 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548788.localdomain systemd[1]: Starting Hostname Service...
Dec 06 10:33:06 np0005548788.localdomain systemd[1]: Started Hostname Service.
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1050406131' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.59566 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.50136 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.50139 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.69773 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.69779 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.59572 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.50145 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.69785 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.59587 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.69791 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/488060457' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.50157 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1050406131' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3447850753' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.503 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.505 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.506 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceilometer_agent_compute[237344]: 2025-12-06 10:33:07.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "versions"} v 0)
Dec 06 10:33:07 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/375595248' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/730980905' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:33:08 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:08.149 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2525006127' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.69803 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.59605 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.69815 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.50175 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.59617 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: pgmap v835: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/375595248' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1050443617' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.50187 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/2289626626' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.69827 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.59629 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/730980905' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1525022835' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1041013098' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/2525006127' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:08 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:33:08 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:33:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:33:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:33:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:33:08 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:33:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:33:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:33:08 np0005548788.localdomain openstack_network_exporter[242070]: ERROR   10:33:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/41112147' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3853367064' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2625254279' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/759059760' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/41112147' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3002096115' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 06 10:33:10 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:10.258 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: from='client.59683 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: pgmap v836: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: from='client.50256 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/995061289' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2611527651' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3002096115' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/509212967' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "df"} v 0)
Dec 06 10:33:10 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1083196786' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3836096139' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 06 10:33:11 np0005548788.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 06 10:33:11 np0005548788.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 06 10:33:11 np0005548788.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 06 10:33:11 np0005548788.localdomain kernel: cfg80211: failed to load regulatory.db
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/356022594' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: from='client.69911 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/1083196786' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/1474627565' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3948867050' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3836096139' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2126088694' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 06 10:33:11 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1936156670' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 06 10:33:12 np0005548788.localdomain ceph-mon[293643]: pgmap v837: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:12 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/356022594' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 06 10:33:12 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3406386216' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 06 10:33:12 np0005548788.localdomain ceph-mon[293643]: from='client.59728 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:12 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/3352183805' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 06 10:33:12 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1607197834' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 06 10:33:12 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec 06 10:33:12 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3357817178' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:33:13 np0005548788.localdomain nova_compute[281005]: 2025-12-06 10:33:13.149 281009 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3983119808' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: from='client.50292 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: from='client.69950 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/1106523137' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3357817178' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/572908366' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.106:0/3983119808' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.108:0/2359850793' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 06 10:33:13 np0005548788.localdomain ceph-mon[293643]: from='client.? 172.18.0.107:0/3184309527' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 06 10:33:14 np0005548788.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3.
Dec 06 10:33:14 np0005548788.localdomain ceph-mon[293643]: mon.np0005548788@0(leader) e15 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec 06 10:33:14 np0005548788.localdomain ceph-mon[293643]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1985088088' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 06 10:33:14 np0005548788.localdomain podman[337237]: 2025-12-06 10:33:14.271406124 +0000 UTC m=+0.097144006 container health_status 6977cedd01024fab7d6e72d40133db66dd2a58e75752bb6ea4d5a8937c7a03d3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
